What if your robot could respond to a real-life situation, such as a fire alarm going off while you're in the middle of a piano lesson?
Does that sound absurd?
Facebook intends to develop robot assistants that can attend to our needs in different ways.
More precisely, Facebook A.I. Research, the company's AI research wing, is on the verge of developing "embodied AI." Embodied AI is expected to go beyond present-day AI voice interfaces like Google Assistant, Alexa, and Siri by performing tasks that require operating in the physical environment.
AI agents have long been thought of as disembodied chatbots, but Facebook intends to prove otherwise. The focus has shifted to building systems that can perceive and act in the real world.
As the term suggests, embodied AI involves working within the physical world: giving a system a physical body and studying how that body fits into real-world scenarios. The approach draws on the theory of embodied cognition, which holds that cognition is shaped by the body's interactions with its environment. Embodied AI thus aims to understand whether intelligence resides as much in the body as in the brain.
Following this logic, AI researchers are trying to build more capable intelligent systems.
Kristen Grauman, Professor of Computer Science at the University of Texas at Austin and a research scientist at Facebook A.I. Research, told Digital Trends:
“We are still far from these capabilities, but you can imagine scenarios like asking a home robot ‘Can you go check if my laptop is on my desk? If so, bring it to me,’ or the robot hearing a thud coming from somewhere upstairs, and going to investigate where it is and what it is.”
Although Facebook's end goal is still distant, the company has made notable progress.
Facebook's recent advances in embodied AI include:
- Indoor mapping system – allows a robot to navigate unexplored terrain.
- SoundSpaces – an audio simulation tool that produces realistic audio rendering based on a room's geometry, materials, and other properties. This could help future AI assistants understand how sound behaves in the real world.
What’s next for embodied AI?
Facebook's research is not just about developing physical versions of smart AI assistants; it may go well beyond that, toward contextual, intelligent bots far more capable than Microsoft's old Clippy avatar.
For instance, engineers envision asking questions the AI could answer accurately, such as "What was the dessert we ate on Sunday night?" or "Where did I put my car keys?"
This means Facebook engineers must put more effort into building embodied AI agents that can navigate from place to place, form and store memories, plan their next steps, understand physical concepts such as gravity, and interpret human activities.
Other collaborators on these projects include the University of Illinois, the University of Texas at Austin, Oregon State University, and Georgia Tech.
According to Dhruv Batra, professor at the Georgia Tech College of Computing and research scientist at Facebook A.I. Research:
“Facebook A.I. is a leader in many of the subfields that Embodied A.I. encompasses, spanning computer vision, language understanding, robotics, reinforcement learning, curiosity and self-supervision, and more. It’s a significant feat to make advances in each of these sub-fields individually, and combining them in innovative ways enables us to push the field of A.I. even further.” (Source: Digital Trends)
Facebook is not alone here: tech giants like Apple, Google, and Amazon have already explored this field, so it is far from the only company pursuing such projects.