Facebook has developed what it calls a foundational “breakthrough” in the race to create more humanlike robots: software that enables machines to learn to walk like toddlers.

Humans are very efficient at maneuvering. As kids, we figure out how to adjust our stride and cadence to trek through mud, water and up and down hills with ease. Through trial and error, we adapt, figuring out the best ways to move our feet according to real-time situations. And we can do this while toting a variety of objects, either in our hands or on our backs.

It’s tough to program robots to make instantaneous adjustments to their legs and feet to accommodate such a variety of tasks, mainly because it’s hard to train them to deal with corner cases, or objects and environments they’ve never seen before.

This is one of the main things artificial intelligence struggles with today, said David Cox, IBM director of the MIT-IBM Watson AI Lab, a collaborative effort between IBM and the Massachusetts Institute of Technology. “How do you build systems that can adapt to all the corner cases they may see?”

Advanced robot navigation could revolutionize services in a wide range of fields such as emergency response, agriculture, autonomous driving and manufacturing. It could hold the key to more complex chore robots. But it also requires teaching machines to act the way humans do subconsciously, based on lived experience — something that would be tedious at best and potentially impossible.

Humans learn to navigate new environments by stumbling and trying again. But that’s an expensive and lengthy undertaking when applied to robots, which need to be fixed or have their code tweaked when damaged. Researchers try to avoid this by simulating new environments and adjusting robot brains accordingly.


It’s a challenge Facebook says it has solved in collaboration with the University of California, Berkeley and Carnegie Mellon University.

First, researchers used simulations to train AI to respond to various environmental conditions, such as slippery ground or a sudden incline. Then, they taught a generic dog-like robot to learn from its mistakes and keep walking as far as possible despite sudden changes to its environment. They layered the two strategies, and together they “enable the robot to perform robust and adaptive locomotion without any fine tuning,” Facebook says.

In practice, the AI lets robots adapt to environmental factors they have never encountered before. Rather than trying to avoid disruptions, they learn from surprises and move with them. By observing its own motion, the AI informs each new leg movement with the ones that came before it. Obstacles that push against the robot’s feet or legs can reveal information about the ground around it. The AI learns from that.
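The two-stage design described above can be sketched in a few lines of code: a base controller trained in simulation with privileged knowledge of the environment, and an adaptation module that, at deployment, estimates that hidden environment information from a short history of the robot’s own states and actions. This is a minimal numpy sketch of the idea only — the names, dimensions, and linear “networks” below are illustrative placeholders, not Facebook’s actual system:

```python
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM, ACT_DIM, LATENT_DIM, HIST = 4, 2, 3, 5

# Stage 1 (simulation): the base policy is trained alongside a privileged
# encoding z of environment parameters (friction, payload, slope).
W_policy = rng.normal(size=(ACT_DIM, STATE_DIM + LATENT_DIM))

def base_policy(state, z):
    return W_policy @ np.concatenate([state, z])

# Stage 2 (deployment): the true environment parameters are unobservable,
# so an adaptation module estimates z from a short history of past states
# and actions -- the proprioceptive "touch" the article describes.
W_adapt = rng.normal(size=(LATENT_DIM, HIST * (STATE_DIM + ACT_DIM)))

def adaptation_module(history):
    return W_adapt @ np.concatenate([np.concatenate(pair) for pair in history])

# Control loop: each new action is informed by the previous ones,
# because they feed back into the history the estimate is built from.
history = [(np.zeros(STATE_DIM), np.zeros(ACT_DIM)) for _ in range(HIST)]
state = rng.normal(size=STATE_DIM)
for _ in range(3):
    z_hat = adaptation_module(history)        # infer terrain from recent motion
    action = base_policy(state, z_hat)        # act as if z_hat were the truth
    history = history[1:] + [(state, action)] # slide the window forward
    state = state + 0.1 * rng.normal(size=STATE_DIM)  # toy stand-in dynamics
```

In a real system both modules would be neural networks trained in simulation, but the structure is the same: the robot never observes the terrain directly, only the trace it leaves in the robot’s own movement.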

The so-called Rapid Motor Adaptation software might allow companies to create cheaper automated machines that figure out how to operate at peak performance with more affordable, not-as-accurate hardware.

Cheaper robots are a critical step toward more advanced robots in more fields. Facebook showed off its AI development in a video.

An eerie four-legged robot is shown pacing through the woods with relative ease. But when brought inside and tested in other situations, such as slippery surfaces, it had balance issues and difficulty walking. In one example, when weighted bags were placed on its back, the robot fell over. With Facebook’s AI software enabled, it wobbled but managed to stay upright and keep walking when the bags were tossed onto it. There are no cameras on the device. All of the robot’s movements were guided by sensors in its feet and various joints, which allow it to experience the world through “touch.”


Researchers compare it with what happens when humans try to walk on the sand at the beach for the first time. “The first few steps are awkward, but after a few steps, you’re walking on a beach as normally as you would on a hard road surface,” said Jitendra Malik, a computer vision researcher at Facebook AI and UC Berkeley. The team studied scientific literature on how children learn to walk to inform their project.

Facebook isn’t the only big tech firm trying to create software that makes robots act more like real people.

IBM, for instance, is researching AI methods that simulate raising a baby to understand what “common sense” knowledge a child gains, which can be applied to robotics. Last year, Google published a framework for wheeled robots to move through “unforeseen environments.” Boston Dynamics, a pioneer in agile, mobile robots, has a system on its Spot robots so they can “feel” the ground beneath them to try to avoid falling in low visibility.

Facebook’s system was built on a $2,700 Unitree robot. Boston Dynamics’ Spot robot costs $74,500.