ANYmal, a legged robot built by ETH Zurich researchers led by Marco Hutter, has a new control strategy that allows it to walk rapidly and reliably across rough terrain. For the first time, the robot can integrate its visual awareness of the world with its sense of touch, thanks to machine learning.
The approach to the 1,098-meter-high Mount Etzel near the southern end of Lake Zurich is riddled with challenges: steep stretches of slick terrain, high steps, scree, and root-covered forest paths. Yet ANYmal, a quadrupedal robot from ETH Zurich’s Robotic Systems Lab, climbs the 120 vertical meters in only 31 minutes. That’s 4 minutes faster than the estimated time for a human hiker, and without any falls or missteps.
This is made possible by a novel control technique recently reported in the journal Science Robotics by researchers at ETH Zurich led by robotics professor Marco Hutter. “The robot has learned to combine visual perception of its surroundings with proprioception, its sense of touch based on direct leg contact. This enables it to traverse difficult terrain more quickly, efficiently and, most importantly, more robustly,” says Hutter. In the future, ANYmal will be able to go places that are too hazardous for people or too difficult for other robots.
Accurately perceiving the surroundings
Humans and animals integrate their visual perception of their surroundings with their proprioception of their legs and hands to negotiate challenging terrain. This enables them to navigate slick or mushy terrain with ease and confidence, even when vision is limited. Legged robots have only been able to accomplish this to a limited degree up until now.
“The reason for this is that the information about the immediate surroundings collected by laser sensors and cameras is often incomplete and ambiguous,” says Takahiro Miki, a PhD student in Hutter’s lab and the study’s lead author. Tall grass, small puddles, and snow, for example, can appear to be insurmountable barriers or go largely undetected, even though the robot is able to overcome them. Furthermore, in the field, the robot’s view might be obstructed by poor illumination, dust, or fog.
“That’s why robots like ANYmal must be able to judge for themselves when to trust their visual impression of their surroundings and move ahead quickly, and when it’s better to advance cautiously with small steps,” Miki explains. “And that is the big challenge.”
A virtual training camp
The legged robot ANYmal, created by ETH Zurich researchers and marketed by the ETH spin-off ANYbotics, can now integrate external and proprioceptive awareness for the first time thanks to a new controller based on a neural network. Before the robot could put its abilities to the test in the real world, the researchers put it through a virtual training camp where it was exposed to a variety of obstacles and sources of error. This allowed the network to learn the best way for the robot to overcome obstacles, as well as when it can rely on environmental perception and when it is better to ignore it.
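One way to picture that virtual training camp is corrupting the simulated vision data on purpose, so the network cannot learn to trust it blindly. The sketch below is purely illustrative: the function name, the dropout probability, and the noise level are assumptions for this example, not parameters from the published controller.

```python
import random

def corrupt_height_samples(heights, p_drop=0.2, noise=0.05, rng=random):
    """Simulated sensor corruption applied during training.

    Each terrain-height sample is either dropped entirely (mimicking
    occlusion by grass, snow, or dust) or jittered with uniform noise
    (mimicking measurement error), so a controller trained on this
    data must learn when the visual input cannot be relied upon.
    """
    corrupted = []
    for h in heights:
        if rng.random() < p_drop:
            corrupted.append(None)  # occluded: no usable reading
        else:
            corrupted.append(h + rng.uniform(-noise, noise))
    return corrupted
```

In training schemes like this, the corruption rates are typically randomized per episode, so the learner encounters everything from clean terrain maps to almost fully occluded ones.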
Professor Hutter of ETH Zurich adds, “With this training, the robot is able to master the most challenging natural terrain without having seen it before.” This works even if the sensor data about the immediate surroundings is imprecise or vague. ANYmal then plays it safe and relies on its proprioception. According to Hutter, this enables the robot to combine the best of both worlds: the speed and efficiency of external sensing with the safety of proprioceptive sensing.
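The “best of both worlds” behaviour can be sketched as a reliability-weighted blend of the two sensing channels. In the real system this gating is learned end to end inside the neural network; the convex blend and the step-length heuristic below are simplified stand-ins, and all names and parameters are assumptions for illustration.

```python
def fuse(extero, proprio, reliability):
    """Blend vision-based and touch-based feature vectors.

    `reliability` in [0, 1] stands in for the controller's learned
    trust in its visual input: 1.0 uses the terrain perception fully,
    0.0 falls back entirely on proprioception.
    """
    return [reliability * e + (1.0 - reliability) * p
            for e, p in zip(extero, proprio)]

def plan_step_length(nominal, reliability, cautious_scale=0.4):
    # Low trust in vision -> shorter, more careful steps
    # (the "advance cautiously with small steps" behaviour).
    return nominal * (cautious_scale + (1.0 - cautious_scale) * reliability)
```

With full trust the robot strides at its nominal step length over the perceived terrain; as trust drops, the gait smoothly shifts toward short, touch-guided steps rather than switching abruptly between modes.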
Use under severe conditions
ANYmal robots may be employed in situations that are too hazardous for humans and whose harsh terrain other robots cannot handle, such as after an earthquake, after a nuclear accident, or during a forest fire.