Echolocation Technology Helps Robots Navigate

Israeli researchers have created what they say is the first robot to truly use echolocation, a biological sonar like the one bats use, to explore the world autonomously. This research could lead to unmanned systems that can navigate even when they cannot 'see', that is, when they cannot rely on visual sensors, a capability that could benefit both flying drones and driverless cars.
A graduate student and colleagues in the mechanical engineering department at Tel Aviv University in Israel developed "Robat," a robot that uses bat-like sonar to navigate autonomously. As the researchers put it, "Getting inspiration from animals can lead to new solutions."
The prototype Robat rolls across the ground on the Komodo platform from Israel-based RoboTiCan. The robot was equipped with an ultrasonic speaker that imitated a bat's mouth and two ultrasonic microphones, spaced seven centimeters apart, that mimicked bat ears, all mounted on a DJI Ronin gimbal, according to insideunmannedsystems.com.
Whereas previous work on airborne sonar for robots relied on speakers that each broadcast a narrow range of sound frequencies, Robat emitted a wide range of ultrasonic frequencies, just as bats do. The echoes of those signals carry rich information about the objects and surfaces they bounce off, which let Robat navigate with just one emitter instead of several.
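To make the idea of a wide-frequency emission concrete, here is a minimal sketch of a linear frequency sweep (a "chirp") of the kind bats and Robat broadcast. The specific frequencies, duration, and sample rate below are illustrative assumptions, not the Robat hardware parameters.

```python
import math

def linear_chirp(f_start, f_end, duration_s, sample_rate):
    """Return samples of a sine sweep from f_start to f_end Hz."""
    n = int(duration_s * sample_rate)
    k = (f_end - f_start) / duration_s  # sweep rate in Hz per second
    samples = []
    for i in range(n):
        t = i / sample_rate
        # Instantaneous phase of a linear sweep: 2*pi*(f0*t + (k/2)*t^2)
        phase = 2.0 * math.pi * (f_start * t + 0.5 * k * t * t)
        samples.append(math.sin(phase))
    return samples

# e.g. a 10 ms sweep from 20 kHz up to 100 kHz, sampled at 250 kHz
sig = linear_chirp(20_000, 100_000, 0.010, 250_000)
```

Because the sweep passes through many frequencies, its echo encodes how a surface reflects each of them, which is what makes a single wideband emitter informative.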
In experiments, Robat moved at roughly a meter a minute, stopping every 30 seconds or so to emit three chirps, each 10 milliseconds long, while aiming its speaker at three different angles. This let the robot scan out to a range of about six meters.
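The chirp-and-listen cycle above rests on simple time-of-flight arithmetic: distance is the round-trip echo delay times the speed of sound, halved. A minimal sketch, assuming sound travels at roughly 343 m/s in air:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature (assumed)

def echo_range(delay_s):
    """Distance to a reflector, given the round-trip echo delay."""
    return SPEED_OF_SOUND * delay_s / 2.0

def listening_window(max_range_m):
    """Time to listen after a chirp to hear echoes out to max_range_m."""
    return 2.0 * max_range_m / SPEED_OF_SOUND
```

By this arithmetic, a six-meter scan range means listening for roughly 35 milliseconds after each 10-millisecond chirp.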
Robat then used a kind of artificial intelligence system known as an artificial neural network to analyze the echoes. In such a system, components dubbed neurons are fed data and cooperate to solve a problem, such as recognizing images.
The neural net repeatedly adjusts the behavior of its neurons and sees if these new patterns of behavior are better at solving the problem. Over time, the network discovers which patterns are best at computing solutions, and then adopts these as defaults, mimicking the process of learning in the human brain. This is the first ground-based robot to use a neural net to help it analyze sonar data.
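The adjust-and-check learning loop described above can be sketched with a single logistic "neuron" trained by gradient descent. The echo features (spread and intensity) and the toy data here are invented for illustration; Robat's actual classifier is a larger neural network with many such units.

```python
import math

def predict(w, b, x):
    """Probability that echo features x come from a plant."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy dataset: [echo_spread, echo_intensity] -> 1 = plant, 0 = not plant
data = [([0.9, 0.2], 1), ([0.8, 0.3], 1), ([0.2, 0.9], 0), ([0.1, 0.8], 0)]

w, b, lr = [0.0, 0.0], 0.0, 1.0
for _ in range(200):               # repeatedly adjust the neuron...
    for x, y in data:
        err = predict(w, b, x) - y  # ...in the direction that reduces error
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err
```

After training, the neuron's weights encode which echo patterns signal "plant", mirroring on a tiny scale how the network settles on the patterns that best solve the task.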
The robot could recognize whether objects were plants or not, and it changed its heading when necessary to avoid obstacles, fully autonomously. In experiments, Robat successfully navigated two greenhouses without colliding with anything.
“We are able to map a new environment accurately, and we are able to use a machine-learning algorithm to learn to classify objects,” the researchers said. “We can solve the problem of autonomous navigation using sound like bats do.”