Robots That Navigate in the Dark: Echolocation System Inspired by Nature

A new navigation system developed at the University of Michigan enables robots and drones to operate effectively in environments where conventional visual tools fail. By using echolocation—similar to the method employed by bats and dolphins—this AI-powered system allows machines to move through complete darkness, smoke, or dust without relying on cameras, GPS, or lasers.

Funded by the U.S. Army Research Office and Ground Vehicle Systems Center, the project centers on ultrasonic pulses. The system sends out high-frequency sounds and measures the returning echoes to form a map of the surrounding area. Unlike vision-based technologies that depend on light or clear sightlines, this approach functions regardless of visibility conditions, making it particularly suited for disaster response and military operations.
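
The basic ranging step behind this approach is the standard time-of-flight calculation: an echo's round-trip delay, multiplied by the speed of sound and halved, gives the distance to the reflecting surface. The minimal sketch below illustrates that calculation only; the function name and example values are assumptions for illustration, not details from the study.

```python
# Illustrative sketch of echo ranging (not the study's implementation).
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_distance(round_trip_seconds: float) -> float:
    """Distance to a reflector, given an ultrasonic pulse's round-trip time."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

# A pulse that returns after 5.8 ms implies a surface about 1 m away.
print(f"{echo_distance(0.0058):.2f} m")  # ~0.99 m
```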

At the heart of the system is an ensemble of specialized convolutional neural networks (CNNs), each trained to interpret the echoes produced by a specific object shape. Because each network is a standalone specialist, new shapes can be supported by training an additional network rather than retraining the entire system, saving time and resources (see the sketch below).
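
A minimal sketch of how such a modular ensemble might be organized is shown below, using PyTorch. The class names, layer sizes, and max-score classification scheme are all assumptions made for illustration; the paper's actual network architectures are not described here.

```python
import torch
import torch.nn as nn

class EchoCNN(nn.Module):
    """Small 1D CNN scoring how well an echo matches one shape class
    (illustrative architecture, not the study's)."""
    def __init__(self, echo_len: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(8, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(16, 1),  # one "this shape" confidence score
        )

    def forward(self, echo: torch.Tensor) -> torch.Tensor:
        return self.net(echo)

class ShapeEnsemble:
    """One specialist network per shape; adding a shape adds a new
    network without touching the already-trained specialists."""
    def __init__(self):
        self.specialists: dict[str, EchoCNN] = {}

    def add_shape(self, name: str) -> EchoCNN:
        self.specialists[name] = EchoCNN()  # train this one in isolation
        return self.specialists[name]

    @torch.no_grad()
    def classify(self, echo: torch.Tensor) -> str:
        scores = {name: net(echo).item()
                  for name, net in self.specialists.items()}
        return max(scores, key=scores.get)

ensemble = ShapeEnsemble()
for shape in ("sphere", "cube", "cylinder"):
    ensemble.add_shape(shape)
echo = torch.randn(1, 1, 256)  # stand-in for a recorded echo waveform
print(ensemble.classify(echo))
```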

According to Interesting Engineering, rather than using real-world trials, the team trained their system using synthetic data and simulated 3D environments. These simulations included realistic sound distortions to mimic real-world acoustic conditions. This method enabled the AI to learn how echoes vary depending on material type, shape, angle, and ambient noise.
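
As a rough illustration of that training pipeline, the sketch below generates one synthetic echo as a delayed, attenuated copy of an emitted pulse, with timing jitter and ambient noise mixed in. Every value here (sample rate, pulse frequency, noise level, attenuation model) is an assumed stand-in, not the team's actual simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 200_000       # sample rate (Hz), assumed for ~40 kHz ultrasound
PULSE_HZ = 40_000  # emitted ultrasonic frequency (assumed)

def synthetic_echo(distance_m: float, reflectivity: float,
                   noise_std: float = 0.02, n: int = 2048) -> np.ndarray:
    """One simulated training echo: a delayed, attenuated copy of the
    emitted pulse plus ambient noise. Illustrative only."""
    t = np.arange(n) / FS
    pulse = np.sin(2 * np.pi * PULSE_HZ * t[:64]) * np.hanning(64)
    delay = int(2 * distance_m / 343.0 * FS)        # round-trip samples
    delay += rng.integers(-3, 4)                    # timing jitter
    amp = reflectivity / max(distance_m, 0.1) ** 2  # spreading loss
    echo = np.zeros(n)
    echo[delay:delay + 64] = amp * pulse
    return echo + rng.normal(0.0, noise_std, n)     # ambient noise

# e.g. a moderately reflective object at 1.5 m
x = synthetic_echo(distance_m=1.5, reflectivity=0.8)
```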

The specialized CNNs were trained to recognize subtle distinctions in echo patterns, and the researchers deliberately included objects with similar shapes and acoustic signatures to challenge the system. Despite these confusable cases, the model performed strongly at correctly identifying geometric forms from ultrasound data.

Because the system is based entirely on sound, it avoids the limitations of optical sensors and can perform in settings where light-based methods are unreliable. According to the researchers, this technology moves machine perception closer to how biological systems process information, opening doors for broader applications beyond robotics—such as medical imaging, industrial inspection, and autonomous vehicles.

The full study appears in the Journal of Sound and Vibration.