Bat-Inspired Drones Navigate Where Cameras Can’t



Researchers at Worcester Polytechnic Institute (WPI) in Massachusetts are developing small aerial robots that rely on sound rather than cameras to navigate, offering new capabilities in environments where visual sensors fail. The work draws inspiration from bats, which use echolocation to move through complex surroundings.

The project focuses on enabling drones to operate in smoke, dust, and darkness, conditions that typically limit traditional vision-based systems. With support from a National Science Foundation (NSF) Foundational Research in Robotics grant, the team will advance the development of sound-guided navigation over a three-year period starting in September 2025.

These drones are designed to be smaller than 100 millimeters and weigh under 100 grams. To overcome challenges such as propeller noise and limited ultrasonic resolution, the researchers are employing specially designed metamaterials that minimize interference. Adjusting material geometry allows sound waves to reflect more predictably, while alternative propulsion methods, including flapping-wing designs, further reduce acoustic disruption.
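The core principle behind echolocation-based ranging is straightforward: distance follows from the round-trip time of a sound pulse. A minimal sketch (the function name and values are illustrative, not from the WPI project):

```python
# Hypothetical sketch: estimating distance from an ultrasonic echo's
# round-trip time, the basic relationship behind echolocation.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_distance(round_trip_s: float) -> float:
    """Distance to a reflector, given the echo's round-trip time.

    The pulse travels to the obstacle and back, so the one-way
    distance is half the total acoustic path.
    """
    return SPEED_OF_SOUND * round_trip_s / 2.0

# A 10 ms round trip corresponds to roughly 1.7 m.
print(round(echo_distance(0.010), 3))  # 1.715
```

Propeller noise and metamaterial design matter precisely because this timing measurement only works if the returning echo can be distinguished from the drone's own acoustic clutter.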

On the software side, the drones rely on physics-informed deep learning to interpret ultrasonic signals and navigate autonomously. Hierarchical reinforcement learning algorithms allow them to reach targets while avoiding obstacles dynamically, and onboard computation ensures that all processing occurs locally without external infrastructure. Sensor fusion integrates echolocation with inertial and other sensor data, improving situational awareness in complex environments.
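Sensor fusion of the kind described above can be illustrated with a simple complementary filter that blends a drifting inertial estimate with noisy but drift-free ultrasonic range readings. This is a generic sketch with illustrative names and weights, not the project's actual algorithm:

```python
# Hypothetical sketch of basic sensor fusion: a complementary filter
# blending a smooth but drifting IMU altitude estimate with a noisy
# but drift-free ultrasonic range reading.

def fuse(imu_estimate: float, ultrasonic_range: float, alpha: float = 0.9) -> float:
    """Weighted blend of two altitude estimates.

    A high alpha trusts the IMU's smooth short-term estimate, while
    the (1 - alpha) share lets the ultrasonic measurement pull the
    result back toward ground truth and correct long-term drift.
    """
    return alpha * imu_estimate + (1.0 - alpha) * ultrasonic_range

# The IMU has drifted to 2.30 m while the sonar reads 2.00 m:
print(round(fuse(2.30, 2.00), 3))  # 2.27
```

Running such a filter every control cycle keeps the estimate responsive between echo returns while preventing inertial drift from accumulating, all within the tight compute budget of a sub-100-gram platform.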

According to Interesting Engineering, potential applications include search and rescue operations, disaster monitoring, war zones, and inspection of hazardous areas. Future iterations could use ultrasonic sensing to detect human heartbeats, making it easier to locate survivors in challenging conditions. The research also aims to sustain flight speeds above two meters per second even while avoiding detected obstacles, enabling rapid response in real-world scenarios.

The project represents a shift from conventional vision-based autonomy toward bio-inspired approaches that rely on sound. By combining material innovations, adaptive flight mechanics, and advanced algorithms, the team seeks to produce compact, energy-efficient drones capable of operating reliably in conditions that would challenge standard robotic systems.

Field deployment is expected within three to five years, potentially transforming how drones are used in emergency response, environmental monitoring, and other situations where visibility is limited.

The original press release can be found here.