Nature-Based Deep Learning Model Inspired by Insects Set to Revolutionize Vehicle Autonomy

While companies all over the world are working on deep learning techniques that will enable machines to operate autonomously, UK-based startup Opteran is on a mission to transform the world of autonomy with what it calls “advanced natural intelligence”.

Opteran has spent a decade trying to understand how insect brains work: how they see, find their way around, navigate, and react to the world. The team then turned these findings into algorithms they call “natural intelligence,” which they believe could completely change how autonomy works, making it more practical and efficient.

“Insects like honeybees have about a million neurons compared to around 86 billion in a human being, but the central systems are there,” explained Opteran’s CEO David Rajan. “They see the world, localize themselves in space and can even navigate up to 10 kilometers consuming just microwatts of power. If you really want to see state of the art autonomy, don’t go to California… look at a garden.”

According to Interesting Engineering, the team aims to enhance the system in the upcoming months with additional functions (like collision avoidance), and by next year they plan to integrate decision-making algorithms that will enable machines to prioritize tasks.

What also makes Opteran’s technology stand out from others is that it doesn’t require extensive training on large datasets before deployment – the algorithms are innate, enabling them to autonomously navigate the world and adapt to dynamic variability without needing continuous data gathering and training.

Setting up an autonomous guided vehicle (AGV) in a warehouse usually involves a lot of work, with operators carefully scanning the entire facility and processing large amounts of data before letting the robots move around the warehouse. The Opteran Mind significantly simplifies the setup process. “All it takes for them to do is to drive one robot at the speed that robot operates and off it goes,” explained Charlie Rance, the company’s chief product officer. “We’re essentially creating a solution that allows their AGVs to autonomously map in one shot, so it’s a very quick setup time and you can share that with the rest of your AGVs. We can remove fixed infrastructure, so we’re not using any kind of reflectors or QR codes or anything. And we’re keeping the system at a lower cost.”

Possible real-world applications include indoor security drones in the consumer market, implementations in the mining and automotive sectors, logistics, and many more.