Making UAV Operations Safer

Photo illustration: A ScanEagle UAV mounted on a pneumatic catapult launcher at Marine Corps Air Station Yuma, June 2006. U.S. Marine Corps photo by Cpl. Michael P. Snody, via Wikimedia Commons.


A new technology can improve the safety of UAV flights, especially beyond line of sight (BLOS) operations. The new Automated Emergency Safe Landing (AESL) functionality is designed to capture and classify images at altitude, allowing a UAV to autonomously identify a safe landing area in the event that something goes wrong.

The technology was developed by Black Swift Technologies (BST) for its S2 UAV. The functionality processes large amounts of data quickly and efficiently, identifying objects and terrain that must be avoided so the aircraft can be landed without harm to people or property, according to the company.
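Neither company has published the AESL implementation, but the idea described above can be sketched in a few lines: label tiles of a downward-facing image, mask out anything that must be avoided, and pick the clearest spot. Everything in the sketch below (the label set, the tile grid, the helper names) is a hypothetical illustration, not BST's code.

```python
# Hypothetical sketch of the approach described above, not BST's AESL code:
# classify regions of a downward-facing image and keep only those free of
# objects that must be avoided, then pick the clearest tile to land on.

import numpy as np
from scipy.ndimage import distance_transform_edt

# Assumed label set; the real system's classes are not published.
AVOID_LABELS = {"person", "vehicle", "structure"}

def safe_landing_mask(tile_labels: np.ndarray) -> np.ndarray:
    """Return a boolean grid marking tiles with no objects to avoid.

    tile_labels is an H x W array of string labels, one per image tile,
    as might come from an onboard classifier.
    """
    avoid = np.isin(tile_labels, list(AVOID_LABELS))
    return ~avoid

def pick_landing_tile(mask: np.ndarray):
    """Pick the safe tile farthest from any unsafe tile (a simple heuristic)."""
    if mask.all():
        # Nothing to avoid in view: any tile will do, take the center.
        return (mask.shape[0] // 2, mask.shape[1] // 2)
    # Distance from each safe tile to the nearest unsafe tile.
    distance = distance_transform_edt(mask)
    return np.unravel_index(np.argmax(distance), mask.shape)
```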

While the functionality is currently exclusive to the company’s purpose-built UAV platform, pairing it with third-party systems could be possible.

Jack Elston, Ph.D., CEO of Black Swift Technologies, said: “The goal of AESL is to be able to take a snapshot and within 60 seconds of something like a catastrophic engine failure, be able to identify a landing zone, calculate a landing trajectory, and safely land a UAS away from people and obstacles. We remain convinced that a thorough understanding and integration of artificial intelligence and machine learning can help serve as a catalyst for accelerating UAS growth and adoption industry-wide.”
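Elston's description maps onto a simple pipeline: snapshot, classify, select a zone, plan a trajectory, and land, all inside a fixed time budget. The outline below is a hypothetical sketch of that sequence; the camera, classifier, planner, and autopilot interfaces are placeholders and do not reflect BST's actual software.

```python
# Hypothetical outline of the emergency sequence Elston describes.
# All objects passed in are assumed interfaces, not real APIs.

import time

TIME_BUDGET_S = 60.0  # the "within 60 seconds" goal quoted above

def emergency_safe_landing(camera, classifier, planner, autopilot):
    start = time.monotonic()

    frame = camera.snapshot()                   # 1. capture ground imagery
    labels = classifier.classify_tiles(frame)   # 2. label people/vehicles/structures
    zone = planner.select_landing_zone(labels)  # 3. choose a clear area
    path = planner.glide_path_to(zone)          # 4. compute a landing trajectory

    elapsed = time.monotonic() - start
    if zone is None or elapsed > TIME_BUDGET_S:
        autopilot.execute_fallback()            # e.g. a pre-set contingency descent
    else:
        autopilot.fly(path)                     # 5. land away from people and obstacles
```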

According to auvsi.org, the technology was developed with the support of a NASA SBIR grant awarded to the company. BST also leveraged an ongoing collaboration with Luxonis, a Colorado-based technology company that specializes in embedded machine learning, artificial intelligence, and computer vision.

Brandon Gilles, CEO of Luxonis, explains: “This technology uses video or still imagery of the ground to determine what those objects are, and classifies them as humans, vehicles, and/or structures—things you have to avoid at all costs, even if it’s at the expense of the aircraft—to identify safe landing areas for a UAV in distress.”

“Leveraging machine vision and artificial intelligence, AESL enables a human-like perception of the world where autonomy doesn’t have to rely entirely on GPS, altimeters, or the like. This system can visually understand what’s around it and make decisions accordingly, in real-time.”
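As a rough illustration of that real-time, vision-only behavior, a descent loop might re-evaluate each camera frame and steer toward the clearest region without consulting GPS. Every interface named below is a hypothetical placeholder rather than Luxonis' or BST's implementation.

```python
# Hypothetical sketch of a vision-only descent loop: re-check the landing
# area from camera frames during descent and steer using imagery alone.

def vision_only_descent(camera, classifier, planner, autopilot):
    """Continuously re-evaluate the chosen landing area while descending."""
    while not autopilot.on_ground():
        frame = camera.latest_frame()
        mask = classifier.safe_mask(frame)       # which image regions are clear
        target_px = planner.track_target(mask)   # target location in pixel space
        if target_px is None:
            target_px = planner.reselect(mask)   # target became unsafe: pick another
        autopilot.steer_toward_pixel(target_px)  # guidance from the camera, not GPS
```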