In line with its strategy of innovation for the future of vertical flight, Airbus Helicopters is developing an experimental on-board image processing management system aimed at performing automatic approaches and landings in challenging conditions, such as in urban environments, and at paving the way for future sense-and-avoid applications on autonomous vertical take-off and landing (VTOL) systems.
Codenamed Eagle, for Eye for Autonomous Guidance and Landing Extension, the system federates all of the helicopter's image-processing functions and feeds them into the avionics system, improving the crew's situational awareness and reducing the pilot's workload by automating and securing approaches, take-offs and landings in the most demanding environments.
Ground tests of Eagle have been ongoing since May this year, and initial flight tests on a testbed helicopter will begin shortly, according to the company's website.
The system, which could be embedded in a variety of existing and future Airbus VTOL vehicles, relies on a gyro-stabilized optronics package comprising three high-resolution cameras and state-of-the-art processing units, along with on-board video analytics providing advanced functions such as object detection and tracking, digital noise reduction and deep learning.
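The article does not describe Eagle's actual algorithms, but the detection-and-tracking capability it mentions can be illustrated with a minimal sketch: associating per-frame detections into persistent tracks by nearest-centroid matching. This is a hypothetical simplification for illustration only, not Airbus's processing pipeline.

```python
import math

class CentroidTracker:
    """Toy multi-object tracker: matches each frame's detections to
    existing tracks by nearest centroid. A simplified stand-in for the
    kind of on-board detection-and-tracking analytics the article
    describes; not Airbus's actual algorithm."""

    def __init__(self, max_distance=50.0):
        self.next_id = 0
        self.tracks = {}          # track_id -> (x, y) centroid
        self.max_distance = max_distance

    def update(self, detections):
        """detections: list of (x, y) centroids from a detector.
        Returns the mapping track_id -> centroid after matching."""
        unmatched = list(detections)
        for tid, (tx, ty) in list(self.tracks.items()):
            if not unmatched:
                break
            # Greedy nearest-neighbour match for this track
            d, best = min(
                (math.hypot(x - tx, y - ty), (x, y)) for (x, y) in unmatched
            )
            if d <= self.max_distance:
                self.tracks[tid] = best
                unmatched.remove(best)
        # Detections matching no existing track start new tracks
        for pt in unmatched:
            self.tracks[self.next_id] = pt
            self.next_id += 1
        return dict(self.tracks)
```

For example, feeding two frames of detections keeps track identities stable as long as objects move less than `max_distance` pixels between frames; a real system would add motion prediction and appearance features on top of this association step.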
Future versions of the Eagle system will also integrate a laser, which, combined with the system's high processing capability, could open the door to other applications such as a new generation of searchlights, obstacle detection and 3D terrain reconstruction.
“While existing missions such as search and rescue and offshore transportation will benefit from Eagle’s capabilities, the system will also help address future requirements for operations in urban environments,” said Tomasz Krysinski, Airbus Helicopters Vice-President Research & Technology. “Ultimately, thanks to its ability to provide increased situation awareness, Eagle will also contribute to improve the safety, autonomy and performance of future unmanned vehicles.”