
The ability of sensors to perceive things through the fog will be essential for future air vehicles such as urban air taxis. Instruments such as optical and infrared cameras, radar, and lidar devices will become their “eyes.” Fog presents a major challenge to those sensors, and how well today’s technology can take it in stride is an important question. 

The answer will drive the next phase of their development and help these aircraft fly autonomously.

Fog is an extreme environment for perception technology, and it affects each type of sensor to a different degree. With their different strengths and weaknesses, optical, radar, lidar, and other systems complement one another.

The signals emitted by a lidar device scanning an area for a safe landing spot might reflect off the water droplets in fog, instead of the objects they’re meant to detect. Each of the sensors that might be used on unpiloted passenger aircraft in the future is impacted differently by fog, and designers need to know how they are different.
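The scattering effect described above can be illustrated with a toy attenuation model. The sketch below applies the Beer-Lambert law to the two-way lidar path; the extinction coefficient is an illustrative assumption, not a figure from the NASA study, and the model ignores target reflectivity and backscatter off the droplets themselves.

```python
import math

def lidar_return_fraction(range_m: float, extinction_per_m: float) -> float:
    """Fraction of emitted lidar power surviving the round trip to a
    target and back, per the Beer-Lambert law (toy model only)."""
    return math.exp(-2.0 * extinction_per_m * range_m)

# Illustrative extinction coefficients (assumed, not measured values):
# near zero for clear air, a few hundredths per metre for dense fog.
clear = lidar_return_fraction(100.0, 0.0001)  # target 100 m away, clear air
foggy = lidar_return_fraction(100.0, 0.03)    # same target, dense fog
print(f"clear-air return: {clear:.3f}, foggy return: {foggy:.5f}")
```

Even this crude model shows why range collapses in fog: at 100 m the foggy return is roughly 0.2% of the clear-air value, so the useful signal can fall below the sensor's noise floor.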

A team from NASA’s Ames Research Center in California’s Silicon Valley conducted their study in a special fog chamber at Sandia National Laboratories that can repeatedly produce, with scientific precision, fog of a specific density.

The team studied the capabilities of various sensors under fog conditions. These tests of how far and how well today’s technology can see in foggy weather will help answer how safe an aircraft relying on it would be. NASA will release the data for use by companies and researchers working to develop information-processing techniques and improve sensors for Advanced Air Mobility vehicles. They need this kind of data to build accurate computer simulations, discover new challenges, and validate their technology for flight.

A fusion of different sensors, combined in the smartest ways, will help make the market opened by Advanced Air Mobility a safe, productive reality, according to NASA.
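One common way to combine complementary sensors is inverse-variance weighting: each sensor's estimate is trusted in proportion to its precision, so a radar that stays coarse-but-usable in fog can outweigh a lidar whose readings have become noisy. The sketch below is a minimal illustration of that standard technique, with hypothetical numbers; it is not drawn from the NASA study.

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of independent estimates.
    `estimates` is a list of (value, variance) pairs, one per sensor.
    Returns the fused value and its (smaller) fused variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Hypothetical range-to-obstacle readings in fog (metres, variance):
radar = (52.0, 4.0)  # radar penetrates fog but is coarser
lidar = (50.0, 1.0)  # lidar is normally precise; variance grows in fog
fused_value, fused_var = fuse_estimates([radar, lidar])
print(f"fused range: {fused_value:.1f} m, variance: {fused_var:.2f}")
```

The fused variance is lower than either sensor's alone, which is the formal sense in which complementary sensors make the combined system more trustworthy than any single one.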