One of the major challenges for fully or semi-autonomous vehicles is that they may not be able to accurately predict the behavior of other self-driving and human-driven vehicles. This prediction is essential for properly navigating autonomous vehicles on roads.
A new technology enables autonomous systems to respond to human emotions based on machine-learned human moods. Adaptive Mood Control provides a convenient, pleasant, and, more importantly, trustworthy experience for humans who interact with autonomous vehicles. The technology can be used in a wide range of autonomous systems, including self-driving cars, autonomous military vehicles, autonomous airplanes and helicopters, and even social robots.
The solution has been developed by Mehrdad Nojoumian, Ph.D., an associate professor in the Department of Electrical Engineering and Computer Science and director of the Privacy, Security and Trust in Autonomy Lab at Florida Atlantic University. “The uniqueness of this invention is that the operational modes and parameters related to perceived emotion are exchanged with adjacent vehicles for achieving objectives of the adaptive mood control module in the semi or fully autonomous vehicle in a cooperative driving context,” he said.
“Human-AI/autonomy interaction is at the center of attention by academia and industries. More specifically, trust between humans and AI/autonomous technologies plays a critical role in this domain, because it will directly affect the social acceptability of these modern technologies.”
The patented technology uses non-intrusive sensory solutions in semi- or fully autonomous vehicles to perceive the mood of drivers and passengers. Information is collected from facial expressions, sensors embedded in the handles and seats, and thermal cameras, among other monitoring devices. Additionally, the adaptive mood control system contains real-time machine-learning mechanisms that continue to learn the drivers’ and passengers’ moods over time. The results are then sent to the autonomous vehicle’s software system, allowing the vehicle to respond to perceived emotions by choosing an appropriate mode of operation, such as a normal, cautious, or alert driving mode, according to fau.edu.
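The final step of that pipeline, mapping perceived emotion to a driving mode, can be sketched roughly as follows. This is an illustrative assumption only, not the patented implementation: the emotion scores, thresholds, and function names are hypothetical, standing in for whatever the real machine-learning mechanism produces.

```python
from enum import Enum


class DrivingMode(Enum):
    """The three operating modes named in the article."""
    NORMAL = "normal"
    CAUTIOUS = "cautious"
    ALERT = "alert"


def select_mode(fear: float, discomfort: float) -> DrivingMode:
    """Map perceived passenger emotion scores (0.0 to 1.0) to a driving mode.

    The scores would come from the mood-perception layer (facial
    expressions, seat/handle sensors, thermal cameras); the threshold
    values here are arbitrary placeholders for illustration.
    """
    if fear > 0.7 or discomfort > 0.8:
        return DrivingMode.ALERT
    if fear > 0.4 or discomfort > 0.5:
        return DrivingMode.CAUTIOUS
    return DrivingMode.NORMAL
```

In a cooperative driving context, as the inventor describes, the selected mode and the emotion parameters behind it would additionally be exchanged with adjacent vehicles rather than used only locally.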