
Researchers from the U.S. Army Combat Capabilities Development Command Army Research Laboratory (CCDC ARL) led experiments in which a Soldier used only hand gestures and head gaze (i.e., where the Soldier was looking) to deliver commands tasking an autonomous robot with intelligence, surveillance and reconnaissance missions.

The experiments were held in New York during an international strategic challenge, the Technical Cooperation Program Contested Urban Environment, or TTCP CUE 2019.

CCDC ARL’s Dr. Christopher Reardon, who leads the Soldier-signaled ground robot navigation and mapping, or SSGRNM, team, said the “artificial intelligence-enabled robot navigated and explored autonomously under the Soldier’s direction.” “The robot’s sensor readings were used to map the urban environment and locate an object of interest, and this information – including map, robot’s location and plan and object location – was transmitted to the human and displayed on a head-worn augmented reality device,” Reardon said.

The information provided situational awareness for the Soldier as he walked through the environment with the robot. “Conducting experiments in these real urban environments in one of the biggest cities in the world allowed participants to explore the limits of their technologies,” Reardon said.

The Army team collaborated closely with Australia’s Defence Science and Technology Group, led by Geoffrey White and Ki Ng. “Our Remote Autonomous Systems of the future will need to protect the Soldier and enable them to fight with a technological overmatch,” White said. “These systems allow the Soldier to have a virtual presence, removing them from the harm of an initial contact. Gesture and other modalities of control such as speech and haptics allow these autonomous systems to be controlled in a more intuitive and effective manner, without distracting from the task at hand. Working collaboratively with our coalition partners means we can leverage each other’s expertise, reduce duplication and focus on our combined problems to bring our soldiers home safely.”

The technology combined emergent human-robot teaming technologies, mixed reality and gesture control, with fully autonomous robots in field environments, and the system was tested with Soldier input.