
The emergence of next-generation immersive devices like the Oculus Rift and Microsoft HoloLens has increased interest in using mixed reality to simulate training, enhance command and control, and improve the effectiveness of fighters on the battlefield.
The US Army Research Lab (ARL) is conducting experiments to assess the usefulness of virtual and augmented reality (VR and AR) systems for soldiers. However, the researchers note, there is little scientific evidence that immersive technology provides measurable benefits such as increased task engagement or improved decision accuracy, and the effectiveness of these technologies for training is difficult to measure, according to photonics.com.
In response, ARL developed the Mixed Reality Tactical Analysis Kit, or MRTAK, an experimental platform for assessing the value of AR and VR during collaborative mission planning and execution. MRTAK is now being developed further as the mixed reality module of project AURORA (Accelerated User Reasoning for Operations, Research, and Analysis), under the name AURORA-MR.
AURORA-MR serves as a testbed for tightly controlled basic and applied research on multi-user decision-making with distributed immersive systems.
Putting mission-relevant battlefield data, such as satellite imagery or body-worn sensor information, into an immersive environment could allow soldiers to retrieve information, collaborate, and make decisions more effectively than with traditional methods. The research conducted with AURORA-MR could lead to a better understanding of when battlefield information is best visualized and interacted with in an immersive system rather than through traditional systems.
“Through virtualization of some or all elements of the Tactical Operations Center, commanders and intelligence analysts can communicate and collaborate without the constraints of a physical building and with a reduced footprint to enemy intelligence, surveillance, and reconnaissance,” researcher Mark Dennison said.
The design of AURORA-MR enables easy integration with other databases, sensors, and machine learning tools, so that joint research can occur across ARL and its academic and industry partners.
“Currently, we are evolving the network powering the platform, to allow for greater control over the information that is sent and received by clients, while ensuring that the virtual environment is rendered at a comfortable frame rate to minimize the crippling effects of motion sickness on immersed users,” Dennison said. “This will enable us to conduct research on how ingestion and analysis of data from noisy systems, such as the Internet of Battlefield Things, can be augmented through distributed collaboration in mixed reality.”