Reversed Augmented Reality Tested

Augmented reality and virtual reality are expected to become a multi-billion dollar market for the US government by 2021, and augmented reality technology is gaining ground across the US military and government sector. In addition to Microsoft’s HoloLens head-mounted display contract, BAE Systems is developing an augmented-reality system onboard a Royal Navy warship as part of a £20 million ($27 million) investment in advanced combat systems technology.

Augmented reality tools are commonly seen as mapping new features onto existing terrain, but what if they could also do the opposite? What if augmented reality were used to reveal features hidden by the existing terrain?

In a new AR development, the processing struggles most with vehicles that don’t conform to its existing expectations. Trained on a similar data set, a military heads-up display could flag technical vehicles as distinct from passenger vehicles, or identify up-armored cars and unusual load configurations.
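
The flagging logic itself is easy to sketch. The snippet below is a hypothetical illustration, not code from the project described here: it runs a stock torchvision detector over an image and flags any confident detection whose class falls outside an assumed set of ordinary passenger vehicles. A fielded heads-up display would need a model trained on the specialized vehicle types mentioned above; the class set, thresholds, and file name are placeholders.

```python
# Hypothetical sketch: flag detections whose class is outside an "ordinary traffic" set.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights
from torchvision.transforms.functional import to_tensor
from PIL import Image

EXPECTED = {"car", "truck", "bus"}                      # classes treated as unremarkable
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
names = weights.meta["categories"]                      # COCO class names
model = fasterrcnn_resnet50_fpn(weights=weights).eval()

def flag_unexpected(image_path, score_threshold=0.6):
    """Return (class, score, box) for confident detections outside the EXPECTED set."""
    img = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        det = model([img])[0]
    flagged = []
    for box, label, score in zip(det["boxes"], det["labels"], det["scores"]):
        name = names[int(label)]
        if score >= score_threshold and name not in EXPECTED:
            flagged.append((name, float(score), [round(v) for v in box.tolist()]))
    return flagged

print(flag_unexpected("convoy.jpg"))                    # hypothetical input image
```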

Applied in a different setting, with, say, a database of trees, the program could provide an immediate visual way to isolate unusual objects in a forest from the natural foliage, and show how such an object would stand in a clearing.

The “Biophillic Vision – Experiment 1” project was created by software developer Chris Harris. What is fascinating is the way it distorts images in real time, with the code detecting cars and backgrounds and generating new imagery to replace them. At the present level of the technology, the effect is a shimmer, a flicker in and out of being. It makes the viewer feel as though the reality they are observing through the camera is an overtaxed video game, unable to hold its entire virtual world together.

To make the effect work, Harris adapted two pieces of open-source code: the first handles vehicle detection, and the second performs image completion. With roughly a day of work putting them together, Harris had an AI that produces images at roughly 2 frames per second. He speculates that with more time it could run in real time on high-end mobile devices, as reported by c4isrnet.com.
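
As a rough illustration of that two-stage pipeline, the sketch below pairs a stock torchvision vehicle detector with OpenCV’s classical inpainting to erase detected cars from a video frame. It is an assumption-laden stand-in, not the specific open-source projects Harris combined: a learned image-completion model would produce far more convincing fills than the simple texture fill used here, and the input file name is hypothetical.

```python
# Minimal sketch of the detect-then-erase idea: find vehicles, mask them, fill the hole.
import cv2
import numpy as np
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
names = weights.meta["categories"]
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
VEHICLES = {"car", "truck", "bus", "motorcycle"}

def erase_vehicles(frame_bgr, score_threshold=0.6):
    """Detect vehicles in a BGR frame and inpaint over their bounding boxes."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        det = model([tensor])[0]

    # Build a mask covering every confident vehicle detection.
    mask = np.zeros(frame_bgr.shape[:2], dtype=np.uint8)
    for box, label, score in zip(det["boxes"], det["labels"], det["scores"]):
        if score >= score_threshold and names[int(label)] in VEHICLES:
            x1, y1, x2, y2 = map(int, box.tolist())
            mask[y1:y2, x1:x2] = 255

    # Fill masked pixels from the surrounding texture; a deep image-completion
    # network would slot in here for more convincing backgrounds.
    return cv2.inpaint(frame_bgr, mask, 5, cv2.INPAINT_TELEA)

cap = cv2.VideoCapture("street.mp4")   # hypothetical input clip
ok, frame = cap.read()
if ok:
    cv2.imwrite("erased.jpg", erase_vehicles(frame))
```

Swapping the classical inpainting call for a learned image-completion model is what would narrow the gap between a crude blur-fill and the shimmering erasure the project’s video demonstrates.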

Since the tools are open source, Harris’ video provides a proof of concept for anyone interested in an augmented reality tool that can isolate objects in the lived environment and illustrate ways around them. More hauntingly, tools that erase objects from digital displays could be put to malicious purposes, causing machines that rely on vehicle detection to see a clear path instead.