
Companies are using augmented reality technologies in their manufacturing processes. Lockheed Martin, for example, uses augmented reality goggles in assembling its space systems for NASA. With the goggles on, technicians can see relevant information and instructions in the space around them as they go about their work, saving them from having to constantly walk back and forth to consult physical manuals or computer monitors.

Now, the US DoD is advancing the development of an artificial intelligence system able to scan instruction manuals and convert their contents into instructions for augmented reality systems.

The Defense Advanced Research Projects Agency has issued a $5.8 million contract to a team developing the technology. Under the contract, PARC, a Xerox company, will work with the University of California, Santa Barbara, the University of Rostock in Germany, and Patched Reality on the Autonomous Multimodal Ingestion for Goal-Oriented Support (AMIGOS) project for the Perceptually-enabled Task Guidance (PTG) program.

The goal is to take the existing paper and video manuals used today and automatically convert them for use in augmented reality systems.

According to the project leader, “Augmented reality, computer vision, language processing, dialogue processing and reasoning are all AI technologies that have disrupted a variety of industries individually but never in such a coordinated and synergistic fashion. By leveraging existing instructional materials to create new AR guidance, the AMIGOS project stands to accelerate this movement, making real-time task guidance and feedback available on-demand.”

The teams will deliver two different but related systems to DARPA. The first is an artificial intelligence system that will be able to extract task information from texts, illustrations, and videos. 

The second system will take that information and create augmented reality guidance based on it. Moreover, the second AI will be able to deliver tasks and information in a personalized way, adapted to the user’s skills and emotional state.
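The two-stage division described above could be sketched as a simple pipeline: one component extracts structured task steps from manual text, and a second turns those steps into AR guidance records tailored to the user. This is purely an illustrative sketch under assumed names and fields; none of the functions or data shapes here come from the actual AMIGOS systems.

```python
import re

def extract_steps(manual_text):
    """Stage 1 (hypothetical): pull numbered task steps out of plain manual text.
    Matches lines like '1. Do X' or '2) Do Y'."""
    steps = []
    for line in manual_text.splitlines():
        match = re.match(r"\s*(\d+)[.)]\s+(.*)", line)
        if match:
            steps.append({"step": int(match.group(1)),
                          "instruction": match.group(2).strip()})
    return steps

def to_ar_guidance(steps, skill_level="novice"):
    """Stage 2 (hypothetical): turn extracted steps into AR overlay records,
    adjusting the level of detail to the user's skill level."""
    guidance = []
    for step in steps:
        guidance.append({
            "order": step["step"],
            "overlay_text": step["instruction"],
            "show_detail": skill_level == "novice",  # novices see expanded detail
        })
    return guidance

manual = """1. Remove the access panel.
2) Disconnect the battery cable.
3. Torque the fasteners to 12 Nm."""

steps = extract_steps(manual)
guidance = to_ar_guidance(steps)
```

A real system would of course use computer vision and language models rather than a regex, but the sketch shows where personalization (the `skill_level` parameter) would enter the second stage.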