December 17, 2021
PARC, a Xerox company, announced it has been awarded a $5.8 million (€5.1 million) contract by the Defense Advanced Research Projects Agency (DARPA). PARC will work with the University of California, Santa Barbara, the University of Rostock and Patched Reality, an augmented reality company, to develop an Artificial Intelligence (AI) system that guides users through complex physical tasks beyond their current abilities.
The system will convert text- and video-based manuals into a machine-readable format, then use augmented reality (AR) guidance to walk users through tasks while also monitoring task execution.
PARC will be the lead contractor on the project, known as “Autonomous Multimodal Ingestion for Goal-Oriented Support” (AMIGOS), under the Perceptually-enabled Task Guidance Program. The project will introduce new methods to extract procedural knowledge from content, perceive the environment, reason about physical tasks, and deliver conversational and AR guidance personalised to an individual user’s level of expertise. The goal is to enable mechanics, medics and other specialists to perform tasks within and beyond their skill sets by providing just-in-time feedback and instructions for physical tasks.
“Augmented reality, computer vision, language processing, dialogue processing and reasoning are all AI technologies that have disrupted a variety of industries individually but never in such a coordinated and synergistic fashion,” said Dr. Charles Ortiz, the principal investigator for AMIGOS. “By leveraging existing instructional materials to create new AR guidance, the AMIGOS project stands to accelerate this movement, making real-time task guidance and feedback available on-demand.”
The project will deliver two major systems to DARPA. The first will be an offline component that learns from multiple modalities, such as language and vision, to extract the steps required to complete a task from unlabelled content, including text instructions from manuals, illustrations and instructional videos.
The second will be an online component that uses a hybrid AI approach combining both symbolic and neural AI elements to create interactive AR guidance based on the information extracted by the offline component. The guidance delivered by the system will be personalised to the individual user’s abilities and sensitive to the user’s emotional state during performance.
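As a rough illustration of the two-component split described here, the offline stage might reduce instructional material to an ordered list of steps that the online stage then serves to the user interactively. The sketch below is purely hypothetical (the class and function names are not from the project, and it handles only plain text, not the multimodal and neural elements the article describes):

```python
import re
from dataclasses import dataclass

@dataclass
class Step:
    index: int
    instruction: str

def extract_steps(manual_text: str) -> list[Step]:
    # Offline component (illustrative only): pull numbered steps out of
    # plain-text instructions. The actual AMIGOS system learns from
    # multiple modalities (text, illustrations, video).
    lines = re.findall(r"^\s*\d+\.\s*(.+)$", manual_text, re.MULTILINE)
    return [Step(i, line.strip()) for i, line in enumerate(lines)]

class GuidanceSession:
    # Online component (illustrative only): walks a user through the
    # extracted steps, issuing the next instruction on request and
    # tracking which steps are done.
    def __init__(self, steps: list[Step]):
        self.steps = steps
        self.current = 0

    def next_instruction(self) -> str:
        if self.current >= len(self.steps):
            return "Task complete."
        return self.steps[self.current].instruction

    def mark_done(self) -> None:
        self.current += 1

manual = """
1. Remove the access panel.
2. Disconnect the battery.
3. Replace the fuse.
"""
session = GuidanceSession(extract_steps(manual))
```

In this toy version the "guidance" is just the next line of text; in the system the article describes, that slot would be filled by AR overlays personalised to the user's expertise and state.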