DARPA has selected Northrop Grumman and the University of Central Florida to develop a prototype augmented reality headset embedded with an AI assistant to help train rotorcraft pilots to deal with unexpected tasks and emergencies.
One of the challenges of being a military pilot is that the job requires a lot of multitasking and a very high level of continuous attention. Where civilian pilots act largely as airborne administrators making sure that their aircraft, passengers, and cargo get to their destination safely, military pilots have to deal with tactical situations in which emergencies can arise at any moment.
The problem is training pilots to recognize a particular situation and react correctly. A common remedy is to install some sort of alarm that goes off if there’s a fire, an unseen obstacle ahead, a missile radar lock-on, or some other threat. But this isn’t enough. The pilot must also know that the alarm is important, what it signifies, and what action to take.
For decades, this has been a problem for designers. For example, tests conducted in the 1980s showed that if people are placed in a room in an unfamiliar building and the fire alarm goes off, they’ll simply sit there for several minutes wondering what’s going on. This is why there are fire drills, so the occupants of a building know what a fire alarm sounds like and what to do when it sounds, and why many modern alarms unambiguously announce, “There’s a fire. Please go to the nearest marked exit.”
For military pilots, it is even worse. They have to deal with a symphony of alerts and alarms that can impose an unanticipated, counterproductive cognitive burden, with the result that important alarms are not only ignored but not even heard.
Part of DARPA’s Perceptually-enabled Task Guidance (PTG) program, the Operator and Context Adaptive Reasoning Intuitive Assistant (OCARINA) prototype will be designed to support UH-60 Black Hawk helicopter pilots, who fly day and night in all weather, visually and by instrument, while operating close to the ground amid buildings, trees, terrain, and hostile radar beams seeking a target.
The PTG AI assistant will be developed to see what the pilot sees, using advanced information processing and an augmented reality interface in the headset to provide feedback and guidance in the form of graphics projected onto the pilot’s view, along with text and speech. In this way, pilots can be drilled in new tasks in a realistic manner by an assistant that adapts to a given situation.
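The article does not describe OCARINA’s internals, but the interaction it outlines (detect an event, then deliver guidance as projected graphics, text, and speech) can be illustrated with a minimal sketch. The Python below is purely hypothetical: every alert type, field name, and guidance string is an assumption made for illustration, not part of the DARPA program or Northrop Grumman’s design.

```python
# Hypothetical illustration only -- not OCARINA's actual implementation.
# It maps a detected cockpit alert to the three guidance channels the
# article describes: a projected graphic, headset text, and speech.
from dataclasses import dataclass
from enum import Enum, auto


class Alert(Enum):
    FIRE = auto()
    OBSTACLE_AHEAD = auto()
    RADAR_LOCK = auto()


@dataclass
class Guidance:
    graphic: str   # symbol to project onto the pilot's view
    text: str      # short caption shown in the headset
    speech: str    # spoken instruction


# A fixed lookup table for illustration; a real assistant would reason over
# the pilot's view, aircraft state, and mission context instead.
GUIDANCE_TABLE = {
    Alert.FIRE: Guidance(
        "highlight_fire_handle", "ENGINE FIRE",
        "Engine fire. Pull the highlighted fire handle."),
    Alert.OBSTACLE_AHEAD: Guidance(
        "outline_obstacle", "OBSTACLE AHEAD",
        "Obstacle ahead. Climb now."),
    Alert.RADAR_LOCK: Guidance(
        "threat_bearing_arrow", "RADAR LOCK",
        "Radar lock detected. Break toward the arrow and deploy countermeasures."),
}


def respond(alert: Alert) -> Guidance:
    """Return the guidance bundle for a detected alert."""
    return GUIDANCE_TABLE[alert]


if __name__ == "__main__":
    g = respond(Alert.RADAR_LOCK)
    print(g.text, "->", g.speech)
```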
“The goal of this prototype is to broaden a pilot’s skill set,” said Erin Cherry, senior autonomy program manager, Northrop Grumman. “It will help teach new tasks, aid in the recognition and reduction of errors, improve task completion time, and most importantly, help to prevent catastrophic events.”
AI assistant to teach military pilots how to deal with the unexpected [New Atlas]