In May, Piper Aircraft announced that its M600/SLS had become the first Garmin Autoland-equipped aircraft to receive certification. Autoland is available on select aircraft equipped with the G3000 flight deck. During a Nov. 12 webinar hosted by the Royal Aeronautical Society (RAeS), Wes Ryan, unmanned and pilotless aircraft technology lead at the FAA, said Autoland is the clearest evidence of artificial intelligence starting to establish a presence in new aircraft systems. (Garmin)
While artificial intelligence/machine learning (AI/ML) is a priority for technologists seeking to advance military, civil, and commercial aerospace, machines are unlikely to be running the air and space domains any time soon.
AI/ML algorithms so far lack the cognitive abilities required to make trusted, safety-critical decisions. In terms of “general artificial intelligence,” aviation authorities and aerospace firms are “decades away from something like that,” said Alex Georgiades, the United Kingdom Civil Aviation Authority’s (CAA) innovation services principal for autonomous systems.
Civil aviation authorities, including the U.K. CAA and the U.S. Federal Aviation Administration (FAA), have been bounding AI/ML systems with constraints so that a chain reaction cannot lead to a catastrophic failure.
“The place that we’re just beginning to delve into is systems like the recently certified Garmin Autoland system that in the event of an emergency can look for and identify the best landing site to put the aircraft back on the ground with a single button push,” said Wes Ryan, the FAA’s unmanned and pilotless aircraft technology lead. “The key to certifying that system was to bound where it can be used and how it can be used so that we could gain some experience with its use to show that it can be used safely. That assistive or emergency use idea is the first step for the on ramp to using this type of technology for more regular use, more critical use in the future.”
The next step is likely to be increased scrutiny of how AI/ML functions under such bounded conditions on an aircraft. Understanding how AI/ML performs in whatever limited role it plays on an aircraft could yield significant safety benefits.
“The problem is the black box where we don’t fully understand what’s going on, but something has happened–we may like the result; we may not like the result,” said Monty Christy, the founder of London-based Christy Aerospace and Technology. “I think we should be striving towards a white box where we know everything about what’s gone on inside [the box] and why it’s come to a decision, why the AI system has made a decision. We know the input, and we know the output, and what’s gone on in between. Traceability is key because with full traceability we know where it’s gone wrong; we know why it’s gone wrong, and we can stop it before it gets to any critical region, particularly the safety critical environment.”
A “white box” would allow personnel to identify and correct the inadvertent automation biases that systems inherit from their data sets and existing programming languages. The hope is that programmers will be able to write a new AI/ML language that lacks such biases.
One significant challenge is retrofitting traceability into mature AI/ML systems that lack it.
“We’re not talking about taking an off-the-shelf program and modifying it for our own purposes in the aviation/aerospace industry,” Christy said. “We may be talking about a new language that’s built from scratch to provide that traceability and the identification and removal of bias on Day One when a code is put together to support a system.”
Developing AI/ML systems that lack automation biases and that can reliably analyze and correlate data to aid pilot decision-making will be a challenge, as will ensuring that such systems are not granted too much autonomy.
For its part, Boeing has spent the last two years making changes to its Maneuvering Characteristics Augmentation System (MCAS) automated flight control system after two fatal 737 MAX crashes. MCAS had pushed the noses of the planes down based on faulty sensor readings, and the pilots could not regain control.