Thales’ research and development engineers showed a business jet concept of FlytX during the company’s InnovDays 2019 event in Paris.
The first virtual co-pilot powered by artificial intelligence (AI) could be ready for next-generation business jet cockpit avionics within the next few years, according to engineers working on the industrial and regulatory challenges of making AI a reality inside future safety-critical aircraft systems.
Eurocontrol hosted the first in a series of “Fly AI” webinars on Feb. 24 designed to foster more collaboration and address the regulatory barriers and challenges that currently exist around establishing a certification path for the use of artificial intelligence and machine learning in safety-critical aircraft systems and applications. Baptiste Lefevre, advanced technologies regulation manager for Thales, said that the French avionics supplier first approached EASA about a version of its FlytX cockpit system that will feature an AI assistant back in November 2018.
Now, the two sides are among a range of government and industry stakeholders participating in a series of working groups tasked with establishing an agreed-upon process for demonstrating that newly developed artificial intelligence applications can meet the same type of certification requirements applied today to new and novel aircraft technologies. Lefevre said that the timeline they have established for the use of an AI co-pilot in the next-generation version of the FlytX cockpit matches EASA’s roadmap for certification.
“We foresee the development of an augmented pilot assistant by 2025, artificial pilot, reduced operations, or single-pilot operations by 2030, and autonomous aircraft by 2035,” Lefevre said. “And it’s really the basis of EASA to develop their regulatory roadmap and development guidance with first usable guidance in 2021 for human assistant to have first approval by 2025.”
FlytX is an integrated modular avionics system first developed by Thales for the French military’s fleet of H160 helicopters. All of the processing hardware normally housed in an aircraft’s electrical equipment bay is eliminated through FlytX’s use of smart displays that provide the computing for all communications, navigation, and surveillance applications.
According to Lefevre, another new element that will become an AI enabler for FlytX is its native connection to “the open world,” where environmental and air traffic system data is available to pilots on a per-flight basis. The main goal for FlytX’s AI pilot assistant is to take on routine tasks for the human pilot, such as frequency selection, or to suggest actions based on a change in aircraft behavior or the external flight environment.
Lefevre believes a reliance upon science will be key to establishing a regulatory framework for the artificial intelligence that will be featured in FlytX, which will also feature machine learning capabilities.
“Machine learning is a new science, it’s still developing and there are still a lot of scientific challenges so we have to work on it first on the industry side. But also, work is needed on the regulatory side because it’s very likely that the demonstration that will provide a measure of safety or quality of our product will be based on science. To enable that, we have to share together what are the scientific knowledge that is needed for AI certification. It is not only about regulation, not only about processes: but also about science,” Lefevre said.
On the regulatory side, EASA took its first major step last year with the release of its first AI roadmap, the beginning of a regulatory framework under which applicants can apply for and demonstrate their AI system’s ability to meet safety-critical certification requirements. At the time, the agency confirmed in a statement to Avionics that it had received the first formal applications for the certification of AI-based systems in 2019 and that it expects the first certification to occur in 2025.
The next milestone in EASA’s regulatory framework roadmap comes this year, with the agency expected to provide its first official guidance for the certification of AI-based systems, Guillaume Soudain, senior software expert for EASA, said during the webinar. Soudain confirmed the agency is still on track to meet the 2021 timeline it set out last year despite the challenges of working under the impact of COVID-19.
“We’re still on the way to publish it by end of March [or] beginning of April. We’re still on time,” Soudain said.
Both Soudain and Lefevre pointed to two key working groups that are currently building a common understanding of certification objectives and industry standards for AI, while also sharing knowledge on their latest AI research results. Eurocae’s Working Group 114 (WG-114), “Artificial Intelligence,” was established in June 2019 to help aerospace manufacturers and regulatory agencies implement common-sense approaches to certifying the non-deterministic qualities of AI used within safety-critical avionics software.
The DEEL Working Group, based in Quebec, also has participation from EASA, Thales, and academic researchers, data scientists, and engineers tasked with solving scientific challenges raised by industrial partners trying to obtain certification for AI systems featured in aircraft. Examples of the DEEL group’s research areas include the number of tests needed to reach an acceptable level of confidence, the concept of AI “explainability,” and how to monitor machine learning at run time.
Soudain said EASA’s focus in the roadmap is safety-critical applications as well as how to address the need for explainability associated with AI systems. EASA is also using what it calls “Innovation Partnership Contracts” as a way of establishing dialogue with applicants developing new AI technologies as early as the development stage.
“As a regulator, we have the duty to create the framework, we will be working on topical objectives to be achieved to go enable the approval capabilities. The means of compliance themselves will come from the operational flow and our stakeholders,” Soudain said. “Every day I’m pleased to receive emails with research papers and new considerations on whatever methodology can be applied to explainability, it’s a constant exchange and we’re getting a lot and that’s where the partnerships are so important.”