Artificial intelligence and machine learning technologies enhance future security systems.
(Source: Public Domain / Unsplash)

New research project on AI/ML Guidance

Editor: Florian Richert

The Swiss company Daedalean AG has signed a further innovation partnership agreement with EASA. The new research project builds on a previous project in which AI and ML concepts were developed. The goal of the follow-on project is to refine the learning assurance concepts and the embedding of AI in safety-critical applications.

European public authorities want to encourage and support the development of a diverse technological landscape. As early as 2018, EASA set up an AI (artificial intelligence) task force charged with establishing how "AI-based" applications can be certified under European law and with developing an AI roadmap. In early 2019, the AI task force adopted the European Commission's Ethics Guidelines for Trustworthy AI. EASA published its AI Roadmap in February 2020.

Pioneering AI in aviation and avionics

In January 2020, the joint research work of the EASA and Daedalean expert group concluded the ten-month first phase of the project. The project examined the challenges arising from the application of neural networks (NN) in aviation in connection with the certification of machine learning on board aircraft. The aim was to create concepts and safety standards for the application of AI in safety-critical avionics. The joint report Concepts of Design Assurance for Neural Networks (CoDANN) was published in March 2020. The core of the report is a set of guidelines for "Learning Assurance" (as opposed to the traditional "Development Assurance"), which provide the first building blocks for the future certification of AI systems.

Learning assurance guidelines for adapting evolving technologies.
(Source: Daedalean)

The report included an overview of realistic performance and safety assessments used to define failure tolerances, data set sizes, and further specifications for the appropriate safety levels.
The quantitative analyses indicated that safety for neural networks can be assured at the appropriate levels of criticality.
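The report itself contains the quantitative details; purely as an illustration of why failure tolerances and data set sizes are linked, the following sketch (not drawn from CoDANN, with an assumed helper name required_samples and assumed failure-rate targets) computes the classical zero-failure binomial bound: how many independent test samples are needed to claim, at a given confidence, that a system's failure rate does not exceed a target.

```python
import math

def required_samples(max_failure_rate: float, confidence: float) -> int:
    """Minimum number of independent test samples needed to claim, with the
    given one-sided confidence, that the true failure rate does not exceed
    max_failure_rate -- assuming zero failures are observed during testing."""
    # Zero-failure bound: (1 - p)^n <= 1 - confidence
    #                  => n >= ln(1 - confidence) / ln(1 - p)
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - max_failure_rate))

# Illustrative targets only: tighter failure tolerances quickly demand
# much larger test sets, demonstrated here at 99% confidence.
for rate in (1e-3, 1e-4, 1e-5):
    print(f"failure rate <= {rate:g}: {required_samples(rate, 0.99):,} samples")
```

Real avionics safety levels are far more demanding than these illustrative figures, which is one reason dedicated learning assurance guidance is needed rather than testing alone.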

Applicable guidelines for safety-critical ML applications

The new project will implement and refine the concepts developed in the first phase for the requirements of Level 1 AI/ML (machine learning) applications in the practical context of specific use cases and scenarios. The resulting report, scheduled for early 2021, will include proposed guidelines for NN-based systems. In particular, the inclusion of NNs in safety assessments and the refinement of the concept of "explainability" in a practical context will receive special attention.

"We are very happy to partner with EASA for the next phase in our joint effort on the certification of machine-learned systems for avionics," said Luuk van Dijk, CEO and founder of Daedalean. "In the first project, EASA has shown a firm commitment to moving this topic forward in the interest of all of the aviation industry. And for the coming IPC, the Agency stepped up with an increased team of strong specialists. We will be taking a concrete in-flight system through a certification trajectory to find the open questions, with the intent to provide concrete usable answers. Daedalean is proud to be EASA's partner in this ambitious undertaking."
