How Daedalean plans to certify its AI-based autopilot

By Elan Head

Published on: July 30, 2019
Estimated reading time 10 minutes, 15 seconds.

The Swiss startup is using a more explainable form of artificial intelligence as a first step toward fully autonomous eVTOL air taxis.

Recent advances in machine learning and other forms of artificial intelligence (AI) are a key enabler for aeromobility, promising the type of sophisticated autonomy that will be necessary to deploy eVTOL air taxis on a large scale. However, many of these algorithms are essentially “black boxes,” deriving their results in ways that aren’t transparent or explainable, even to their programmers. That makes it difficult, if not impossible, to prove that they’re safe and reliable enough to meet today’s standards for airworthiness certification.

Daedalean recently tested its computer vision system on a Volocopter eVTOL aircraft. Daedalean Image

The Swiss startup Daedalean AG, which is developing AI-based autopilots for future air taxis, thinks it has a way around this problem. Rather than leveraging the most advanced — but also the most opaque — forms of AI, Daedalean is focusing its attention on a subset of convolutional neural networks that do lend themselves to explanation. These neural networks are already powerful tools for image recognition, which gives Daedalean confidence that its vision-based flight control systems will not only outperform human pilots, they’ll also be certifiable in the near term.

“A basic misconception that is widespread in the industry is that AI is necessarily ‘non-deterministic’ and that nobody can possibly understand why it does something when it does something,” explained Daedalean founder and CEO Luuk van Dijk. With the specific convolutional neural networks Daedalean is developing, he said, “we have tools to look at what goes on inside the neural net, so that we are able to make predictable and testable statements about how it will behave in critical situations.”

Van Dijk founded Daedalean in 2016 along with Anna Chernova, a mathematical biologist who is also a licensed helicopter pilot. To develop a roadmap for their technology, they consulted the list of tasks that must be demonstrated successfully in order to obtain a commercial helicopter pilot license. “We thought OK, we’ll just systematically build systems that outperform the human convincingly on each of those, because that will allow us to roll out [an AI-based autopilot] with minimal disruption,” said van Dijk.

Daedalean uses segmentation to train its neural networks to “see” the terrain beneath the aircraft. By classifying features such as trees, buildings, and cars, the system can identify safe emergency landing sites between obstacles. Daedalean Image

According to their analysis, around 30 to 40 percent of those tasks involve the pilot using his or her eyes. For human pilots, van Dijk said, “using your eyes is really fundamental, and it’s also something that computers haven’t been able to do very well until fairly recently.” Because the current regulatory framework assumes “that there’s always a pilot on board with eyes and a visual cortex,” the partners decided to start by developing a computer vision system that can handle those visual tasks — including seeing and avoiding traffic and obstacles and identifying safe landing zones.

This is something for which convolutional neural networks are well suited. These are a specific category of artificial neural networks, in which layers of connected nodes, or “artificial neurons,” perform many successive computations on complex data inputs like visual images. Through training on labeled examples, these algorithms “learn” to answer questions such as, “Is there a cat in this picture? Or, where is the runway?” van Dijk explained. “These convolutional neural networks are actually quite capable of doing this, and in the last couple of years there’s been quite some progress in understanding why they do what they do.” While these explanations aren’t necessarily straightforward, he said, “it’s definitely not the case where we have no idea why it came up with something.”
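
To make the idea concrete, here is a minimal, purely illustrative sketch of a convolutional image classifier in PyTorch. The architecture, the layer sizes, and the “runway / no runway” labels are assumptions made for the sake of the example; they have no connection to Daedalean’s actual networks.

```python
# Minimal, illustrative convolutional image classifier in PyTorch.
# The architecture and the "runway / no runway" labels are hypothetical
# and are not Daedalean's actual network.
import torch
import torch.nn as nn

class TinyRunwayClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):  # e.g. "no runway" vs. "runway"
        super().__init__()
        # Successive convolutional layers extract increasingly abstract image features.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # A final fully connected layer maps those features to one score per class.
        self.classifier = nn.Linear(32 * 56 * 56, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)        # (batch, 32, 56, 56) for 224x224 RGB inputs
        x = torch.flatten(x, 1)
        return self.classifier(x)   # raw class scores ("logits")

model = TinyRunwayClassifier()
dummy_image = torch.randn(1, 3, 224, 224)   # stand-in for a camera frame
print(model(dummy_image).shape)             # torch.Size([1, 2])
```

Training on labeled example images adjusts the network’s weights so that its output scores agree with the labels; that adjustment is the “learning” van Dijk describes, and it happens before the network ever flies.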

Now, Daedalean is working with the European Aviation Safety Agency (EASA) under an Innovation Partnership Contract (IPC) to develop design assurance concepts specifically for neural networks. This will entail coming up with ways to ensure that any neural networks incorporated into aircraft systems are at least as safe, predictable, and reliable as existing flight-critical software. The project is expected to generate a final report in January 2020, from which a public version will be adapted. “We hope that becomes a way for the industry to talk about this set of problems and constraints,” van Dijk said.

Daedalean’s system can spot other traffic and birds at greater distances than the average human pilot. That will allow it to provide collision avoidance assistance to human pilots even before aircraft achieve full autonomy, the company said. Daedalean Image

These design assurance concepts will likely involve constraints on the types of algorithms that can be certified. For example, it won’t be feasible to use a neural network that continues “learning” in flight, “because then I have no idea how to prove anything about it anymore,” van Dijk explained. Instead, Daedalean is training its neural networks in the lab with extensive use of simulation, and will then freeze successful configurations for use in flight.
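
As a rough sketch of what freezing a configuration means in practice, the hypothetical PyTorch snippet below takes a stand-in model that has already been trained and validated offline, disables any further weight updates, and saves the fixed artifact; in service, the network then performs inference only. It illustrates the general practice, not Daedalean’s certified process.

```python
# Illustrative sketch of training offline and then "freezing" a network so it
# cannot keep learning in service. The toy model is a placeholder, not
# Daedalean's system; it only shows the general mechanics in PyTorch.
import torch
import torch.nn as nn

# Stand-in for a vision network that has already been trained and validated
# in simulation (training code omitted).
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),
)

model.eval()                         # switch off training-time behavior (dropout, etc.)
for p in model.parameters():
    p.requires_grad = False          # weights can no longer be updated

# Save the frozen configuration; the deployed artifact stays fixed from here on.
torch.save(model.state_dict(), "frozen_vision_net.pt")

# In flight, only inference runs: no gradients, no weight changes.
with torch.inference_mode():
    scores = model(torch.randn(1, 3, 224, 224))
```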

These systems will also need to know their limitations, he said: “You have to prove somehow that when the network sees something completely different than it’s ever seen before it says, ‘I have no idea what I’m seeing here’ . . . instead of randomly saying, ‘There’s a landing strip here.’” And, of course, the systems will need to identify actual landing strips at least as reliably as humans can.
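
One very simplified way to illustrate that kind of self-awareness is to have the classifier abstain whenever its confidence in its top answer falls below a threshold, as in the hypothetical sketch below. The model, labels, and threshold are assumptions, and genuine out-of-distribution detection suitable for certification would require far stronger evidence than softmax confidence.

```python
# Simplified illustration of a classifier that "knows its limitations": if the
# network's confidence in its top answer is low, it abstains instead of
# committing. Thresholding softmax confidence is only a toy stand-in for real
# out-of-distribution detection, which is a much harder problem.
import torch
import torch.nn as nn
import torch.nn.functional as F

CLASS_NAMES = ["no landing strip", "landing strip"]    # hypothetical labels

def classify_with_abstention(model: nn.Module,
                             image: torch.Tensor,
                             threshold: float = 0.9) -> str:
    with torch.inference_mode():
        probs = F.softmax(model(image), dim=1)         # class probabilities
    confidence, index = probs.max(dim=1)
    if confidence.item() < threshold:
        return "I have no idea what I'm seeing here"   # abstain on unfamiliar input
    return CLASS_NAMES[index.item()]

# Usage with a stand-in model (a frozen, trained network would go here):
toy_model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 2)).eval()
print(classify_with_abstention(toy_model, torch.randn(1, 3, 224, 224)))
```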

For the particular question of incorporating neural networks into aircraft systems — one stepping stone on the way to full autonomy — van Dijk said that existing safety and design assurance processes should be able to work mostly unmodified. “The only place we expect a new means of compliance is needed is at the level of [design] requirements for the subsystem that contains the neural net, and we are examining this assumption and working on a concrete proposal for that in the context of the IPC,” he said.

Daedalean is developing its AI solution to be aircraft agnostic, applicable to conventional helicopters as well as new eVTOL concepts. Although van Dijk declined to identify the aircraft developers the company is working with at the moment, Daedalean recently demonstrated its computer vision system on a Volocopter electric multicopter (albeit outside of the control loop). Daedalean — which has received funding from Carthona Capital, along with recent investments by the venture capital funds Redalpine and Amino Capital — is also partnering with avionics developers including Mercury Mission Systems and Avidyne/Autonodyne on hardware for its software solutions.

Although Daedalean expects that its autopilots will initially be incorporated into piloted aircraft to help increase situational awareness and reduce workload, its ultimate goal is to demonstrate that its systems are safer than human pilots. Not only will autonomous systems be simpler if they don’t have to accommodate human operators, van Dijk said, “we also think that quite soon after the piloted eVTOLs become operational, it will become clear that as long as there is a dependency on human pilots, in the absence of full autonomy, the urban mobility business model and operations concepts cannot work.”

He emphasized, however, that the public is justified in being skeptical of autonomous systems until they have amply demonstrated their value and reliability — and that the economic promise of urban air mobility doesn’t justify any regulatory shortcuts.

“If people claim that something is safe and you should trust your life to it, you have a right to demand proof,” he said. “I think it’s a process you can’t rush, and you have to do this by responsibly proving safety above and beyond what avionics systems have today.”
