General Atomics Aeronautical Systems, developer of the Gray Eagle series of unmanned aircraft, was one of the participants in Collins’ MOSA demonstration. Shown here is a rendering of the latest variant, the Gray Eagle 25M, with air-launched effects. GA-ASI Photo

Collins, partners explore how UAS and ALE control will work from the cockpit

By Elan Head | December 15, 2022


Someday in the future, a U.S. Army helicopter in contested battlespace will command and control multiple unmanned aircraft systems (UAS) in flight. So-called air-launched effects (ALE) will fly ahead of the helicopter, relaying video and other information back to its crew. They will find and fix the location of enemy ground positions while the helicopter remains out of harm’s way, effectively serving as aerial scouts for the aerial scout.


That’s the concept, anyway. The Army has already proven its general feasibility with in-flight demonstrations, and ALE capability is now embedded in the Army’s Future Vertical Lift (FVL) efforts. Yet, much of that capability is still being defined. One overarching question is how the ALE will be controlled, and to what extent the crew responsible for the helicopter will have the cognitive bandwidth to manage multiple drones, too.

“Managing where multiple aircraft are going at one time means you’ve got to be very judicious with the kind of information that you’re displaying,” said Mike Hubler, co-founder of the software design firm Tektonux. “You’ve got to avoid information overload, information paralysis — just tell me what I need to know when I need to know it.”

A startup based in Huntsville, Alabama, Tektonux specializes in bringing the usability of video games to military and other mission-critical applications. Its team was already working with the Army on software user interfaces for controlling uncrewed vehicles when Collins Aerospace asked them to join a demonstration project last year.

With its demonstration, Collins wanted to illustrate not just a specific solution, but also its ability to implement a modular open systems approach (MOSA) — using open systems standards to rapidly integrate capabilities from multiple vendors, including Tektonux. According to Tom von Eschenbach, Collins’ program manager for Army Avionics, MOSA is leading defense contractors like Collins to change the way they do business as the Army seeks to maximize its procurement agility and avoid vendor lock.

“When we were faced with how to implement MOSA and how to deliver that promise of ‘better, faster, more affordable’ to the Army, we realized that it was best to bring a team of teams together . . . and find a way to integrate all that into a seamless capability,” von Eschenbach said. “Certainly we’re showcasing products, but I think the more important part about MOSA and this demonstration was the process in which we brought these best-of-breed vendors together to deliver the best quality product to the Army in a faster and more affordable manner.”

Other partners in the demonstration included the MQ-1C Gray Eagle manufacturer General Atomics Aeronautical Systems Inc., the artificial intelligence (AI) company Palantir Technologies, and Parry Labs, which contributed its Stellar Relay mission computer. Each team worked more or less independently on its contribution to the demonstration using the Collins-provided Transport Segment Service (TSS), toolkits, and an Army Scalable Control Interface (SCI) reference architecture, which adhere to the Future Airborne Capability Environment (FACE) technical standard. Then, the partners came together to integrate their individual elements into a comprehensive cockpit management solution over a period of around six weeks.

“That’s part of the beauty [of MOSA], is being able to mature things at different rates, at different levels, and then being able to integrate them because we’re all following the same rules for data bus messages and using the same kind of interfaces,” said Hubler. “It’s not quite as simple as pulling out your HDMI cable and putting in a new 8K television and plugging it back in. But that’s the general metaphor that we’re going for.”

The MOSA solution was demonstrated toward the end of last year at Collins’ Customer Experience Center in Huntsville. The integration allowed command and control of a Gray Eagle and its electro-optical/infrared sensor using FACE software components from General Atomics and Tektonux, and included a simulated ALE launch from the Gray Eagle. The next step will be to demonstrate the solution in flight using real aircraft, but von Eschenbach said the timeframe for such a demonstration has yet to be determined.

AI and autonomy are key features of the solution and what will make it practical for flight crews to use ALE. Rather than being manually controlled by an operator onboard the helicopter, the drones will execute their missions largely autonomously. Flight crews won’t need to keep their eyes glued to the drones’ video feeds, because AI will identify enemy weapons systems and automatically transmit information about them to all users.

“We’re trying to get to a point where what we’re giving [the ALE] is a mission rather than controlling them as if they were an aircraft. So instead of me giving them waypoints to fly on, instead of me having a joystick and literally directing where they fly, I’d like to tell them: I need to know if there’s a bad guy over there. I need you to find a target over here. I need you to confirm this target here,” Hubler explained. “It’s [thinking] about these unmanned vehicles as teaming with us and trying to get an objective done, rather than machines that we have to carefully control.”

At the same time, the solution will preserve crews’ ability to take more direct control of the drones when they choose to. Hubler described it as a “sliding scale of autonomy depending on your level of expertise, the environment you’re in, the stress levels. We’ve actually, from a UX [user experience] perspective, worked on things like, let’s measure physiologically your pulse, your eyes and help gauge how stressed you are and slide that scale of autonomy based on that.”

According to von Eschenbach, the ability to continuously develop and mature these types of capabilities, then efficiently integrate them into the final product, is a key advantage of MOSA.

“The really, really powerful thing about what we’re doing is we don’t all have to start at the beginning, run this program for years, come out the end and then get graded,” he said. “We can have all our capabilities being developed all [at] the same time, being improved and modified, and then still meet together at the end to have the latest and greatest of what we feel like is possible for the warfighter.”
