Someday in the future, a U.S. Army helicopter in contested battlespace will command and control multiple unmanned aircraft systems (UAS) in flight. So-called air launched effects (ALE) will fly ahead of the helicopter, relaying video and other information back to its crew. They will find and fix the location of enemy ground positions while the helicopter remains out of harm’s way, effectively serving as aerial scouts for the aerial scout.

That’s the concept, anyway. The Army has already proven its general feasibility with in-flight demonstrations, and ALE capability is now embedded in the Army’s Future Vertical Lift (FVL) efforts. Yet, much of that capability is still being defined. One overarching question is how the ALE will be controlled, and to what extent the crew responsible for the helicopter will have the cognitive bandwidth to manage multiple drones, too.
“Managing where multiple aircraft are going at one time means you’ve got to be very judicious with the kind of information that you’re displaying,” said Mike Hubler, co-founder of the software design firm Tektonux. “You’ve got to avoid information overload, information paralysis — just tell me what I need to know when I need to know it.”
A startup based in Huntsville, Alabama, Tektonux specializes in bringing the usability of video games to military and other mission-critical applications. Its team was already working with the Army on software user interfaces for controlling uncrewed vehicles when Collins Aerospace asked them to join a demonstration project last year.
With its demonstration, Collins wanted to illustrate not just a specific solution, but also its ability to implement a modular open systems approach (MOSA) — using open systems standards to rapidly integrate capabilities from multiple vendors, including Tektonux. According to Tom von Eschenbach, Collins’ program manager for Army Avionics, MOSA is leading defense contractors like Collins to change the way they do business as the Army seeks to maximize its procurement agility and avoid vendor lock.
“When we were faced with how to implement MOSA and how to deliver that promise of ‘better, faster, more affordable’ to the Army, we realized that it was best to bring a team of teams together . . . and find a way to integrate all that into a seamless capability,” von Eschenbach said. “Certainly we’re showcasing products, but I think the more important part about MOSA and this demonstration was the process in which we brought these best-of-breed vendors together to deliver the best quality product to the Army in a faster and more affordable manner.”
Other partners in the demonstration included the MQ-1C Gray Eagle manufacturer General Atomics Aeronautical Systems Inc., the artificial intelligence (AI) company Palantir Technologies, and Parry Labs, which contributed its Stellar Relay mission computer. Each team worked more or less independently on its contribution to the demonstration using the Collins-provided Transport Segment Service (TSS), toolkits, and an Army Scalable Control Interface (SCI) reference architecture, all of which adhere to the Future Airborne Capability Environment (FACE) technical standard. Then, the partners came together to integrate their individual elements into a comprehensive cockpit management solution over roughly six weeks.
“That’s part of the beauty [of MOSA], is being able to mature things at different rates, at different levels, and then being able to integrate them because we’re all following the same rules for data bus messages and using the same kind of interfaces,” said Hubler. “It’s not quite as simple as pulling out your HDMI cable and putting in a new 8K television and plugging it back in. But that’s the general metaphor that we’re going for.”
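Hubler’s point about “following the same rules for data bus messages” can be sketched in code: independently developed components interoperate because they share one message definition and wire format. The class, field names, and JSON encoding below are hypothetical illustrations, not the actual FACE or SCI schemas.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical shared message definition -- NOT the real FACE/SCI schema.
# Every vendor's component builds against this one definition, so a command
# produced by one component parses identically in another.
@dataclass
class SensorCommand:
    platform_id: str   # e.g. the Gray Eagle being commanded
    sensor: str        # e.g. "eo_ir" for the electro-optical/infrared turret
    pan_deg: float
    tilt_deg: float

    def to_bus(self) -> bytes:
        """Serialize to the agreed-on wire format (JSON here, for illustration)."""
        return json.dumps(asdict(self)).encode()

    @classmethod
    def from_bus(cls, payload: bytes) -> "SensorCommand":
        """Parse a message produced by any conforming component."""
        return cls(**json.loads(payload.decode()))

# One vendor's control interface emits the command...
cmd = SensorCommand("gray-eagle-1", "eo_ir", pan_deg=12.5, tilt_deg=-3.0)
wire = cmd.to_bus()

# ...and another vendor's component consumes it, with no coordination
# needed beyond the shared definition -- the "swap the television" idea.
received = SensorCommand.from_bus(wire)
```

Swapping one vendor’s component for another then only requires that the replacement speak the same messages, which is the procurement agility the Army is after.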
The MOSA solution was demonstrated toward the end of last year at Collins’ Customer Experience Center in Huntsville. The integration allowed command and control of a Gray Eagle and its electro-optical/infrared sensor using FACE software components from General Atomics and Tektonux, and included a simulated ALE launch from the Gray Eagle. The next step will be to demonstrate the solution in flight using real aircraft, but von Eschenbach said the timeframe for such a demonstration has yet to be determined.
AI and autonomy are key features of the solution and what will make it practical for flight crews to use ALE. Rather than being manually controlled by an operator onboard the helicopter, the drones will execute their missions largely autonomously. Flight crews won’t need to keep their eyes glued to the drones’ video feeds, because AI will identify enemy weapons systems and automatically transmit information about them to all users.
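The “identify once, transmit to all users” behavior is essentially a publish/subscribe pattern: the AI classifies a threat, publishes a single detection, and every subscribed crew station receives it without anyone watching the raw video feed. The sketch below is a minimal illustration under that assumption; the class names and fields are hypothetical, not the Army’s actual architecture.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical detection message -- fields are illustrative only.
@dataclass
class Detection:
    threat_type: str   # e.g. an enemy weapons system classified by onboard AI
    lat: float
    lon: float

class DetectionBus:
    """Fan each published detection out to every subscribed user/display."""
    def __init__(self) -> None:
        self._subscribers: List[Callable[[Detection], None]] = []

    def subscribe(self, handler: Callable[[Detection], None]) -> None:
        self._subscribers.append(handler)

    def publish(self, detection: Detection) -> None:
        for handler in self._subscribers:
            handler(detection)

bus = DetectionBus()
alerts: List[str] = []

# Two notional consumers: the helicopter cockpit and a ground station.
bus.subscribe(lambda d: alerts.append(f"cockpit: {d.threat_type}"))
bus.subscribe(lambda d: alerts.append(f"ground station: {d.threat_type}"))

# The ALE's AI classifies a threat and publishes it once; all users get it.
bus.publish(Detection("air defense system", 35.2, -86.6))
```

The design point is that crews consume compact, already-classified detections rather than raw sensor video, which is what keeps the cognitive load manageable.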