Artificial intelligence (AI) techniques, computational fluid dynamics (CFD) simulations and data analytics procedures are being used jointly to help improve the availability of a critical U.S. Air Force helicopter while reducing maintenance costs and extending how long the aircraft can remain in service.
The Virtual Sensing Technologies for Accelerometer Reconstruction (VSTAR) program, sponsored by the U.S. Joint Artificial Intelligence Center (JAIC), uses these techniques to fill gaps in flight data measurements collected by HH-60G Pave Hawk helicopters. This flight data, taken from an accelerometer onboard the aircraft at the base of the main rotor, helps maintainers understand the loads that the helicopter was subjected to during flight. These load levels, in turn, help determine when each helicopter must be taken out of service for maintenance, and ultimately when each aircraft will reach the end of its expected lifetime.
The challenge is that data from this specific 4G accelerometer on the HH-60G is sometimes unusable. When that happens – on as many as 10% of all flights – maintainers assume the affected aircraft was subjected to the worst possible airframe stresses, known as the “composite worst-case” scenario, during the entire flight. Doing that can lead to helicopters being removed from service sooner and components being replaced prematurely, negatively impacting warfighter readiness compared to what would be required had the complete set of correct flight data been available.
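The fallback policy described above can be illustrated with a short sketch. The function name, sampling assumptions, and the 4G ceiling used as the worst-case value are illustrative only; the article does not describe the actual ASIP accounting logic.

```python
# Illustrative sketch of the pre-VSTAR fallback: when a flight's accelerometer
# record is unusable, the whole flight is charged with the composite
# worst-case load spectrum instead of the measured one.
# All names and values here are hypothetical.

WORST_CASE_G = 4.0        # assumed ceiling; the article mentions a 4G accelerometer
DEFAULT_SAMPLES = 3600    # assumed flight length in samples when none are usable

def effective_loads(measured_g, valid):
    """Return the load history used for fatigue accounting.

    measured_g -- list of per-sample load readings, or None if lost
    valid      -- whether the accelerometer record passed quality checks
    """
    if valid and measured_g is not None:
        return measured_g
    # Composite worst-case: assume maximum stress for the entire flight,
    # which overstates accumulated damage and retires parts early.
    duration = len(measured_g) if measured_g else DEFAULT_SAMPLES
    return [WORST_CASE_G] * duration
```

A single corrupted flight therefore contributes far more fatigue damage to the aircraft's ledger than the same flight with a healthy sensor, which is the conservatism VSTAR removes.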
“This drives sustainment, maintenance and logistical planning, but the most critical impact has been that the warfighters had fewer and fewer aircraft available to them,” said David Alvord, a senior research engineer who led the research program at the Georgia Tech Research Institute (GTRI).
Developed by GTRI in collaboration with the JAIC and the U.S. Air Force, VSTAR recreates the missing accelerometer data using a deep neural network (DNN) that draws on additional data streams collected by the aircraft’s health and usage monitoring system (HUMS) at the same time as the corrupted accelerometer data. This DNN model, combined with load coupling from corresponding CFD models, reconstructed corrupted data spanning 6,500 hours of flight. VSTAR improved more than 270 days’ worth of flight-time measurements by applying machine learning and neural network techniques that correlated the HUMS information with accelerometer records.
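The core idea of a virtual sensor – learn how the accelerometer relates to the other channels recorded at the same time, then predict it where the record is corrupt – can be sketched with a tiny neural network. The channel names, network size, and synthetic data below are assumptions for illustration; the actual VSTAR architecture and HUMS parameter set are not described in this article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for three concurrent HUMS channels (hypothetical:
# e.g. airspeed, rotor torque, vertical speed), sampled over n time steps.
n = 2000
hums = rng.normal(size=(n, 3))

# Pretend the main-rotor accelerometer is a nonlinear function of the other
# channels plus noise -- the kind of relationship a virtual sensor learns.
accel = np.tanh(hums @ np.array([0.8, -0.5, 0.3])) + 0.05 * rng.normal(size=n)

# Minimal one-hidden-layer network trained by full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=16);      b2 = 0.0

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2, h

lr = 0.1
for epoch in range(1000):
    pred, h = forward(hums)
    err = pred - accel                       # gradient of 0.5 * MSE w.r.t. pred
    gW2 = h.T @ err / n
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h**2)      # backprop through tanh
    gW1 = hums.T @ dh / n
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# "Corrupt" roughly 10% of the samples, as in the article, and reconstruct
# the accelerometer values there from the HUMS channels alone.
bad = rng.random(n) < 0.10
reconstructed, _ = forward(hums[bad])
rmse = np.sqrt(np.mean((reconstructed - accel[bad]) ** 2))
```

The reconstructed values can then replace the flagged readings in the post-flight record, rather than falling back to the composite worst-case assumption.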
“The digital version recreates the data that is lost so we can take the bad data out of the accelerometer information and replace it with good data,” explained Alvord. “With the recreated data, the accelerometer information passes back into the maintenance and sustainment decision stream to make it more accurate.”
VSTAR has been transitioned, deployed, and adopted by the Air Force as part of its Aircraft Structural Integrity Program (ASIP) post-flight analysis. Discussions have been held with the Navy, Army and Coast Guard about how variants of the VSTAR tool might be applied to their own H-60 ASIP maintenance tools to gain similar benefits from a digital twin model. While the potential cost savings run into the millions of dollars, the impacts go beyond dollars.
“The value for the Air Force has been in increased availability of the vehicles for the warfighter,” Alvord said. “This gives them more platforms that can service more missions with more confidence. That doesn’t include the parts costs and costs associated with the personnel time needed to service the vehicles.”
An aerospace engineer by training, Alvord collaborated with GTRI experts in artificial intelligence, machine learning, neural networks, data analysis and computational fluid dynamics. A collaboration of disciplines was necessary to develop the capability to recreate the missing data as a predictive maintenance “digital twin virtual sensor” and integrate it into the maintenance ASIP flow for the HH-60G.
“We had four terabytes of heritage flight data from the aircraft, so we could get all the parameters and data streams that we needed,” he said. “We took that data to the neural net and trained it to disregard the one bad sensor, and based on the other sensors, to determine what it had done historically.”
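The training recipe Alvord describes – learn from records where the accelerometer was healthy, ignore it where it failed – also implies a data-preparation step: hold out some known-good records so reconstruction accuracy can be checked against ground truth before the model is trusted on corrupted flights. A minimal sketch of that split, with a synthetic dataset standing in for the heritage flight data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical heritage dataset: rows are time samples, the first three
# columns are other HUMS channels, the last column is the accelerometer,
# with NaN marking corrupted records (~10% of samples, as in the article).
data = rng.normal(size=(1000, 4))
data[rng.random(1000) < 0.10, 3] = np.nan

good = ~np.isnan(data[:, 3])

# Train only on healthy records; hold out 20% of them so the virtual
# sensor's error can be measured where the true reading is known.
good_idx = np.flatnonzero(good)
rng.shuffle(good_idx)
cut = int(0.8 * len(good_idx))
train_idx, holdout_idx = good_idx[:cut], good_idx[cut:]

X_train, y_train = data[train_idx, :3], data[train_idx, 3]
X_hold,  y_hold  = data[holdout_idx, :3], data[holdout_idx, 3]
X_bad            = data[~good, :3]   # records the model must reconstruct
```

Validating against the held-out healthy records is what lets the reconstructed values be fed back into the maintenance decision stream with confidence.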