Editor’s Note: Jack Schonely recently retired after a 31-year career with the Los Angeles Police Department, including 18 years with its Air Support Division (ASD) as a tactical flight officer, pilot, and flight instructor. This is the second article in a two-part series in which he and the ASD’s former safety officer, Mark Bolanos, recall some experiences that caused Schonely to rethink his personal approach to risk management. Read part 1 here.
Jack Schonely: A very smart safety officer once told me that it is important to slow down and consider completing a risk analysis before jumping into a mission, even one that you assume is “routine.” That former safety officer happens to be co-writing this article.
This was very sound advice, and as I evolved as a pilot it became clear to me that many do not follow it. Countless times over the years, I witnessed various personnel at my unit taking calls from “command staff” who made a wide variety of requests of Air Support. The usual response was, “Yes sir, we will make that happen.” That is the reaction of a cop, not of an aviator.
What I learned over time from lots of discussions with Mark was that the prudent answer should be, “Tentative yes, I will call you back shortly with an answer.” That allows a quick risk analysis to be done to see whether the mission should be attempted at all and, if so, how it will be completed safely.
Sounds like common sense, right? Well, it is, but law enforcement aviation sometimes falls back into “cop mode” and forgets that saying “no” to a mission is, in some cases, the best response for everyone involved. Completing a risk analysis will clearly show whether or not a mission should be accepted.
A real-life scenario will demonstrate exactly what we are attempting to explain. Mark came to me and asked if I was aware of a training day that was being planned to deploy scuba divers into the water from the skids of one of our Airbus AStar helicopters. I was not aware of that training and was surprised to hear about it. It had been years since our aviation unit had done this from a Bell UH-1 platform, and we had denied the request several times after the UH-1 was no longer in our fleet.
We inquired about this rumor and learned that, indeed, this training was being planned after a request came in from “command staff.” We found it odd that the safety officer and the lead pilot of the cadre who would be flying the mission were not involved in the planning.
We immediately had a long discussion about the obvious problems and concerns involved in completing this high-risk training, and advised our supervisor that a formal risk analysis must be completed. Our first question was simple: What mission were we training for? When would we ever use this technique in a real-life mission? “Putting divers in the water” was the usual response. That is not a mission; for what purpose would we be putting divers into the water?
It was obvious that we needed to reach out and learn what the real experts thought about this training. I was aware that the New York Police Department (NYPD) Aviation Unit frequently deployed divers from a Bell 412 and that dive team members were assigned to the NYPD hangar 24 hours a day. They were well trained and had lots of practical experience. I contacted a close friend who was a supervisor at NYPD Aviation and told him what was being planned for our over-water training day. His first question was, “You are doing this in a single-engine helicopter without floats?” I said yes, and his quick response was, “Are you crazy?”
We laughed, but he then provided me with a long list of concerns relating to hovering over water and deploying a scuba diver. Many of his concerns were already on the list Mark and I completed, but I wanted to hear it from an expert. Some of these concerns included the use of a single-engine aircraft in a low hover over water, plus some things we were lacking: aircraft floats, a radar altimeter, a hoist on the aircraft, helicopter emergency egress device (HEED) bottles, current dunker training, seat belt cutters, proper life vests, and pilot experience flying over water.
But the big concern remained, “What mission are you actually training for?”
Consulting the experts
Mark Bolanos: In the previous article, I wrote, “Carelessness and overconfidence are usually more dangerous than deliberately accepted risks.” Unfortunately, management’s decision to accept a mission without managing the risks was unnecessarily placing us in harm’s way. Was it carelessness, overconfidence, arrogance, or ignorance? Whatever the reason, it was wrong.
Fortunately, Jack and I recognized that we didn’t have the training or the experience to conduct this mission as safely as was practical. We also knew the mission involved many hazards we didn’t yet know about, so to adequately identify hazards and manage risks, we had to quickly learn more about the known hazards, and especially the unknown ones.
We both, independently, reached out to different operators who were trained in and who had experience with inserting personnel into water. We quickly learned how much we didn’t know about over-water insertion operations.
There was one more contact we needed to make, but it had to be done together. I had arranged a meeting with an aviation safety officer (ASO) of an aviation operation that had more experience in over-water operations than anyone: the U.S. Coast Guard.
From the outset, the Lieutenant ASO recommended not using single-engine helicopters for over-water operations, especially at low altitudes without floats. He explained Coast Guard standard operating procedures (SOPs) for free swimmer insertion from beginning to end. He also explained Coast Guard risk management philosophies for the various Coast Guard missions.
We listened attentively as he explained Coast Guard SOPs requiring 10 percent power margin for all operations, as well as the need for aircraft floats, windscreen wipers, multi-engine aircraft, and water survival training for over-water missions. From experience, he knew operations below 15 feet above water, depending on conditions, had a high probability of “whiteout.” After the meeting, he showed us their personal protective equipment: dry suits, life vests, HEED bottles, seat belt cutters, personal electronic locator beacons, flares, and signal lights.
He closed by again recommending not operating a single-engine aircraft without floats over water, and definitely not at night without instrument flight rules capabilities and night vision goggles. What an eye-opening education it ended up being.
Jack and I left the meeting unable to stop talking about the need for windscreen wipers. We had not thought about something so simple, yet so important. It was further proof of how little we really knew about this type of operation. It also confirmed that reaching out to experts and completing a risk analysis should be required for any new mission.
His input helped to solidify our position: We were going to recommend a no go!
To further support our “no go” recommendation, we used the Human Factors Analysis and Classification System (HFACS) as a proactive hazard identification tool. This framework can be used as a post-accident investigative guide or as a proactive tool to identify possible causal factors or latent conditions prior to a mishap. It is a great way of finding holes in the “Swiss cheese” of human systems.
Using HFACS proactively helped identify significant issues at the Unsafe Acts level, Pre-Condition level, Unsafe Supervision level, and Organizational Influences level. Had we gone forward with the mission, these issues would have been significant “holes” in the Swiss cheese.
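As a rough illustration of the proactive tally described above, the approach can be sketched in a few lines of code. The four HFACS level names are standard; the specific issue entries below are assumptions drawn from this scenario, not the unit’s actual written findings.

```python
# Sketch of using HFACS proactively: group identified issues by HFACS
# level to expose "holes in the Swiss cheese" before a mishap.
# The issue entries are illustrative assumptions based on this scenario.

HFACS_LEVELS = (
    "Unsafe Acts",
    "Preconditions for Unsafe Acts",
    "Unsafe Supervision",
    "Organizational Influences",
)

issues = [
    ("Preconditions for Unsafe Acts", "no floats, radar altimeter, or HEED bottles"),
    ("Preconditions for Unsafe Acts", "no current dunker training"),
    ("Unsafe Supervision", "safety officer excluded from planning"),
    ("Organizational Influences", "mission accepted without risk analysis"),
]

def holes_by_level(issues):
    """Group identified issues by HFACS level, keeping empty levels visible."""
    holes = {level: [] for level in HFACS_LEVELS}
    for level, issue in issues:
        holes[level].append(issue)
    return holes

for level, found in holes_by_level(issues).items():
    print(f"{level}: {len(found)} issue(s)")
```

Even a simple tally like this makes it obvious when the “holes” cluster above the individual crew member, at the supervision and organizational levels.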
More knowledgeable about the mission and the hazards we faced, we looked at the four basic risk management principles with a new perspective:
1. Accept risk when benefits outweigh the cost
   - The level of risk was unacceptable for the “photo op” mission
2. Accept no unnecessary risk
   - The department was unable or unwilling to mitigate risks
   - Inserting divers into the water could be done with significantly less risk from a boat
3. Anticipate and manage risk by planning
   - The training mission was accepted and planned without risk management
4. Make risk decisions at the right level
   - The decision-makers were not willing or not able to allocate the appropriate resources to adequately mitigate the hazards and reduce the risks
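To show how the four principles combine into a single go/no-go call, here is a minimal sketch. The scoring scale, hazard names, and inputs are illustrative assumptions on our part, not an official decision tool; any one failed principle is enough to stop the mission.

```python
# Minimal sketch of a go/no-go check built on the four risk management
# principles above. Scores use an assumed 1 (low) to 5 (high) scale.

def risk_decision(hazards, benefit, safer_alternative, planned, right_level):
    """Apply the four principles in order; any failure yields 'no go'.

    hazards: dict of hazard name -> residual risk score (1-5)
    benefit: mission benefit score (1-5)
    safer_alternative: True if the goal can be met with less risk another way
    planned: True if risk management was part of mission planning
    right_level: True if the decision was made at the appropriate level
    """
    residual = max(hazards.values(), default=0)
    if residual >= benefit:      # 1. benefits must outweigh the cost
        return "no go"
    if safer_alternative:        # 2. accept no unnecessary risk
        return "no go"
    if not planned:              # 3. anticipate and manage risk by planning
        return "no go"
    if not right_level:          # 4. decide at the right level
        return "no go"
    return "go"

# The scuba-insertion training as described: high residual risk, a "photo
# op" benefit, a boat as the safer alternative, and no risk planning.
print(risk_decision(
    hazards={"low hover over water, no floats": 5, "no egress training": 4},
    benefit=1,
    safer_alternative=True,
    planned=False,
    right_level=False,
))  # prints: no go
```

The point of the sketch is the structure, not the numbers: each principle is an independent gate, so a mission must clear all four before it earns a “go.”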
In his book, Against the Gods: The Remarkable Story of Risk, Peter Bernstein wrote, “When we take a risk, we are betting on an outcome that will result from a decision we have made, though we do not know for certain what the outcome will be.”
My views on “betting on an outcome” and “accepted risks” (risk tolerance) have evolved over the years. Today, I am much less willing to “bet on an outcome” and I am definitely more deliberate about my risk decisions.
Through education, training, and experience, I am better at identifying hazards and managing risk. I am better at recognizing when I don’t know enough about a mission, and I am more willing to reach out to someone who does.
More importantly, when my individual risk tolerance significantly differs from that of the organization, I am much more confident about communicating my concerns and recommending not going forward with a mission!
Making the right call
Schonely: It became very clear to us that with the proper aircraft, training, equipment, and experience this mission could be completed safely, but at that time we had none of these, which made this training very high risk and unacceptable. We believed this was all for the “photo op” since an actual mission was never presented to us.
After listening to a detailed brief of the upcoming training day, Mark and I presented our risk analysis and advised the group that we recommended a “no go” for the training. The training day was cancelled, and although we know that not everyone was happy with that, it was the correct call.
That chain of events solidified everything Mark had been teaching me. We would continue to receive mission and training requests from command and specialized units, and many of those were safely completed, but we said no to quite a few of the requests and altered others based on a simple risk analysis. We also attempted to share our views on safety and risk with our peers, supervisors, and command so that others would benefit from what we had learned.
I believe the most valuable lesson I learned over the years from Mark was that the time to ask the tough questions is before you accept the mission. If there is an incident or accident, you can be assured that the National Transportation Safety Board, the Federal Aviation Administration, your chief, or your sheriff is going to ask those same questions, and those questions become very obvious after a tragedy. That is a very simple philosophy that law enforcement aviation crews, or indeed any aviation crew, can immediately adopt to make things safer where they work. It is about having basic foresight, being diligent, and having the willingness to sometimes say NO.
Certainly a flight risk analysis tool (FRAT) is an excellent start in the decision-making process, but the FRAT is designed to catch big issues related to the crew, weather, machine, and limitations. History shows us that many times it is the little things that are missed that cause the accident. I encourage you to be open-minded, learn as much as you can from the experts, and look at risk in a different way starting today — beyond the FRAT.
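For readers who want a concrete starting point, a bare-bones FRAT-style score can be sketched as follows. The line items, weights, and decision thresholds are purely illustrative assumptions; real FRATs are operator-specific and should be developed with your safety officer.

```python
# Hypothetical sketch of a simple flight risk analysis tool (FRAT) score.
# Every item, weight, and threshold here is an assumption for illustration.

FRAT_ITEMS = {
    "pilot duty time over 8 hours": 3,
    "night flight": 2,
    "ceiling below 1,000 ft": 3,
    "unfamiliar mission profile": 4,
    "single-engine over water, no floats": 5,
}

def frat_score(conditions):
    """Sum the weights of the conditions that apply to this flight."""
    return sum(FRAT_ITEMS[c] for c in conditions)

def frat_decision(score, caution=5, no_go=9):
    """Map a total score to a decision band (thresholds are assumptions)."""
    if score >= no_go:
        return "no go / elevate to management"
    if score >= caution:
        return "caution: mitigate before flight"
    return "go"

score = frat_score(["night flight", "unfamiliar mission profile",
                    "single-engine over water, no floats"])
print(score, frat_decision(score))  # prints: 11 no go / elevate to management
```

A tool like this catches the big, predictable items; as argued above, the little mission-specific hazards still demand the judgment, foresight, and expert consultation that no checklist can replace.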