The new aviation Safety Regime depends on current, valid data
Labor Management Distrust lowers the reliability of SMS solutions
If Relations deteriorate, should remedies be analyzed in light of diminished data?
Horizon Air: hostility between management and union
SMS, ASAP, ASIAS, CAST, ASRS, FOQA, VDRP and other voluntary submission processes constitute the information backbone of the global aviation safety advances of the recent past. The word which assures the efficacy of this new safety regimen is VOLUNTARY. Instead of becoming aware of a safety problem AFTER an accident or incident, this approach depends on collecting vast amounts of data and collating those data points EARLY.
Another critical predicate of this methodology is TRUST. Initially, the FAA had to demonstrate that a voluntary submission by the regulated party, admitting that an error had occurred, would not result in enforcement action by the regulator. The compliance/collaboration/cooperation program instituted by the FAA demonstrated that the dreaded civil penalty was not the government's response in these situations. In fact, the reliance by both sides on these communications has been credited as one of the reasons why aviation safety has improved.
Gradually, the erosion of another layer of TRUST has emerged as a threat to SMS and its related risk-reduction programs: as the examples below show, unions are not comfortable with management's role in this process, and that discomfort impairs the efficacy of the SMS system.
A recent example of this breach comes from Horizon Air:
In an urgent internal message on the eve of Thanksgiving, Horizon Air’s head of flight operations warned that a lax safety culture among the airline’s pilots had led to multiple potentially dangerous incidents in recent days. He called for urgent action to prevent a serious air accident.
“We should be very uncomfortable with what has happened over the past two days,” wrote captain John Hornibrook, Horizon Air’s vice president of flight operations, in a Nov. 27 email message to a handful of top managers and pilot leaders. “If we sit back and do nothing, we will have an accident. Nothing good can come of the trajectory we are currently on.”
“We do need to use the past 48 hours as a (wake-up) call before we have a more serious event,” added Hornibrook, who oversees about 800 pilots flying to more than 45 cities for the regional airline owned by Seattle-based Alaska Air Group. “The leadership team needs to get the pilots’ heads in the game before we have an accident.”
The incidents Hornibrook listed ranged from pilots going over the airspeed limits to aircraft approaching a stall, and also included weather-induced threats that perhaps could have been avoided.
Though the email suggests some alarm about pilot safety standards, in an interview Wednesday both Hornibrook and Horizon president Joe Sprague downplayed its significance and declared it a sign of Horizon’s high safety standards.
“The memo was meant to respond to the spike we saw in irregular events,” said Hornibrook. “I’m not sitting back and waiting for something bigger … I wanted everybody to take a pause, take a hard look at what was going on, refocus, and get back to the Safety First philosophy.”
Sprague added that “a safe airline recognizes a spike and takes proactive action.”
“That’s a positive from a safety culture standpoint,” he said. “Horizon is a safe airline. This internal communication was a good sign of that.”
In contrast, a Horizon pilot — who declined to be identified out of fear of losing his job — said he thought the memo was “incredibly melodramatic” and evidence of a disconnect between Horizon management and its pilot cadre.
Pilot union at Horizon Air blames management for ‘deteriorated’ safety programs, highlighting distrustful relations
The union representing pilots at Horizon Air, the regional carrier for Alaska Airlines, sent a note to its members Thursday alleging that the airline’s management is undermining long-standing industry safety programs by focusing on penalizing individuals.
The note includes a link to a letter the union wrote this summer to the board of directors of Alaska Air Group and CEO Brad Tilden drawing attention to “the deteriorated state of Horizon Air’s safety programs.”
Pilot unions and airline management typically work closely together to ensure safety. Yet the union’s July letter, obtained by The Seattle Times along with the Thursday note and earlier union documents, reveals a severely distrustful relationship at Horizon between management and its pilots.
The union’s Thursday message was in reaction to a Seattle Times story that day highlighting an internal memo to senior pilot leaders, in which John Hornibrook, Horizon Air’s vice president of flight operations, expressed concern about a lax safety culture among the airline’s pilots and listed a series of incidents in the couple of days before Thanksgiving that he deemed unsafe.
Hornibrook wrote that “if we sit back and do nothing, we will have an accident.”
In response, the executive council of the Airline Professionals Association Teamsters Local 1224, which represents Horizon pilots, told its members that “we are truly dismayed by the presumptive nature, negative attitude and broad-brush descriptions of our Horizon pilots.” It said the incidents listed by Hornibrook “are not often the result of pilot error or unprofessionalism.”
The Teamsters message points to the letter the union wrote to CEO Tilden and the Alaska Air board in July with specific concerns alleging that management at Horizon was undermining a key safety program called FOQA (Flight Operations Quality Assurance, pronounced FO-KWA) that is designed to spot and remedy any dangerous trends in flight operations.
Trust and openness
The dispute over assessment of pilots between the union and management centers on the balance between encouraging an open safety culture and holding individuals accountable for mistakes.
The FOQA system, implemented across the airline industry and administered by an external company, automatically gathers data from every flight and flags any unusual conditions such as excessive speed, stalls or engine problems. This data is compiled and analyzed and used to spot trends that indicate any safety or maintenance issue and to inform pilots through training or sending out alerts.
At each carrier, a FOQA team of senior pilots from both union and management collaborates to analyze the data for that specific airline and to disseminate to its pilots any needed alerts and actions. Crucially, though the FOQA team may describe a specific incident, unless it is very egregious, they don’t identify the individual pilots involved.
This is a guiding principle in U.S. aviation: The belief is that it’s safer to encourage pilots to be open about errors and report them, without fear that they will be penalized for doing so — so they don’t conceal mistakes or problems.
Each FOQA team has union pilot “gatekeepers” who have access to the identities of the individuals involved; if an incident is outside the norm, they will talk to the pilots about what happened. If a safety incident is serious enough, pilots can be penalized or terminated. But outside that, the FOQA data almost always remains anonymous to encourage openness and trust.
A senior captain with Alaska Airlines, who contacted The Seattle Times after Thursday’s story, described how it works by citing how, following a Federal Aviation Administration (FAA) directive in 2002 on Boeing 737s, Alaska specified that its pilots shouldn’t deploy the spoilers on the wings at speeds greater than 270 knots.
The following month, Alaska posted FOQA data for its pilots showing 45 instances that month when this rule was breached, mostly just for a matter of seconds, though in a few cases for longer. After this alert to pilots, “the next month it went down to one incident,” he said.
The FOQA system is “not something punitive,” the captain said. “It’s about seeing trends and doing something about it.”
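The exceedance-counting logic the captain describes can be illustrated with a short, hypothetical sketch. The 270-knot spoiler-deployment limit comes from the account above; the record layout, names and sample numbers are illustrative assumptions, not Horizon's or Alaska's actual FOQA implementation:

```python
from dataclasses import dataclass

# Hypothetical, simplified record of one sampled moment in a flight.
@dataclass
class FlightSample:
    flight_id: str        # held by union gatekeepers only; stripped before reporting
    airspeed_kt: float
    spoilers_deployed: bool

# The 270-knot limit on spoiler deployment cited in the article.
SPOILER_SPEED_LIMIT_KT = 270.0

def flag_exceedances(samples):
    """Return de-identified exceedance events suitable for trend reporting."""
    events = []
    for s in samples:
        if s.spoilers_deployed and s.airspeed_kt > SPOILER_SPEED_LIMIT_KT:
            # De-identify: report the exceedance, not the pilot or flight.
            events.append({"type": "spoiler_overspeed",
                           "airspeed_kt": s.airspeed_kt})
    return events

samples = [
    FlightSample("QX1001", 265.0, True),
    FlightSample("QX1002", 284.5, True),   # exceedance
    FlightSample("QX1003", 290.0, False),  # fast, but spoilers stowed
]
print(len(flag_exceedances(samples)))  # prints 1
```

The point of the de-identification step is exactly the guiding principle described above: the trend (45 exceedances one month, one the next) is what gets published to pilots, not the identities behind it.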
The Teamsters July letter to the board alleges that after a “safety-critical event” in the summer of 2018, which is not detailed further, Horizon management “maliciously and improperly used protected (FOQA) information” against the pilots involved. As a result of this breach of protocol, the entire FOQA team, both union and company representatives, resigned, the union said. The letter calls this action “unprecedented in the history of the airline.”
The letter alleges that in multiple communications to Horizon’s pilots, Hornibrook overrode the decisions of the FOQA gatekeepers and improperly used protected FOQA data that “admonished, embarrassed and misled Horizon pilots.” It cites a couple of instances when management in pilot training classes described safety incidents and identified the crew.
These quotes demonstrate that the bonds between Horizon management and the IBT are so frayed that they pose a threat to the basic principles of SMS, and thus call into question whether this safety methodology can function at all. If information is not flowing BECAUSE the pilots assert that management is using FOQA punitively, the system suffers from a serious breach.
If the pilots cease submitting information under FOQA, the remedial recommendations formulated by management and the union with FAA support may lose their reliability. Past problems addressed by the Event Review team have defined responses to the risks identified, and it is quite possible that the specific parameters of each solution depend on the continuous flow of information, i.e., is the risk changing as expected, or is it necessary to enhance the “fix”? If FOQA fails due to a lack of trust, past and future remedies may not be reliable.
Cancelling SMS or FOQA is too severe a response in most cases. Perhaps, as a mechanism to track this requisite TRUST, the team should commission a Safety Culture Audit periodically. If the needle moves in a negative direction, the FAA would then oversee SMS training throughout the organization by third-party instructors.
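As a rough illustration of how such an audit trigger might work, here is a minimal sketch. The scoring scale, the 10% drop threshold and the function name are assumptions for illustration only; nothing of the kind is prescribed by the FAA:

```python
# Hypothetical periodic Safety Culture Audit scores (higher = healthier),
# e.g. from an anonymous workforce survey. The 10% threshold is an assumption.
def needs_faa_oversight(scores, drop_threshold=0.10):
    """Flag when the latest audit score falls more than drop_threshold
    (as a fraction) below the average of all prior audits."""
    if len(scores) < 2:
        return False  # no baseline yet
    baseline = sum(scores[:-1]) / len(scores[:-1])
    return scores[-1] < baseline * (1 - drop_threshold)

print(needs_faa_oversight([4.2, 4.1, 4.3]))       # stable trend: False
print(needs_faa_oversight([4.2, 4.1, 4.3, 3.4]))  # sharp drop: True
```

The design choice here mirrors the FOQA philosophy itself: the trigger watches a trend rather than any single data point, so one bad survey cycle within normal variation does not, by itself, invoke outside oversight.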
While such monitoring is in place, the FAA, the unions and management would have to reexamine previously agreed-upon answers, because the absence of reliable real-time risk data calls into question whether each remedy in place remains effective. Equally, when designing a response to a newly defined risk, the Committee cannot rely on the information flow to track performance.