Study suggests that cognitive outcome bias may lead to future repeats of the same behavior
SMS learns from errors
Might SMS try to capture this possibly repetitive behavior?
The FAA’s Safety Management System evolved away from the fault-finding of enforcement and toward cooperation: learning from mistakes and collaboratively designing solutions to the problems identified. Part of the premise is that if you tell the FAA about a mistake, no civil penalty action will be initiated. The point of an Event Review Board is for the FAA, management, unions and all other stakeholders to fashion a response to the specific issue.
Dr. Stephen Walmsley coauthored the article “Understanding the past: Investigating the role of availability, outcome, and hindsight bias and close calls in visual pilots’ weather‐related decision making”. In addition to earning his PhD in 2016, Walmsley, himself a pilot, was recognized by the RAS that same year as a young high achiever demonstrating excellence in work ethic, values, application and results.
He summarized his findings as follows:
Past events, such as “close calls,” can provide valuable learning opportunities, especially in aviation, where learning from past errors could potentially help to avoid future incidents or accidents. This study investigated whether three cognitive biases (availability, outcome, and hindsight bias) could influence pilots’ perceptions of past events, which in turn might influence their perception of events yet to occur. Study 1 found that pilots were influenced by the outcome of a flight when judging decision quality. Of particular interest was that pilots interpreted events that led to a close call very similarly to those that had positive outcomes, which may reinforce risky behaviour. However, although adequately powered, Study 1 found no evidence of availability bias: Exposure to one of four outcomes did not appear to influence later decisions. Study 2 found that having read a flight report, particularly if it ended in a crash, pilots consistently overestimated their likelihood of predicting the actual outcome, which may reduce any opportunity to learn. These findings suggest that two of the three cognitive biases explored in this study could influence a pilot’s perception of past events in ways that may adversely affect how they make future decisions that in turn may affect flight safety.
Walmsley added “I have been involved in aviation for many years; one of my key research areas has been the role cognitive biases play in pilot decision making. In aviation, when a poor decision is made, the consequence can be catastrophic, therefore having a greater understanding of what leads to poor decisions is an important step to improve aviation safety.” The study focused on flying “VFR into IMC” — when pilots operating under visual flight rules inadvertently fly into low-visibility conditions that require instrument flight.
“Although the eventual outcome for a close call is the same as for a positive outcome, considerably more luck may be required in the close‐call situation to achieve that outcome. This may limit the learning opportunity from close-call events and reinforce risky behaviour,” Walmsley said.
Outcome bias will be difficult to capture in a real-world situation, and creating a quantitative marker for it is even more troublesome; including this phenomenon in SMS may not be analytically feasible. However, the FAA’s SMS experts and individual airline ERBs may reflect on how to isolate this predictive factor. The regimen of this risk-reducing tool is to anticipate problems, and cognitive outcome bias may provide a powerful indicator of a consequential nascent error.