Greek Philosophers may have the ANTIDOTE to EXPECTATION BIAS

JDA Aviation Technology Solutions

 

AOPA recently published an informative article about EXPECTATION BIAS in aviation. The author, Jeff Simon[1], focuses on the GA aspects with great illustrations of real-world cases where this cognitive phenomenon impacted safety. In the closing paragraphs, he offers 5 questions and 3 approaches that all aviation professionals should use.

To build on his observations, a broader approach may be based on this principle:

Expectation bias pushes pilots, maintainers, and controllers to see what they expect to see, while skepticism is the counter‑force that asks, “But what if I’m wrong?” Aviation safety depends on managing the tension between these two tendencies.

Seven ancient Greek philosophers[2] are credited with creating the SKEPTICISM SCHOOL, with this principle as its foundation—

“Nothing is beautiful nor ugly, nor just and unjust, and the attitude that nothing is true is equally valid for everything, and everything that people do is based on assumptions and habits because nothing is what it seems like, but is different than that.”

In more practical terms, the moral sounds like this:

Because equally compelling arguments can be made for and against any claim, the skeptic suspends judgment to avoid error and attain mental tranquility.

Expectation bias is one of those deceptively simple cognitive traps that shows up everywhere in aviation—quiet, persistent, and incredibly powerful.

  • It stems from cognitive shortcuts (heuristics) that help humans process complex environments quickly.
  • In aviation, it can cause pilots, maintainers, and controllers to miss anomalies, misinterpret cues, or continue unsafe plans.

Examples from aviation investigations:

  • A pilot continues to destination after an engine “stutter” because the engine briefly smooths out—classic wishful thinking.
  • A pilot departs from the wrong runway because they default to the runway they usually use.
  • A mechanic assumes a low oil‑pressure reading is a faulty sensor and keeps running the engine, worsening the situation.

To counteract this largely unconscious mental mechanism, skepticism serves as a deliberate habit of questioning assumptions, challenging first impressions, and verifying data rather than accepting it at face value.

In aviation human‑factors practice, skepticism is not cynicism—it’s disciplined doubt.

It shows up as:

      • “Let’s confirm that.”
      • “Does this match all the evidence?”
      • “What else could this be?”
      • “Is this consistent with the checklist?”

Skepticism is a core component of Crew Resource Management (CRM) and active confirmation practices recommended by FAA safety programs.

Active Confirmation

        • Verbalize and cross‑check data (frequencies, coordinates, runway assignments).
        • Do not assume—verify.

Crew Resource Management (CRM)

        • Encourage challenge‑response culture.
        • Empower junior crew to question assumptions.

Checklist Discipline

        • Avoid interruptions during preflight or maintenance.
        • Restart the checklist if interrupted.

Aviation culture often celebrates confidence and decisiveness, but the safest operators cultivate structured skepticism—a habit of questioning that protects against the brain’s natural tendency to oversimplify.

Expectation bias is inevitable.
Skepticism is optional.
Safety depends on making it non‑optional.

That is great textbook information, but everyone involved in aviation safety must activate this awareness every day, all day, without waiting for a risk event to trigger the doubt “synapse.” Why? Skepticism isn’t a mood; it’s a switch you have to deliberately flip. The safest operators don’t rely on personality or “being in the right mindset.” They build a daily activation ritual that forces skepticism to turn on before they enter the operational environment.

Before the first operational task of the day, ask yourself:

“What assumption am I most likely to get wrong today?”

This single question does three things at once:

      • Interrupts autopilot thinking
      • Activates metacognition (thinking about your thinking)
      • Primes your brain to look for disconfirming evidence

It’s the mental equivalent of flipping the battery master switch.

Aviation psychology shows that expectation bias thrives when the brain is unchallenged. A Cognitive Forcing Question pushes the brain to:

      • Slow down
      • Scan for weak signals
      • Treat familiar tasks as potentially deceptive
      • Re-engage the analytical system instead of the automatic one

It’s a micro‑dose of Pyrrhonian skepticism applied to a modern cockpit.

A Few Variants You Can Use Daily

“What here could fool me today?”

“What am I assuming without evidence?”

“If I’m wrong about something, where will it be?”

“What would I see if the opposite were true?”

“What’s the first thing today that deserves a second look?”

This insidious mental mechanism can create risks of great magnitude, and because the brain does not detect when EXPECTATION BIAS strikes, the suggested mantras are a useful way to ensure that your SKEPTICISM FUNCTION REMAINS VIGILANT.

Aircraft Maintenance: The dangers of expectation bias

March 4, 2026 | By Jeff Simon

Humans are naturally wired to recognize patterns, and to perceive patterns that fit our expectations. When flying or maintaining an aircraft, perceiving what we expect to see and missing what is right there in front of us can be costly—or even fatal.

EXPECTATION BIAS, a well-established idea in aviation psychology, is a cognitive phenomenon in which our experiences, preconceived beliefs, or desires influence how we perceive, interpret, or act on information. This is very closely linked to optimism bias in which an evaluation of a situation is influenced by our desires instead of an analysis of facts. We commonly refer to this as “wishful thinking,” yet we rarely acknowledge when we are guilty of it. Human beings are preprogrammed to seek results that match our expectations and desires, and this creates a dangerous bias that can lead to faulty perception and errors in judgment.

Consider a Cirrus crash in 2022: The pilot contacted air traffic control and reported a brief “stutter” of the engine, followed by rough running. He requested a diversion to a nearby airport en route to his destination. However, as he neared the alternate airport, the engine smoothed out. Rather than land, the pilot chose to revert to his original plan and continue to his original destination. About 15 nautical miles from that destination, the engine began running rough again, followed by a catastrophic engine failure. The pilot activated the airframe parachute and landed in a field. He and the passenger survived, but the aircraft was destroyed. The pilot changed his action plan based on the unrealistic hope that the engine problem had spontaneously resolved and the flight had somehow magically become safe enough to continue.

I have witnessed several cases of questionable maintenance-related decisions that I would attribute to the same expectation bias or wishful thinking. In one case, an aircraft landed with a completely flat main-gear tire. The pilot was anxious to get back to his home airport rather than suffer the delay of performing a repair away from home. The pilot chose to add air to the tire, evaluate whether it appeared to hold pressure, then depart.

In another case, an aircraft owner noted very low engine oil pressure in flight. They landed and brought the airplane to a mechanic for evaluation. Thinking (or hoping) that the low oil pressure was caused by a false indication, the mechanic ran the engine repeatedly while testing and ruling out the sensor and display. Only after confirming that the low oil pressure reading was accurate did the mechanic remove the pressure relief valve and discover a mechanical failure.

In a final case, a mechanic was attempting to identify the cause of an intermittent in-flight engine failure. Although it could not be reproduced on the ground, the mechanic made several attempts to diagnose and correct the problem. Each time it appeared that the issue had been resolved, the failure reappeared during subsequent test flights. Ultimately, the aircraft crashed during one of the test flights, severely injuring the mechanic.

In each of these cases, the individuals involved instinctively chose to accept risk in exchange for indulging their expectation or optimism bias.

In the case of the flat tire, the pilot’s desire to return home allowed him to rationalize that simply adding air would remedy the problem. He avoided confronting the question of how all the air in the tire had been lost, and downplayed the risk of landing with a flat tire, should the air leak out again during the flight home. I’m not sure how the story ended, but even an uneventful landing would only serve to reinforce a risky approach to aviation safety.

In the case of the low oil pressure issue, the mechanic’s bias toward a failure in the indication system, rather than a mechanical failure, resulted in repeatedly running the engine with a questionable oil supply. While this may not have posed an immediate safety risk, it did place the long-term health of the engine at risk. In the end, it only took a few minutes to remove the oil pressure relief valve and identify the mechanical failure.

The case of the engine failure was the most tragic. Without the ability to faithfully reproduce the failure on the ground, the mechanic chose to conduct a test flight based on the expectation that the problem had been remedied, nearly losing their life in the process.

How to protect against expectation/optimism bias

Awareness: Simply recognizing our natural desires and expectations can help break the cycle of poor decision making. If you acknowledge that you can be influenced into unsafe situations, you are more likely to implement steps to avoid them.

Questioning: When you are analyzing a situation and developing a plan of action, ask yourself:

  • Am I being objective, or is this wishful thinking?
  • Have I acknowledged all the facts available to me?
  • Are any of my assumptions illogical or highly unlikely?
  • What are the consequences if I am wrong?
  • What options do I have to reduce or eliminate this risk?

Systematic approach: Follow the FAA’s guidelines for using a risk assessment matrix when approaching every operational or maintenance situation.

  • Identify the potential consequences and severity of the risk in a worst-case scenario.
  • Assess the probability of the risk.
  • Develop a plan of action that acknowledges the facts and mitigates the risk.
  • Establish clear boundaries that ensure you will never knowingly place your life at risk.
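
To make the matrix concrete, here is a minimal sketch of how a severity-by-likelihood lookup can be organized. The category names and risk levels below are illustrative, drawn from common FAA-style risk matrices rather than reproduced from any specific FAA table; consult current FAA guidance for the authoritative version.

```python
# Illustrative sketch of an FAA-style risk assessment matrix lookup.
# Category names and risk levels are assumptions for illustration only;
# see current FAA guidance for the authoritative matrix.

LIKELIHOOD = ["probable", "occasional", "remote", "improbable"]    # most to least likely
SEVERITY = ["catastrophic", "critical", "marginal", "negligible"]  # worst to least severe

# Rows = likelihood, columns = severity. Values are the assessed risk level.
RISK_MATRIX = [
    # catastrophic, critical,   marginal,  negligible
    ["high",    "high",    "serious", "medium"],   # probable
    ["high",    "serious", "medium",  "low"],      # occasional
    ["serious", "medium",  "medium",  "low"],      # remote
    ["medium",  "medium",  "low",     "low"],      # improbable
]

def assess_risk(likelihood: str, severity: str) -> str:
    """Return the risk level for a given likelihood/severity pair."""
    row = LIKELIHOOD.index(likelihood)
    col = SEVERITY.index(severity)
    return RISK_MATRIX[row][col]

# Example: a remote-probability but catastrophic-severity failure (such as
# loss of engine lubrication) still assesses as serious, which is why an
# in-flight abnormality in such a system warrants an immediate landing.
print(assess_risk("remote", "catastrophic"))  # -> "serious"
```

The point of the sketch is the discipline it encodes: severity is assessed first against the worst-case outcome, probability second, and the resulting risk level drives the plan of action and the boundaries you set before the flight.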

The highest-severity risks include engine lubrication, fuel delivery, propeller operation, flight controls, electrical system, and fire. For these items, an in-flight abnormality warrants immediate landing. In other cases, you must carefully evaluate the probability and risk of failure, and make a disciplined decision that prioritizes safety while preserving an alternate plan should the situation deteriorate. On the ground, approach maintenance issues with even more scrutiny. You should never fly an aircraft unless you are completely confident in its safety.

As pilots and mechanics, we routinely encounter situations that do not always have clear solutions. OBJECTIVITY, HOWEVER, IS THE KEY TO SAFETY. Approach problems with a logical plan—one that you are comfortable saying out loud to yourself and your passengers, that addresses and acknowledges your internal bias, and prioritizes the safety of everyone on board. Until next time, I hope you and your families remain safe and healthy, and I wish you blue skies.

 


 

[1] A&P mechanic, IA, pilot, and aircraft owner. He has spent the last 22 years promoting owner-assisted aircraft maintenance.

[2] Pyrrho (365-275 BCE); Timon (320-230 BCE); Arcesilaus (315-240 BCE); Carneades (214-129 BCE); Aenesidemus (80-10 BCE); Agrippa (1st century CE); Sextus Empiricus (2nd century CE)

 

Sandy Murdock
