Consider two working environments: both carry dire consequences for minor errors; both require that highly skilled, strong-willed professionals work together as teams; and both now rely heavily on technology to perform precise tasks, with interfaces that sometimes fail. Doctors and pilots are considered quite disparate occupations, but as the title of this article suggests, the similarities may be of sufficient consequence to justify cross-training.
Mr. Hunter’s opening paragraph makes the case for increased vigilance against errors. He points out that
“Patient safety has always been a major issue in healthcare, and the call for more concrete solutions became louder after medical errors were cited to be the third leading cause of death (if it were to be included in the list of reasons for patient mortality). According to AHRQ’s research report, Patient Safety Initiative: Building Foundations, medical error reports state that among several reasons for mistakes in care provision, failures in communication tops the list. Digging deeper down the list, you can discover other causes of errors, such as lack or disrupted flow of information, deficiencies in employee education and training, problematic work flows, and inadequate policies and procedures regarding patient safety.”
Those observations can be traced to the seminal work of the Institute of Medicine’s Committee on Quality of Health Care in America: To Err Is Human (Linda T. Kohn, Janet M. Corrigan, and Molla S. Donaldson, editors), which
….breaks the silence that has surrounded medical errors and their consequence–but not by pointing fingers at caring health care professionals who make honest mistakes. After all, to err is human. Instead, this book sets forth a national agenda–with state and local implications–for reducing medical errors and improving patient safety through the design of a safer health system.
This volume reveals the often startling statistics of medical error and the disparity between the incidence of error and public perception of it, given many patients’ expectations that the medical profession always performs perfectly. The book carefully examines how the surrounding forces of legislation, regulation, and market activity influence the quality of care provided by health care organizations, and then looks at how those organizations handle medical mistakes.
Mr. Hunter cites the aviation industry’s use of Crew Resource Management. In 1979, NASA psychologist Dr. John Lauber labeled this training tool CRM. His research delved into the dynamics of cockpit communications. One of his findings was that many civilian pilots had been trained in the military, where the chain of command was an important discipline. CRM taught a less authoritarian cockpit culture, in which co-pilots were encouraged to question captains if they observed them making mistakes.
The operating room is filled with trained individuals, each with separate duties, all with collaborative responsibilities, with a hierarchy and usually with one person in charge. Dr. Lauber’s concepts, as refined by successive iterations of CRM technique, should serve as models for training during the education of all of these medical personnel, and later on a recurring basis at hospitals. Perhaps, like the CRM training given in simulators, the teamwork of the operating room could be improved through similar simulator exercises.
The article also points to the FAA’s recurrent training and proficiency evaluations, and to aviation’s use of remedial skill curricula when deficiencies are found. The author recommends that the medical professions adopt these practices.
What is missed is the FAA’s recently instituted Safety Management System (SMS) discipline. Many credit this preventative approach with much of aviation’s recent safety improvement. SMS collects massive amounts of data, then performs highly sophisticated trend analysis to identify risks, with the goal of addressing them before they become problems.
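As a rough illustration of the kind of trend analysis SMS performs, a minimal sketch (with a hypothetical data shape and threshold, not an actual SMS implementation) might flag event categories whose report counts in the most recent period rise sharply above their historical baseline:

```python
from collections import defaultdict

def flag_rising_risks(reports, threshold=1.5):
    """Flag event categories whose count in the latest period exceeds
    `threshold` times their average count over all earlier periods.

    `reports` is a list of (period, category) tuples, e.g.
    (1, "runway incursion") -- a hypothetical data shape for illustration.
    """
    # category -> period -> number of reports
    counts = defaultdict(lambda: defaultdict(int))
    periods = set()
    for period, category in reports:
        counts[category][period] += 1
        periods.add(period)

    latest = max(periods)
    earlier = [p for p in periods if p != latest]

    flagged = []
    for category, by_period in counts.items():
        if not earlier:
            continue  # no baseline to compare against
        baseline = sum(by_period.get(p, 0) for p in earlier) / len(earlier)
        recent = by_period.get(latest, 0)
        if baseline > 0 and recent / baseline >= threshold:
            flagged.append(category)
    return flagged

# Example: category "a" doubles in period 3 and is flagged;
# category "b" stays flat and is not.
reports = [(1, "a"), (1, "a"), (2, "a"), (2, "a"),
           (3, "a"), (3, "a"), (3, "a"), (3, "a"),
           (1, "b"), (2, "b"), (3, "b")]
print(flag_rising_risks(reports))
```

Real SMS analytics are far more sophisticated, but the design idea is the same: compare recent incident rates against a baseline so emerging risks surface before they cause harm.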
One of the key reasons cited for SMS’s success is the collaborative, 360° process used to identify practical, immediate solutions. The team that reviews the list of identified risks is not limited to the prime actors (pilots, mechanics, flight attendants) but expands the perspectives, and thus the range of solutions, to include collateral disciplines: personnel, training, purchasing, finance, legal, and so on. Thus, for example, if a recurring problem in parts QA review is the inspector’s failure to identify minute deviations in measurements, the personnel department might suggest adding greater visual acuity testing to the hiring process for future QA inspectors.
Hospitals and medical training programs are already using more teamwork exercises, so using SMS on a more routine basis might be possible. Hospital culture may resist 360° problem solving, but merely putting all of the disciplines in a room with a good facilitator may level the participants and thus stimulate creative, practical ideas for fixing problems.
Relevant to both endeavors, and to the pursuit of reducing human error, is Michael Lewis’s book The Undoing Project. As a review of the book describes, the author traces ground-breaking research into decision making:
“Forty years ago, Israeli psychologists Daniel Kahneman and Amos Tversky wrote a series of breathtakingly original studies undoing our assumptions about the decision-making process. Their papers showed the ways in which the human mind erred, systematically, when forced to make judgments in uncertain situations. Their work created the field of behavioral economics, revolutionized Big Data studies, advanced evidence-based medicine, led to a new approach to government regulation, and made much of Michael Lewis’s own work possible. Kahneman and Tversky are more responsible than anybody for the powerful trend to mistrust human intuition and defer to algorithms.”
The lessons of this book should help medicine and aviation explore even further advances in risk reduction and error prevention.