There was a joke being bandied about a few years ago:
- Speaker 1: Boeing has come up with the most important safety addition to the cockpit.
- Speaker 2: What, some incredible new computer?
- Speaker 1: No, something far more low tech.
- Speaker 2: I give up.
- Speaker 1: A dog to sit between the pilots!
- Speaker 2: What?
- Speaker 1: The dog was trained to bite the pilots when either of them tries to touch the instruments.
Humor may be a poor choice as an introduction to a serious subject, but the old, bad (?) joke does put a facetious focus on an increasingly recognized issue, as these three stories highlight.
First, a few key quotes from the articles below:
- Fortune article: “The report cites a study by Eurocontrol, the EU agency that coordinates air-traffic control for all of Europe. The study found that roughly 25% of pilots ‘failed to take the correct evasive action’ after receiving computer-generated warnings. That rate rose to 36% for follow-up alarms, according to the report.”
- Bloomberg Business: “An aborted takeoff last year on a US Airways plane in Philadelphia, which smashed down so hard it broke the landing gear, was triggered by the crew’s failure to enter the proper runway into a flight computer, among other errors, according to a preliminary investigation.”
- WSJ/Andy Pasztor: “Reacting to such commands, which typically pop up less than 30 seconds before a possible collision, roughly 8% of pilots did the opposite of what the technology commanded, such as pulling the plane up when the alert told them to push it down. Another 17% climbed or descended too slowly or too quickly, according to analyses by Eurocontrol, which handles and coordinates European air traffic.”
These quotes describe specific incidents and quantitative analyses. Since both US and European carriers are heavily invested in SMS, one would expect the airlines and the regulators to be quoted as saying something like "in response to these warnings we have initiated a vigorous program of…" But the articles stop without explaining what the SMS process has determined to be the proper remedial response.
A superficial, intuitive list of preventative actions might include:
- increased data entry training;
- more simulator time practicing reaction maneuvers in response to these computer alerts;
- educating pilots to recognize individual skill levels in computer interaction;
- an additional layer of computer protection to review whether the information entered is correct and to provide a specific command;
- or even something extreme, like hiring pilots for ab initio training with a computer emphasis.

The analytical capabilities of SMS would point the team toward measures more likely to correct the specific errors, so its output would be far more telling.
This issue seems to be precisely the type of problem that SMS should have discovered and for which its process should have determined the proper corrective action. It is curious that the answers are not being announced.