Posted on May 17, 2023

How a tweak in team psychology helped define modern airline safety

In 1978, United Airlines Flight 173 was headed from New York City to Portland, Oregon, with a stop in Denver. On approach to Portland, a mechanical failure caused the landing gear to deploy abnormally.

At the time, the captain was one of the most senior pilots on United’s staff and had been with the airline for almost 30 years.

The first officer (second in command), on the other hand, had about half the tenure and only a few months of experience on that particular aircraft. The flight engineer on the crew that day wasn’t much more experienced.

After hearing the gear slam into place and seeing an abnormal indicator light, the captain decided to delay the landing until the issue was identified. A little over an hour later, the aircraft lost all four engines and went down in a Portland suburb, tragically killing 10 onboard - miraculously, most survived.

But investigators with the National Transportation Safety Board were stunned…

“I was clearly very interested in how a highly experienced captain could fly around for over an hour in sight of the airport, with good weather, and not put this airplane on the ground safely” - Alan Diehl, Investigator and Human Factors Specialist with the NTSB

Though the flight engineer confirmed that mechanical indicators (common on planes at that time) suggested the landing gear was down and locked, the captain decided to delay landing to figure out why the indicator light showed a malfunction.

But they were facing a far deadlier threat - running out of fuel.

“The crew members were trying to get this captain’s attention, but he was apparently totally focused on the gear problem to the exclusion of all else” - Alan Diehl, Investigator and Human Factors Specialist with the NTSB

On the cockpit voice recorder, almost in the tone of “We’ve been trying to tell you this!”, the conversation shifts as the real problem becomes apparent…

United Flight 173 Aircraft Accident Report

They were no longer troubleshooting a landing gear issue; they were now dealing with total engine failure.

This tragedy could have been far worse - the plane came down in a wooded area of a Portland suburb - but investigators concluded that it was completely avoidable.

This accident was primarily due to human error - a lethal breakdown of communication in the cockpit.

On multiple occasions, the crew articulated the fuel problem but didn’t seem to feel comfortable challenging the captain’s decision to continue troubleshooting the landing gear warning light.

"Flying with a very senior captain, it would be very difficult to challenge that captain in those days about something like fuel." - John Cox, Commercial Pilot & Air Safety Consultant

In the 70s, there were a handful of commercial airline disasters attributed to cockpit communication failures. In 1972, Eastern Air Lines Flight 401 crashed into the Everglades on approach to Miami under strikingly similar circumstances: the crew fixated on a faulty landing gear indicator light while the aircraft descended.

Though we may believe we would challenge authority in these situations, even in dire circumstances, research demonstrates that we typically don’t. In reality, we have a deep desire for conformity.

A common theme in this research is our tendency to second guess and suppress our own perspectives based on our observation of leadership, peers, and social norms.

Changing how decisions are made in the cockpit

In response to these disasters, Alan Diehl and the NTSB turned to a training program at NASA that built on the work of David Beaty - author of ‘The Human Factor in Aircraft Accidents’.

NASA’s John Lauber called the program ‘Crew Resource Management’ (CRM), and it’s been a global standard in commercial aviation for over 30 years. Since its implementation, accidents of this kind have declined to near zero.

In fact, CRM has been cited as a contributing factor in outcomes like ‘The Impossible Landing’ of United Airlines Flight 232, where the entire crew was able to troubleshoot and respond quickly after a catastrophic engine failure knocked out the aircraft’s hydraulic controls. Even the famous landing of US Airways Flight 1549 in the Hudson River by Captain Chesley (Sully) Sullenberger is often referenced as an example of effective CRM.

What can we learn from Crew Resource Management?

CRM trains pilots and crews on threat detection, situational awareness, communication, and decision making.

It’s a set of guiding principles that encourages pilots to surface problems and challenge superiors while maintaining accountability and decisiveness.

“The principles of CRM are that no one individual in the cockpit can possibly understand and see all the threats that are out there. It requires the entire crew.” - John Cox, Commercial Pilot & Air Safety Consultant

It creates a leadership culture in the cockpit that actively expects every member of the crew to speak up without fear of being reprimanded.

"It helps to improve crew coordination and improve collective decision making in the cockpit" - Alan Diehl, Investigator and Human Factors Specialist with the NTSB

Most importantly, it fosters productive dissent - the ability for the crew to disagree and evaluate options that might oppose the captain while preserving the captain’s ultimate decision authority.

The principles of CRM have been adopted across many industries over the last four decades as leadership thinking has shifted from ‘The Great Hero Theory’ to more of a ‘Great Team Theory’. It’s also been expanded throughout aviation to include ‘Team’ and ‘Maintenance’ Resource Management, covering air traffic control and maintenance teams.

If you’re a pilot, you probably groaned at the mention of CRM - for many, it’s synonymous with an old-school grainy VHS training video.

But the principles, particularly around threat detection and how teams make decisions in high-stakes environments, are incredibly relevant to any team.

The most important lesson from CRM is quite simple: it took an implicit process, made it explicit, studied it, and improved it - resulting in a measurable impact on a broken system that literally saved lives.

Side note: one of the more interesting resources we found was this CRM handbook implemented with firefighting teams - notably, they call out ‘Barriers’ as a core principle. The most comprehensive resource we stumbled upon was this book on CRM.

Tools that help teams disagree

An interesting takeaway from the story of CRM is how, through a series of tragedies, an industry identified a human factors problem with group decision making: it breaks down in systems where teams can’t effectively disagree.

Over the next couple of posts, we’ll cover a series of tools that help teams disagree in productive, effective ways. These include:

  • Psychological Safety: Creating an environment for productive dialogue - particularly focused on a team’s ability to disagree effectively.
  • Speculative Design: ‘Mental time traveling’ can help teams take different points of view, evaluate possible futures, release inhibitions, and break out of their comfort zones.
  • Red Team Thinking: A toolkit for systematically fostering opposing views, surfacing contrarian perspectives, and actively challenging the status quo.
  • Nominal Groups: An async way to share feedback, information, and ideas that removes many of the biases that keep groups from being able to disagree effectively.

Are we missing something we should dig into over the next couple of weeks? Let us know!