What the 1996 Everest Disaster Teaches About Leadership
Leading a team requires more than just an ability to make decisions. To be a strong leader, you need to understand how people tick. What is your team capable of, and what problematic behaviors might they fall into? More importantly, do you know when your own decision-making and leadership abilities might be compromised, so you can head catastrophe off at the pass?
Michael A. Roberto’s article “Lessons From Everest” illuminates some essential lessons on leadership, drawing on the psychology of the 1996 Everest disaster. During an attempt to summit Everest in 1996 -- immortalized in Jon Krakauer’s book Into Thin Air -- a powerful storm swept the mountain, obscuring visibility for the 23 climbers as they descended to camp. Five people died on the mountain that day, the most disastrous tragedy on Everest until an avalanche in 2014.
Of course, there are many conflicting ideas about what contributed to the disaster, and the climbers were certainly well aware of the inherent danger of climbing Everest.
But Roberto’s article calls into question the decision-making abilities of experienced trek leaders Rob Hall and Scott Fischer. Drawing from behavioral decision-making research, Roberto concludes that the two leaders were operating under some common cognitive biases that likely contributed to the Everest disaster.
These common and predictable cognitive biases impair judgment and the choices people make, and they are especially disastrous in leaders. I’ve condensed some common cognitive biases that Roberto found in action on Everest. If you want to be a truly powerful leader, take the time to learn more about your decision-making -- you may even recognize a few of these biases in yourself.
The sunk-cost effect.
People who are operating under the sunk-cost effect tend to double down on their commitment to a course of action in which they have invested considerable resources, like time, money or effort. The sunk-cost effect causes normally rational people to throw away common sense, allowing past investment decisions to affect their future choices, “despite seeing consistently poor results,” writes Roberto, escalating what could have been a minor problem into a catastrophe.
All three of these resources -- time, money and effort -- had been spent on Everest. The trek cost more than $70,000. Each climber had spent years training and preparing, and the final push to the summit, following weeks of difficult acclimatization and the hike to base camp, was over 18 hours long. That push was incredibly dangerous and required perfect timing so that auxiliary oxygen bottles would last and climbers would not get caught in darkness on their return to camp. Hall and Fischer “knew that individuals would find it difficult to turn around after coming so far and expending such an effort,” and past Everest guides even reported climbers laughing in their faces when told that they would not be able to summit.
Hall and Fischer, despite mentioning numerous times that climbers would be turned around if they could not make the summit by 1:00 pm or 2:00 pm at the very latest, did not turn the climbers around. None of the 23 climbers made it to the summit by 1:00 pm, and only six climbers made it by 2:00 pm. Roberto writes: “Doug Hansen, a climber on Hall’s team, expressed this very sentiment during his final descent: ‘I’ve put too much of myself into this mountain to quit now without giving it everything I’ve got.’ Hansen had [previously] climbed the mountain with Hall’s expedition in 1995, and Hall had turned him around just 330 vertical feet from the summit.”
Unfortunately, Hansen did give the mountain everything he had. He did not reach the summit until after 4:00 pm, and he perished on his way back to camp. Hansen’s thinking was clouded by the sunk-cost effect, and he paid with his life. Other climbers, suffering severe blindness and sickness, still pressed on.
The sunk-cost effect can be powerfully motivating to people -- experts and novices alike -- who should have, and otherwise would have, changed course to a more logical path. As a leader, your awareness of this cognitive bias is necessary to save you and your team from massive failures and even tragedies.
The overconfidence bias.
According to Roberto, leaders in fields as widely varying as medicine, engineering, academia and business commonly exhibit an overconfidence bias. Overconfidence is fairly normal among high-achieving individuals, because strong confidence is necessary to reach such heights of success and expertise. The remarkable achievement of starting and running a multi-million-dollar business, or of planning and executing one of the most difficult climbs on the planet, requires more confidence than most people possess.
However, that same overconfidence can circle back around to create distressing problems. Hall had made the summit of Everest four times and led 39 climbers to the top, leading him to believe that he could not fail. In fact, Krakauer records Hall expressing his belief that some future team would experience disaster on Everest, though Hall did not believe that disaster would befall his own team. He worried only that his team would be the one called on to rescue the hypothetical struggling party. Hall’s overconfidence blinded him: he could not see that he might be the one to fail.
Successful leaders often have an overabundance of confidence, so this cognitive bias is essential to understand. While confidence is crucial for any leader, it should not preclude preparing comprehensively for possible failures. You should never be caught off guard.
The recency effect.
The recency effect is a cognitive bias that leads decision-makers to rely heavily on the most readily available information and evidence -- particularly information that appeared most recently. Instead of seeking out all possible sources of information and weighing them equally, people under this bias wrongly believe that recently presented evidence is good enough and will hold. Roberto gives the example of a study in which chemical engineers misdiagnosed a product failure “because they tended to rely too heavily on causes that they had experienced recently.”
This critical error played out on Everest in Hall and Fischer’s incorrect assumption that the weather would be calm and agreeable. They had both led expeditions on Everest for several previous seasons of agreeable weather; however, those seasons were the outlier, not the norm. For many seasons prior to Hall and Fischer’s expeditions, storms were the norm -- in fact, for three consecutive years in the mid-eighties, no one made the summit because of terrible winds. Both guides failed to look at past weather patterns and did not realize that they had been climbing in unusually calm conditions.
Effective leaders must take all information, not just recent results, into account when making high-pressure decisions. This bias is also exacerbated when interacting with the overconfidence bias, as high-powered leaders who already have a propensity for overconfidence are more likely to assume that the most recent evidence is good, having been correct in the past.
Roberto goes into more depth about the causes of the tragedy on Everest (and I highly recommend you read the entire study for even more insight into leadership dynamics and group communication), but it’s clear that if Hall and Fischer had possessed a working knowledge of cognitive biases, they may have been able to avert the disaster.
Your effectiveness as a leader depends on your ability to foresee small problems before they grow into uncontrollable complications. Doing the work to identify your own problematic areas can not only make you a better leader, but can help others become better leaders, too.