Consider what it takes to summit Mount Everest. The journey involves careful calculations based on numerous factors, such as weather patterns, the body’s response to extremes in altitude and temperature, the number of climbers attempting to summit and much, much more.
One of the most dreadful examples of the complexity of climbing Mount Everest occurred in May 1996. Over the course of two days, eight climbers died – some while attempting to summit the mountain and others while descending from the peak to basecamp. In his paper, “Lessons from Everest,” Michael Roberto proposes that three main forces contributed to the disaster: cognitive bias, group dynamics and the complexity of climbing the mountain itself. Cognitive bias – systematic errors in thinking that affect the decisions and judgments people make – can have a disastrous effect even when the stakes don’t involve scaling a mountain. Understanding the different ways our own patterns of thinking can lead to poor decision-making can improve our chances of reaching goals more often over time.
The bottom line is this: If preparation and planning are at all lacking, or if the smallest thing goes awry during the climb, the attempt to reach the summit can easily fail.
Complex problems have complex solutions.
I’ve never climbed Mount Everest, nor do I really have the desire to do so, but success in my line of work requires the same approach. The truly rewarding thing about a career in medical quality advancement is knowing I am improving medical outcomes for beloved family pets visiting Banfield hospitals. Our team gathers data, case studies and anecdotal evidence about every aspect of our medical practice, and learns from industry organizations and medical professionals across the spectrum of human and animal healthcare to help ensure we are providing reliable, quality care in our more than 1,000 Banfield locations across the United States and in Puerto Rico.
What’s the connection with the 1996 disaster on Mount Everest? It all comes back to basic human thought processes and how they can lead to flawed decision-making. The same modes of thinking lead to “stories of one” in healthcare. I hear about these events across the industry – stories of individual actions that result in some level of harm to the patient – and while they are sometimes tragic, they inevitably reveal opportunities to learn and improve, to help increase the chances that these events don’t recur. Advancing medical quality means examining the full system to determine what happened, but more importantly, to learn how to improve.
What is Cognitive Bias, and How Does It Lead to “Stories of One”?
In behavioral decision theory, cognitive bias is the concept that human nature often clouds our ability to make good decisions. Human decision-making can rely too heavily on past efforts, experience, skill or knowledge, rather than clear and complete evaluation of the reality of a situation. In the medical profession, cognitive bias can lead to actions that may ultimately impact patient health and safety. Diving deeper into the three specific types of cognitive bias contributing to the Everest disaster, we can see parallels in hospital settings. By becoming aware of these biases and how they affect our behavior when we’re interacting with our clients and patients, we can proactively improve and ultimately limit “stories of one.”
“But We’ve Come So Far!” – The Sunk Cost Effect Bias
The further you get from the start of a journey, the harder it is to turn around. People tend to “throw good money after bad” because the alternative can carry with it the feeling of defeat. During the disaster on Mount Everest, an extremely high number of climbers were attempting to make the summit during a window of clear weather, causing delays at narrow steps and complicating traverses on the mountain. This meant they spent more time on the mountain than anticipated, summited Everest later in the day than was deemed safe and descended in the dark, in blizzard conditions, with several of the climbers having consumed all of their spare oxygen during the longer ascent. All of these components played a role in the resulting tragedy.
In hospital settings, the sunk cost effect bias can present itself in similar ways. Although it can feel like admitting defeat, we must remember that, just as there is value in an effort already made, there is value in stopping to consult with other veterinarians or veterinary specialists. There are times when we must stop a procedure during which abnormalities are occurring and pursue additional diagnostic testing and evaluation rather than push the envelope and risk patient safety.
“We’ve Done It a Thousand Times” – The Overconfidence Bias
Talent, education and experience all combine to create confidence. But as people succeed over time, confidence in their ability to make the best decisions can sometimes lead to unanticipated complications. No amount of experience can completely insulate anyone against error.
The leaders of the two Everest expeditions who experienced the greatest losses were incredibly skilled and confident climbers who had summited the mountain and led large expeditions multiple times. Despite the reality that more than 120 experienced climbers have died attempting to summit Everest since 1922, these expedition leaders were unwilling to believe they could be unprepared for any aspect of the climb, and thus they found themselves exposed high on the mountain late in the day and far from safety in a surprise storm.
In hospitals, overconfidence can be an issue as well. Setting up the organization in a way that limits the opportunity for overconfidence to drive decision-making (for example, by requiring certain steps to be taken during a clinical evaluation or during administration of medication) can reduce the potential for overconfidence bias to negatively impact patient outcomes.
“It’s Been Going So Well!” – Recency Effect Bias
Human beings have a tendency to focus on their most recent experiences. For example, if we have had recent success, we anticipate success in the future regardless of our longer track record. This can be dangerous, as it can lead people to believe they will be successful at a specific task, regardless of what it entails. The recency effect is a small part of what’s known in behavioral decision theory as the availability heuristic – the mental shortcut we take based on the events that are recalled most easily, such as recent or significant experiences.
For several years leading up to 1996, climbers on Mount Everest experienced remarkably good weather. Both of the lead climbers in the two previously mentioned expeditions had been on the mountain during those years. These climbers may have ignored the turnaround time and the deaths that had occurred in the past when climbers began their descents later in the day. They may have believed that, even as the hours passed, the weather would stay clear and they would be able to descend safely.
Veterinarians may experience this bias most often with routine procedures, such as anesthesia, and may presume that because all similar procedures have gone well for the last month or even the last year, they are likely to go well in the future. But basing decision-making on past experiences, rather than on the current situation, sometimes leads to oversight or error. What’s more, while it may not sound dangerous to use past experiences to guide future decision-making, the recency effect bias can be especially pernicious when it operates alongside other biases. For example, if a veterinarian multitasks during a procedure or strays from a process or protocol without negatively affecting the outcome, that veterinarian may presume that similar actions in the future will have similar outcomes (we call this “normalization of deviance”). Combined with the recency effect bias, this mode of thinking can lead individuals to look to recent nonstandard actions as reference points for future decisions, believing that those actions will not have a negative impact.
It is to prevent those potential “story of one” instances that we institute processes and procedures that discourage cutting corners and reduce the opportunity for such events to occur.