Toolbox Talks: Cognitive Bias isn’t a thing; it’s a ton of things (Part 1)


Not to pick on hospitals, but stories from that industry just keep popping up in the news. Last week the results of a hospital study on human error were released. You can read the article here, but this is the CliffsNotes version. For six months, researchers studied over 5,300 surgery results at three hospitals. Of those surgeries, 188 resulted in an adverse event like a death or a major complication. Over half of those events (106) were the result of “human error.”

We know from a previous article that hospital errors are the third leading cause of death in the United States, but what makes this study worth mentioning is that it breaks down the types of human error involved. Over half of the 106 adverse events caused by human error were cognitive, meaning there was a lack of attention, a lack of recognition, or a cognitive bias present. Errors involving communication, teamwork, and systems were relatively few by comparison.

“Lack of attention” and “lack of recognition” are what we in human performance commonly call situational awareness, our go-to term for “just lost focus.” I think we are all familiar with what the researchers are saying there, but what do they mean by “cognitive bias,” and should we be thinking more about how it influences us?

The term “cognitive bias” is even more of a catch-all than “situational awareness,” but simply put, a cognitive bias is a break in our normal, rational judgment. The thing is, there are literally dozens of reasons why we stray from rational thinking, and they all get lumped under “cognitive bias.”

Over the next few articles, we will look at some of the cognitive biases that could influence us into making an error. I’ll keep these definitions short, but see if any of them are traps you could fall into yourself. Usually, just being aware of a bias will help you avoid its influence, but I’ll pepper in some human performance tools that can help as well.

In no particular order:

Anchoring Bias: giving too much weight to one piece of information, usually the first piece you were given. The pre-job brief told us how the work should go today, and it’s hard to let go of that idea even though we have been getting new information as the day progresses. Use a good questioning attitude to challenge whether you are putting too much weight on that earlier information.

Groupthink or Bandwagon Effect: doing or believing something because many other people do. If it looks like everyone has the same idea, take a step back and evaluate whether a contrary opinion could benefit the process.

Confirmation Bias: the tendency to pay attention only to information that confirms your preconceptions. Oh, boy. Look at MSNBC’s and Fox News’ coverage of the same event. They will usually come to vastly different conclusions based on their confirmation biases. Ask yourself: “Did I objectively arrive at this conclusion, or did I start with the conclusion and work my way backward?”

Status Quo Bias: the tendency to like things just the way they are; no need for change. This bias prevents us from truly improving. I watched a supervisor push back on the status quo bias in a subtle way. His workers were arguing against a more cumbersome, but safer, work process, saying, “We’ve never done it this (new) way!” The supervisor said, “Oh, good. It’s hard to know the exact day a team improves, but you’re right: today is that day. Thank you!” The workers suppressed you-got-me smiles, and the work moved on per the supervisor’s plan.

There are so many more that we are making this a three-parter. Stay tuned, and we’d love your feedback at knowledgevine.com.

Toolbox Talks offers quick insights and thoughts to use for your toolbox (tailboard) talks. Dave Sowers is a founding member of Knowledge Vine, a veteran-owned human performance training and consulting organization that strives to reduce the frequency and severity of human errors in the workplace. He has almost 30 years of experience in power generation and the utility industry. He is a veteran of the U.S. Navy Nuclear Power Program and holds a bachelor’s degree in resources management, as well as master’s degrees in management and in emergency management and homeland security.