Toolbox Talks: Curse of knowledge, blind spots, and other cognitive biases that impact safety

Welcome to the final part of our cognitive bias trilogy, “Cognitive Bias isn’t a thing; it’s a ton of things.”

Previously, we discussed a hospital study that looked at surgeries with adverse events like death or major complications. Most of those adverse events were the result of “human error,” and most of those errors were chalked up to lack of attention, lack of recognition, and cognitive bias. This makes it seem like “cognitive bias” is a single thing we should look out for, but it’s really a lot of things that could impact our behavior and result in human error. This led us to look at the different types of cognitive biases.

In part 1, we discussed anchoring bias, confirmation bias, groupthink, and status quo bias.

In part 2, we covered the Dunning-Kruger effect, focusing effect, hot-hand fallacy, and outcome bias.

And now, in no particular order:

Risk compensation: When safety measures increase, we tend to take greater risks than those measures can offset. In other words, we think we are more protected than we actually are. One real-world example comes from a study of anti-lock brakes: drivers of cars with anti-lock brakes were more likely to speed and tailgate than drivers of cars without them. Similarly, rugby and tackle football have about the same rate of concussions even though only one group wears helmets. The thinking is that football players risk head impacts more often because risk compensation has them believing the helmet protects them more than it actually does. The same could be true for any safety measure: hard hats, safety glasses, or hand protection may make you feel more protected than you are and more likely to take greater risks.

Reactance bias: This is the urge to do the opposite of what someone tells you because you feel they are constraining your freedom. We think of reactance bias when we think of teenage drinking or smoking: in an effort to assert their own freedom, teens will do the opposite of what the adults tell them. As immature or self-defeating as that might sound, reactance bias can still impact some adult workers. This is why any “Change Management 101” process will tell you to involve, early on, the people who are most likely to resist. Make them leaders of change instead of leaders of the resistance; if it’s perceived as their idea, they will be less susceptible to reactance bias. If you’re feeling reluctant to embrace a new initiative or requirement, ask yourself, “Do I hate the idea or do I hate the lack of choice?”

Curse of knowledge: A person with more information finds it difficult to think about a problem from the perspective of a person with less information. This bias sets us up for failure because the more knowledgeable person will omit information they assume everyone knows. The missing information forces others to make assumptions or “fill in the blanks.” When this happens, things are missed and mistakes are made. If you are the person giving instructions, don’t make assumptions about the other person’s knowledge. Ensure everything is clearly understood, even the things you consider “common knowledge.” If you are the person receiving the information, ask questions to fill in the blanks; don’t try to “figure it out” on your own.

And last but not least…

Bias blind spot: This is the tendency to see the influence of biases on others while not seeing their impact on yourself. One study asked respondents whether they were less biased than the average American; 85% thought they were. You don’t need to be a statistician to see the problem with those numbers. A bias blind spot does not necessarily make you riskier than the average person; after all, most biases have an influence because they go unrecognized and unchallenged. However, people with bias blind spots are less likely to heed the advice of others or to benefit from training designed to eliminate the influence of a particular bias. Again, you can’t fix it if you can’t see it.

The term “cognitive bias” gets thrown around a lot but is seldom explored. We often lump a bunch of very distinct concepts into one vague idea called “cognitive bias.” The best way to avoid being influenced by a cognitive bias is to know exactly which types might be impacting you. We hope this series of articles has helped you think about the influence of specific biases, and that this recognition will lessen their impact on your safety. If not, maybe you should consider the impact of your pessimism bias.

Toolbox Talks offers quick insights and thoughts to use for your toolbox (tailboard) talks. Dave Sowers is a founding member of Knowledge Vine, a veteran-owned human performance training and consulting organization that strives to reduce the frequency and severity of human errors in the workplace. He has almost 30 years of experience in power generation and the utility industry. He is a veteran of the U.S. Navy Nuclear Power Program and holds a bachelor’s degree in resources management as well as master’s degrees in management and in emergency management and homeland security.