Toolbox Talks: Cognitive bias isn’t a thing; it’s a ton of things (Part 2)


Welcome back, or welcome if you’re just now jumping in. If you are new, or have slept since the last article, here is a quick summary of how we got here:

In Part 1, found here, we discussed a hospital study that looked at surgeries with adverse events like death or major complications. Most of those adverse events were the result of “human error,” and most of those were chalked up to lack of attention, lack of recognition, and cognitive bias. This makes it seem like “cognitive bias” is a single thing we should look out for, but it’s really a lot of things that could impact our behavior and result in human error. Thus the idea was born: walk through the different types of cognitive bias so that awareness, and maybe a human performance tool, could help us avoid them. Now here we are.

In no particular order:

Dunning-Kruger effect: The less you know, the more you think you know. This isn’t something born out of arrogance; it’s just that we often don’t know what we don’t know. We can’t objectively recognize our own inability or lack of competence. It’s not too hard to draw a straight line from this cognitive bias to human error. This is where deferring to expertise comes in handy. Don’t assume you have it handled when there is an expert available. Talk to the safety person, get with an engineer, or elevate it to your supervision if there is any chance there’s a gap in your knowledge.

Focusing effect: Putting too much importance on just one aspect of an event. If you’ve been in industry any amount of time, you’ve probably heard about the accident or error that “came out of left field.” While the person was focused on watching the tank level, the pump burned up. While the crane operator was focused on the suspended load, the tip of the boom contacted something. Yes, there are details and areas of concern to focus on, but we also can’t lose the big-picture view. This is where a good peer check will help. Having a second set of eyes goes a long way toward countering the focusing effect.

Hot-hand fallacy: Believing that past success gives you a greater chance of future success. This bias is what shapes our overconfidence and encourages us to let our guard down. “I’ve successfully knocked out three of these before lunch; I’m in automatic mode.” Your past success was due, in part, to your attention to the work. As soon as you start to feel like “I got this, been there done that,” your attention is allowed to drift and you are more likely to make an error. There is no “hot hand,” not even in basketball. Statistical studies have found no correlation between previous baskets made and the odds of making the next basket. I know that sounds wrong, because even the most casual “HORSE” player has been on a “can’t miss” heater, but it’s true. Treat each task as its own and don’t get overconfident when you start stringing together some “wins.”

Outcome bias: The tendency to judge a decision by its result, not by the choices made at the time. Human Performance teaches us that “bad behaviors, with a little bit of luck, can still get you the results you want.” Workers could take every shortcut in the book or violate every safety precaution and, with a little bit of luck, still get the job done without injury or error. Then the outcome bias kicks in and they think, “Well, we got the result we wanted, so our decisions must have been right.” This is what encourages workers to keep pushing their luck. Ask yourself whether you got the outcome you wanted because you planned for it or because you got lucky.

We still have a few more cognitive biases to cover but we’ll save those for Part 3. Stay tuned and we’d love your feedback at … and that’s not just our optimism bias talking.

Toolbox Talks offers quick insights and thoughts to use for your toolbox (tailboard) talks. Dave Sowers is a founding member of Knowledge Vine, a veteran-owned human performance training and consulting organization that strives to reduce the frequency and severity of human errors in the workplace. He has almost 30 years of experience in power generation and the utility industry. He is a veteran of the U.S. Navy Nuclear Power Program and holds a bachelor’s degree in resources management and master’s degrees in management and in emergency management and homeland security.