Toolbox Talks: Human error in the news


People magazine published an interesting article about “human error” this past weekend.

You can read about it here, but this is the CliffsNotes version: Sarah Boyle, a young mother, went to the doctor because she was having trouble breastfeeding. A biopsy and a scan were taken, and she was diagnosed with breast cancer. She underwent a double mastectomy, reconstructive surgery, and aggressive chemotherapy.

A year later she found out that she had been misdiagnosed and never had breast cancer. The hospital admitted the misdiagnosis was due to a pathologist incorrectly recording the results of the biopsy sample, and it apologized, attributing the mistake to “human error.”

One simple clerical error, and this poor woman’s life has been forever changed. Not only did she have to endure the rigors of unnecessary surgery and chemotherapy, but now there are long-term concerns about the type of implant used in the reconstruction and the chronic effects of the chemo.

This is what a hospital representative had to say: “A misdiagnosis of this kind is exceptionally rare and we understand how devastating this has been for Sarah and her family. Ultimately, the misreporting of the biopsy was a human error so as an extra safeguard, all invasive cancer diagnoses are now reviewed by a second pathologist.”

This statement is what got my attention. First, yes, a misdiagnosis is probably rare, but that mindset indicates the wrong organizational culture when it comes to errors. They are looking at the likelihood of an error and not the consequences. Odds are there won’t be a misdiagnosis today but, if there is, the consequences could be horrific. Workers look at risk the same way: “What are the odds I’ll get hurt? Exceptionally rare? I’ll take it!” If the likelihood is low, we don’t worry about the consequences, but that is looking through the wrong lens.

Consider this: What is the likelihood you will get in a car accident today? Exceptionally rare is probably the answer. So why did you put on your seatbelt this morning? Because you know the consequences could be severe in the unlikely event you did get in an accident and weren’t wearing it. As you approach work, the likelihood of something going wrong is a consideration, but the severity of the consequences is paramount and should inform your defenses.

Second, “as an extra safeguard all invasive cancer diagnoses are now reviewed by a second pathologist.” In Human Performance terminology, this is a Peer Check, Second Check, or Independent Verification. It is a good step, but it falls woefully short. So they looked around the hospital, and “invasive cancer diagnoses” is the only area where an unlikely error could lead to devastating consequences? According to Johns Hopkins, 250,000 people die in U.S. hospitals each year from medical errors, making them the third leading cause of death. Maybe we want to look around a little more and see where the wrong “likelihood vs. consequence” mindset is having an impact.

Encourage your team to consider the severity of the consequences if something were to happen, not just the likelihood that it will. You wore your seatbelt this morning even though you didn’t need it. Treat your PPE and other safeguards the same way: “I probably won’t need it, but when something unlikely happens, I’ll be glad it’s there.”

Toolbox Talks offers quick insights and thoughts to use for your toolbox (tailboard) talks. Dave Sowers is a founding member of Knowledge Vine, a veteran-owned human performance training and consulting organization that strives to reduce the frequency and severity of human errors in the workplace. He has almost 30 years of experience in power generation and the utility industry. He is a veteran of the U.S. Navy Nuclear Power Program and holds a bachelor’s degree in resources management as well as master’s degrees in management and in emergency management and homeland security.