Toolbox Talks: ‘Human error’ and the downing of the Ukrainian airplane

Iran has acknowledged that its armed forces "unintentionally" shot down the Ukrainian jetliner that crashed earlier this month, killing all 176 aboard. (The Associated Press/Ebrahim Noroozi)

About a month ago, we published an article titled Using Safety as a Tool or a Weapon? Here's an overview: When misapplied, tools intended to help us, like safety rules and Human Performance processes, can be used as weapons to assign blame without fixing the underlying issues.

When we weaponize these things, the post-accident talking points become "Aha! The worker violated our policy. Nothing to fix; we need more compliance. Let's punish the worker so everyone else knows we're serious." Never mind that the policy is unclear or difficult to navigate. We blame the person and leave the system unfixed.

The information in the news about the Iranian downing of the Ukrainian airplane points to the weaponizing of the term "human error." I will grant you that international politics have a way of skewing what is communicated and what is held close to the vest. Also, our lens into this is the media, which may be oversimplifying things to keep stories under a certain word count.

Politics aside, this is not going to be a perfect root cause analysis. It is going to look at the broad idea of calling an event "human error" and assuming that label gives you cover to ignore the system issues.

Here is the latest information on the tragedy. The government in Tehran has stated that the accident was the result of "human error" and a "disastrous mistake." The plane was traveling near a sensitive military base; however, "the plane was flying in its normal direction without any error and everybody was doing their job correctly," according to Gen. Amir Ali Hajizadeh, commander of the Islamic Revolutionary Guard Corps' aerospace division.

A member of Iran's air defense forces saw the commercial airplane and had 10 seconds to decide whether the plane was a threat. It is thought there was a disruption in communication between the missile operator and the command-and-control network, though officially this has not yet been confirmed. Missile operators had been put on high alert because retaliation was expected for the missiles Iran had launched at U.S. interests in Iraq a few hours earlier.

So now the official word is that "human error" caused this tragedy.

I imagine that whoever pushed that launch button is miserable with grief and terrified of what is coming, since all fingers seem to be pointing their way. The system has found its perpetrator to punish and all the cover it needs to go no further. A system that makes it hard for an operator to readily tell the difference between commercial airliners and military threats.

A system that may or may not have left this poor operator without communications and with nowhere to turn for help. A system that said, "Be on high alert, and you have 10 seconds to make the most important decision you will ever make." Because of this short amount of time, and the fact that they seem to be zeroing in on one person, I'm guessing they have a system that allows missiles to be launched by a single person: no second checks, no confirmations, no concurrence from a supervisor, no way to abort an errant launch.

This excerpt comes from a New York Times article about the aftermath of the mistake and the ensuing misinformation: Mohamad Saeed Ahadian, a conservative analyst in Iran, said on Twitter: "There are two major problems with the Ukrainian Airlines issue. One is firing at an airplane and two is firing at the public's trust. The first can be justified, but the latter is a mistake with absolutely no justification." The first can be justified. Let that sink in.

"Human error" is certainly not a justification, and it is not even a full explanation. Yes, people do make mistakes—it's human nature. But those mistakes are allowed to happen, and even encouraged, by broken systems. "Human error" is the action that tips the first domino in a series of faulty choices and processes.

The human is usually the trigger, not the root cause. "Human error" gets weaponized when it becomes the shorthand, look-no-further explanation for what happened. If your organization is willing to chalk an accident up to "human error," then it is missing the bigger picture. The person in your organization being accused of the "error" is likely horrified by what they did and terrified of what is about to happen to them, as all fingers point their way. They were put in a tough spot and weren't able to overcome all of the challenges in the way of their success, but they should hardly bear all of the blame.

If an accident or error occurs, ask yourself the following question: Could an equally qualified person, put in the same position, make the same mistake? If the answer is yes, then you have a system problem, not a person problem.

Based on what we know, which is admittedly limited information, do you think another missile operator could have made the same mistake? Probably. Should the stated reason for the tragedy be "human error"? Probably not, because the system failed the human, and it would likely fail any human put in that situation.

Don’t accept “human error” as the sole cause of an accident, whether it’s in the news or in your organization.

Toolbox Talks offers quick insights and thoughts to use for your toolbox (tailboard) talks. Dave Sowers is a founding member of Knowledge Vine, a veteran-owned human performance training and consulting organization that strives to reduce the frequency and severity of human errors in the workplace. He has almost 30 years of experience in power generation and the utility industry. He is a veteran of the U.S. Navy Nuclear Power Program and holds a bachelor's degree in resources management and master's degrees in management and in emergency management and homeland security.