TL;DR: Numbers don't make decisions; people do. Metrics that look “good” or “bad” can be dangerously misleading when leaders ignore context, system limits, and natural variation. The Chernobyl disaster shows what happens when incomplete data, fear of speaking up, and rigid mental models override reality at the front lines.
For executives, the lesson is clear: data must be paired with context, curiosity, and psychological safety. Ask what a number really means, what it can't measure, and whether people feel safe reporting bad news. Without that, “data-driven leadership” becomes guess-driven leadership, with costly consequences.
I really enjoyed the HBO miniseries “Chernobyl” that aired/streamed recently. You can watch it all now through HBO if you have access (or you can get an HBO free trial through Amazon).
Why Data Has No Meaning Without Context
One of my favorite and most meaningful quotes from Don Wheeler (as I've shared in my book Measures of Success) is:
“No data have meaning apart from their context.”
How Context Changes the Meaning of Metrics
There are many applications of this concept. One is that we shouldn't overreact to every up and down in a metric. If we report that a metric is down 22%, that might sound bad. But if we draw a run chart or a “Process Behavior Chart” that shows more data and more context, we might learn that the metric routinely goes up and down by 15%, 20%, or 25% in a typical month. That context matters.
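To make that concrete, here's a minimal sketch of how a Process Behavior Chart separates noise from signal, using Wheeler's XmR approach (natural process limits at the mean plus or minus 2.66 times the average moving range). The monthly values below are made up for illustration; the last point is roughly a 21% drop from the prior month, yet it still falls inside the limits:

```python
# Sketch: Process Behavior Chart (XmR) limits for a monthly metric,
# so routine ups and downs ("noise") aren't mistaken for signals.
# The data are invented for illustration only.

values = [100, 112, 95, 108, 91, 104, 118, 99, 78]  # last point: ~21% drop

mean = sum(values) / len(values)

# Average moving range: mean of absolute month-to-month differences.
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Wheeler's Natural Process Limits: mean +/- 2.66 * average moving range.
upper = mean + 2.66 * avg_mr
lower = mean - 2.66 * avg_mr

for month, v in enumerate(values, start=1):
    label = "SIGNAL" if (v > upper or v < lower) else "noise"
    print(f"Month {month}: {v} -> {label}")
```

Run against this made-up data, every point lands between the limits, so even the 21% drop is routine variation rather than a signal worth reacting to.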
Chernobyl as a Case Study in Context-Free Data
In the first episode of “Chernobyl,” there's a gripping scene where data is taken out of context to suit an agenda (or it's done out of denial) — bad language warning — don't blare this at work:
In a shorter clip from the scene, the deputy chief engineer, Dyatlov, is told that the radiation reading is:
“3.6 roentgen, but that's as high as the meter…”
“Not Great, Not Terrible”: When Limits Become Lies
He's cut off. 3.6 is as high as the meter goes. But Dyatlov, who has already said “RBMK reactors can't explode” (one just did), is clearly in denial, and he's processing everything through that lens.
He says:
“3.6… not great, not terrible.”
The actual number must have been higher, of course.
Dyatlov then reports the faulty number to the higher-ups. Is he lying, or just blinded by his bias and mental models? The number on the dosimeter WAS 3.6… but Dyatlov doesn't share the context that it's as high as the meter will go. So the situation seems less dire than it really is (although it looks more dire to those who can see the reactor burning from a distance).
This whole Chernobyl disaster is a situation where “going to the gemba” (the actual place) is quite deadly. But we need to be careful that we're not misled when relying on data alone.
He tells someone higher up:
“I'm told the number is 3.6 roentgen per hour”
and is told:
“Well that's not great, but that's not horrifying.”
Higher-ups are told that things are “well under control” when that was clearly not the case.
Not long after, they're told the number was 200 roentgen. The response:
“Another faulty meter.”
An employee who went to the gemba tells Dyatlov that he saw graphite in the rubble (graphite that could only have come from the reactor core). Dyatlov replies,
“No you didn't… because IT'S NOT THERE!”
He says this from a conference room.
Denial.
It's easy to deny things that fly in the face of your mental models.
Later, in a scene from Episode 2, a dosimeter that reads up to 15,000 roentgen arrives. When the senior military officer is told that “lead shielding might not be enough,” he says, “I'll do it myself” (a great example of Leaders Eat Last-style servant leadership):
“What does that number mean?” That's a question that tries to set context.
“That means the core is open… the fire is giving off more than twice the radiation of the bomb at Hiroshima… 40 bombs worth by now… it will not stop…”
The actual number was later estimated to be something more like 30,000 roentgen, a very deadly number.
The official death toll of 31 seems to be another example of “data without context.”
Fear, Power, and the Suppression of Bad News
The whole series illustrates what happens in a culture of fear — except it's not a company or a hospital, but a country. Everybody feared sending bad news upward to the central committee and Gorbachev, even with him being a reformer.
What Leaders Must Learn About Data and Context
The video and the Chernobyl story show that “data-driven decisions” can be terrible if the data is faulty or if context is missing.
I've shared this idea on Twitter more than once:
“Data-driven decisions” aren't necessarily GOOD decisions, especially if we make the mistake of reacting to data points or apparent trends that aren't statistically meaningful. We're better off when we learn to filter out “noise” so we can see real “signals” in our data.
— Mark Graban (@MarkGraban) June 4, 2019
And on LinkedIn:
What do you think about the need for more context with our data?
A 2026 Perspective: Context Is Still the Difference Between Insight and Illusion
In 2026, we have more data, faster dashboards, and more sophisticated analytics than ever before, yet the core risk hasn't changed. Numbers still get stripped of context. Bad news still gets softened as it moves upward. And leaders can still confuse confidence in data with understanding of reality.
The lesson from Chernobyl isn't about nuclear power or the Soviet system; it's about leadership under uncertainty. High-stakes decisions demand more than metrics. They require leaders who ask better questions, who go looking for disconfirming evidence, and who create environments where people can say, “This number doesn't tell the whole story.”
Data doesn't fail organizations. Context-free leadership does.