Albert Einstein — the mind who redefined physics — once called something he did “the biggest blunder of my life.”
When he completed his general theory of relativity, the math pointed to something extraordinary: the universe could not be static; it had to be expanding or contracting. But in the early 20th century, the prevailing belief was that the universe was fixed and eternal.
Instead of trusting where his own work led him, Einstein adjusted his equations to match the consensus. He inserted a mathematical “fudge factor” — the “cosmological constant” — to force his results into alignment with what “everyone knew.”
Years later, astronomer Edwin Hubble's observations confirmed the universe was indeed expanding, making Einstein's adjustment unnecessary and, in hindsight, a missed opportunity.
Why tell this story here? Because it illustrates a universal challenge in improvement work — whether in physics, healthcare, manufacturing, or software:
- Even brilliant people can be swayed by prevailing beliefs.
- Data sometimes tells us something unexpected — and uncomfortable.
- The pressure to conform can lead us to dismiss or distort evidence.
Lean Lessons from Einstein's Blunder
In Lean, we talk about the importance of going to the “gemba” (the workplace), observing reality, and letting data guide decisions. But that's easier said than done, especially when the facts challenge a long-standing narrative.
Einstein's situation mirrors what I've seen in organizations:
- A team runs a pilot test that produces better results than the old method — but leadership resists change because “we've always done it this way.”
- A Process Behavior Chart shows a stable process with no meaningful improvement — but leaders still react to every up and down as if it's a crisis.
- A front-line staff member identifies a problem — but is told to “stay in your lane” instead of being encouraged to investigate further.
In all these cases, the risk is the same: we bend reality to fit our assumptions instead of letting reality reshape our thinking.
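To make the Process Behavior Chart point concrete: the chart draws "natural process limits" around a metric so that routine ups and downs (noise) are not treated as crises, while points outside the limits are genuine signals worth investigating. Here is a minimal, hypothetical sketch using the standard XmR method and its 2.66 scaling factor; the data and function names are invented for illustration, not taken from the article.

```python
def xmr_limits(values):
    """Compute natural process limits for an XmR chart.

    Uses the standard XmR formula: mean of the individual values
    plus/minus 2.66 times the average moving range.
    """
    mean = sum(values) / len(values)
    # Moving ranges: absolute differences between consecutive points
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    upper = mean + 2.66 * avg_mr
    lower = mean - 2.66 * avg_mr
    return lower, mean, upper

def signals(values):
    """Return indices of points outside the natural process limits.

    Only applies the simplest detection rule: a point beyond the
    limits is a signal; everything inside is routine variation.
    """
    lower, _, upper = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lower or v > upper]

# Hypothetical weekly metric: mostly noise, with one real shift at week 8
data = [52, 48, 50, 53, 47, 51, 49, 80, 50, 52]
print(signals(data))  # prints [7]
```

The point of the sketch: most of the week-to-week wiggle stays inside the limits, so reacting to each data point would be reacting to noise; only the one point outside the limits deserves a root-cause investigation.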
How This Connects to Toyota Kata
Toyota Kata is built around the idea that improvement is a scientific process:
- Set a target condition.
- Experiment toward it.
- Learn from what happens.
We can do the same when we frame improvement work as Kaizen or PDSA (Plan-Do-Study-Adjust) cycles.
If you already know exactly what will happen, it's not an experiment — it's an implementation. In true experimentation, some predictions will be wrong, and that's OK. The value comes from learning why the result was different than expected.
Einstein's equations gave him a surprising result. In a Kata mindset, that surprising data point would have been the most valuable part of the process — a signal to investigate further, not to smooth things over.
Challenging the Status Quo Requires Psychological Safety
Einstein had the math. Many teams have their own “math” — solid data and direct observations. But speaking up about it, especially when it contradicts conventional wisdom, takes courage. In Lean terms, it also takes psychological safety: the shared belief that it's safe to raise concerns, ask questions, and challenge current thinking without fear of ridicule or punishment.
Without that safety, people are more likely to self-censor or — like Einstein in this case — alter their conclusions to fit the dominant view.
The Real Blunder? Not Learning
If Einstein had trusted his equations and his own brilliant reasoning, he could have predicted the expansion of the universe more than a decade before Hubble observed it. Instead, his “blunder” became one of science's most famous cautionary tales, reminding generations of scientists to be bolder when the evidence points in a surprising direction.
In Lean, the greater mistake isn't getting something wrong — it's refusing to learn from it. The key is to notice when our reasoning is bending under cultural or political pressure and to reflect on how we can handle it differently next time.
A Challenge for Leaders
Whether you're leading an improvement project, making operational decisions, or setting strategic direction:
- Let the data speak — even when it whispers something unpopular.
- Invite respectful dissent — ask, “What are we missing?” and mean it.
- Model learning from mistakes — show that revising your position when presented with new evidence is a strength, not a weakness.
- Run true experiments — expect surprises, and treat them as learning opportunities.
If the father of relativity could misjudge the nature of the universe, there's no shame in us discovering we've been wrong about our own corner of it. The real error is ignoring the evidence that could help us improve.
What about you?
Have you ever run an experiment — in work or life — that gave you surprising or even disappointing results… but ended up teaching you something valuable? Share your story in the comments. Your example might be the nudge someone else needs to trust their data, follow the evidence, and keep improving.
Connect with me on LinkedIn.
Let’s build a culture of continuous improvement and psychological safety—together. If you're a leader aiming for lasting change (not just more projects), I help organizations:
- Engage people at all levels in sustainable improvement
- Shift from fear of mistakes to learning from them
- Apply Lean thinking in practical, people-centered ways
Interested in coaching or a keynote talk? Let’s talk.
Join me for a Lean Healthcare Accelerator Trip to Japan!
