Beyond Blame: How Punishing Healthcare Workers Fails to Prevent Medication Errors

A nurse makes a medication error. A patient is harmed. The nurse already feels terrible — they went into healthcare to help people, and now the worst has happened.

Then comes the organizational response. The investigation. The suspension. Maybe termination. In the most extreme cases, criminal prosecution.

And here's what nobody seems to ask afterward: did any of that make the next patient safer?

When “Accountability” Really Means Punishment

In healthcare, when leaders say “we'll hold them accountable,” what that too often means in practice is “we'll punish an individual for mistakes that have systemic causes.”

Consider the case of RaDonda Vaught, a nurse who was criminally prosecuted and convicted for a medication error. The organization and its leaders were not held to a similar standard. The system that made the error possible — the confusing drug cabinet interface, the workflow pressures, the process design — largely escaped scrutiny while one person absorbed nearly all of the consequences.

What happened to her was unjust. And I don't think it made anyone safer.

There's something deeply satisfying about finding the person responsible. It feels decisive. It feels like you've addressed the problem. Everybody can move on.

But satisfaction and effectiveness are different things. Punishing one nurse doesn't change the drug cabinet interface. It doesn't fix the workflow that allowed the error. It doesn't redesign the process so the next nurse — who will be just as human, just as fallible — is less likely to make the same mistake.

What it does do is teach every other nurse in the building a very clear lesson: if something goes wrong, protect yourself first.

The Data We Don't Have (And the Data We Do)

I'll put a challenge out there: can anybody show me data, either locally or at a national level, that proves punishment reduces medication errors and patient harm?

I've never seen it.

What I have seen is data showing that reported events go down after high-profile punishments. And reported events going down is not the same as actual events going down. A culture of fear and punishment teaches people to hide mistakes when they can — and to especially hide near misses, which are the very signals that could prevent the next serious harm.

Dr. Lucian Leape, a pioneer in patient safety, put it clearly: approaches that focus on punishing individuals instead of changing systems give people strong incentives to report only those errors they cannot hide. A punitive approach shuts off the information needed to identify faulty systems and create safer ones.

That's the trap. Punishment feels like action, but it actually makes the organization blind to the very problems it needs to see.

Why Does This Keep Happening?

Dr. David Mayer has an explanation I find persuasive: blame is the easy way out for hospitals. If you can blame a nurse for something that any nurse might have done, or blame a physician for a mistake any physician might have made in the same situation, you've “solved” the problem in your mind. But it's not solved.

As I've written and taught: “human error” is not a root cause. It's a starting point. The real questions come after: Why was that error possible? What in the system allowed it? Would another person, in the same situation, likely have done the same thing?

If the answer to that last question is yes — and it usually is — then you have a systemic problem. Replacing the person doesn't fix it. It just resets the clock until it happens again.

At Virginia Mason Medical Center, a patient died after a technician used the wrong antiseptic solution. It was a clear liquid, easy to confuse with another clear liquid. The systemic fix? Virginia Mason changed from a clear liquid antiseptic to a colored gel with a different applicator. That made the error physically harder to repeat, regardless of who was doing the work. That's what prevention looks like.

Reported Errors Need to Go Up Before They Go Down

This is the part that's hardest for many leaders to accept: if you're doing this right, your error reports should increase before they decrease.

Dr. John Toussaint tells a story in his book On the Mend about a gemba walk where a nurse admitted, point blank, that her unit wasn't recording medication errors. The reason? It took almost four minutes to navigate through multiple computer screens to submit a report. And there was fear that reporting would bring punishment.

Toussaint didn't yell. He didn't write up the nurse. He thanked her for being honest, then contacted the information systems department to simplify the reporting process. That's servant leadership.

The fact that the nurse felt safe enough to say it out loud was a sign of a maturing culture. In most hospitals, that admission would be a career risk. And so the errors keep happening, unreported, unexamined, unresolved.

As I like to put it: we need to shift from discipline and punishment to disciplined problem solving. The words sound similar. The results are very different.

What Good Systems Actually Look Like

W. Edwards Deming taught that 94% of problems belong to the system. The Institute of Medicine's landmark report “To Err Is Human” reached a similar conclusion: the majority of medical errors are caused by faulty systems, processes, and conditions that lead people to make mistakes or fail to prevent them. This is not a “bad apple” problem.

Mistake-proofing — designing work so that errors are harder to make and easier to catch — is how you actually reduce harm. Not by telling nurses to be more careful.

Automated medication cabinets that open only the correct drawer for a specific patient. Barcoding systems that verify the right drug is going to the right person. Checklists that catch steps before they're missed. Software that flags an unusual dosage before it's administered.

These aren't perfect. No single safeguard is. But layered together, they create systems where human fallibility doesn't automatically become patient harm.
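
To make that layering concrete, here is a minimal sketch in Python. It is purely illustrative: the function names (verify_barcode, check_dose_range), the drug list, and the dose ranges are hypothetical, not any real hospital system, vendor API, or clinical reference.

```python
# Hypothetical sketch of layered medication safeguards.
# Drug names and dose ranges are illustrative only -- not clinical guidance.

TYPICAL_DOSE_RANGE_MG = {
    "metoprolol": (12.5, 200),  # illustrative range
    "warfarin": (1, 10),        # illustrative range
}

def verify_barcode(ordered_drug: str, scanned_drug: str) -> list[str]:
    """Barcode layer: does the scanned package match the order?"""
    if ordered_drug != scanned_drug:
        return [f"Scanned '{scanned_drug}' does not match ordered '{ordered_drug}'"]
    return []

def check_dose_range(drug: str, dose_mg: float) -> list[str]:
    """Dosage layer: flag doses outside a typical range for review."""
    low, high = TYPICAL_DOSE_RANGE_MG.get(drug, (None, None))
    if low is not None and not (low <= dose_mg <= high):
        return [f"Dose {dose_mg} mg of {drug} is outside the typical {low}-{high} mg range"]
    return []

def safety_alerts(ordered_drug: str, scanned_drug: str, dose_mg: float) -> list[str]:
    """Run every layer; any one of them can stop an error before it reaches the patient."""
    return verify_barcode(ordered_drug, scanned_drug) + check_dose_range(ordered_drug, dose_mg)

# Example: wrong package scanned AND an unusually high dose -- two chances to catch it.
for alert in safety_alerts("warfarin", "metoprolol", 50):
    print("ALERT:", alert)
```

The details don't matter; what matters is that each check is independent of how tired, rushed, or distracted the person happens to be in that moment.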

The question isn't whether people will make mistakes. They will. The question is whether the system catches those mistakes before they reach the patient — and whether the culture encourages people to speak up when something goes wrong so the system can learn and improve.

The Culture Makes Prevention Possible

All of the technical safeguards in the world won't help if people are afraid to surface problems. Fear doesn't prevent errors. It prevents reporting. And without reporting, there's no learning. Without learning, there's no prevention. You're stuck in a cycle where the same mistakes keep happening to different patients with different staff, and nobody connects the dots because nobody is talking about it.

Creating a culture where people feel safe to speak up about errors — and near misses especially — is the foundation that everything else rests on. That's psychological safety in action. It doesn't mean there are no consequences for reckless behavior. The “Just Culture” framework, which originated in aviation, provides a useful way to distinguish between honest human error, at-risk behavior, and truly reckless conduct. The vast majority of medication errors fall into the first category. They deserve a systemic response, not a punitive one.

And the organizations that get this right? They don't just have fewer errors. They have more engaged staff, less turnover, and better patient outcomes. The prevention and the culture reinforce each other.

Healthcare has a choice: keep punishing individuals for systemic problems, or start building the systems and cultures that actually prevent harm. The evidence for which approach works has been available for decades.

So why do so many hospitals still reach for blame first?

Mark Graban
Mark Graban is an internationally recognized consultant, author, professional speaker, and podcaster with experience in healthcare, manufacturing, and startups. Mark's latest book is The Mistakes That Make Us: Cultivating a Culture of Learning and Innovation, a recipient of the Shingo Publication Award. He is also the author of Measures of Success: React Less, Lead Better, Improve More, Lean Hospitals, and Healthcare Kaizen, as well as the anthology Practicing Lean, previous Shingo recipients. Mark is also a Senior Advisor to the technology company KaiNexus.
