Why Warning Signs Don’t Prevent “Never Events” in Operating Rooms

tl;dr: High-profile “never events” aren't caused by careless clinicians, and they aren't prevented by warning signs. Signs that tell people to “be careful” substitute reminders for real system design, error-proofing, and a culture that makes it easy to do the right thing every time.

Medical errors are not a laughing matter. Harm (or death) resulting from preventable process problems is far more common than most people might think.

This headline caught my eye the other day:

Man circumcised in hospital mix-up gets £20k payout

A 70-year-old NHS patient in England was supposed to have an injection into his bladder, but was circumcised by mistake.

This is a good example of dry British “stiff upper lip” discussion:

However, there was a mix-up and afterwards Mr Brazier said he was told “sorry, but we've circumcised you”.

The 70-year-old said after the procedure he was left waiting for two hours until they broke the news and he just replied “Oh, have you?”

The hospital apologized (which is very British of them, even if things had gone well) but there's really no excuse for this type of so-called “never event” to happen.

Never Events Are System Failures, Not Rare Accidents

Here are the three most frequently reported sentinel events for 2017, according to The Joint Commission:

1. Unintended retention of a foreign body — 116 reported
2. Fall — 114
3. Wrong-patient, wrong-site, wrong-procedure — 95

That doesn't mean it only happened 95 times… it was just reported 95 times. Underreporting of errors is a known problem in healthcare.

Why Many Errors Are Never Reported — and Why That Matters

I've heard countless stories over the years that go like this (paraphrasing):

“We were supposed to operate on the right foot, but we mistakenly cut into the left (which was supposed to be done later)… so we did the left and then did the right, and told the patient they got two for the price of one.”

Sometimes, the consent is changed and they ask the patient to sign after the fact. That's not how things are supposed to work… that's a cover-up. So, we know not all errors get reported, unfortunately. Hiding problems means we can't solve problems.

The patient was stunned, as quoted in this article:

“What I couldn't believe though was that I wasn't the only person that had been victim to a mistake (never event) in Leicester.”

The article doesn't say how the man was incorrectly circumcised. There are supposed to be process checks along the way, including bar-coded wrist bands, identification checks, and surgeons confirming with the patient which procedure is supposed to be done. Wrong-patient (or wrong-site or wrong-side) errors are preventable, if we have good processes and a culture of “safety first.”
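Those process checks are, at their core, a series of match conditions that must all pass before a procedure starts. As a purely illustrative sketch (the function, field names, and data structures here are hypothetical, not any real hospital system's API), the logic looks like this:

```python
# Hypothetical sketch of a pre-procedure verification check: the scanned
# wristband, the signed consent, and the OR schedule must all agree on
# patient, procedure, and site. Names and fields are illustrative only.

def verify_before_procedure(wristband_id: str, consent: dict, schedule: dict) -> bool:
    """Return True only if every identity and procedure check matches."""
    checks = [
        wristband_id == consent["patient_id"],   # right patient on consent
        wristband_id == schedule["patient_id"],  # right patient on schedule
        consent["procedure"] == schedule["procedure"],  # right procedure
        consent["site"] == schedule["site"],            # right site/side
    ]
    return all(checks)

# A mismatch between consent and schedule should stop the case:
consent = {"patient_id": "P-1001", "procedure": "bladder injection", "site": "bladder"}
schedule = {"patient_id": "P-1001", "procedure": "circumcision", "site": "penis"}
assert verify_before_procedure("P-1001", consent, schedule) is False
```

The point isn't the code; it's that each check must be designed into the workflow so a mismatch physically stops the case, rather than relying on someone remembering to look.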

Was the surgeon or the O.R. team rushing for some reason? Were they prioritizing schedule and speed over quality and accuracy? We need to detect that “unsafe condition” before there's a bad event.

This article talks about a cause:

“…staff at Leicester Royal Infirmary mixed up his notes.”

It doesn't say HOW the notes were mixed up. What could have allowed that to occur? Why weren't there better process controls and error proofing in place?

And not to blame the patient (who had had the correct procedure done three times before):

“Brazier told the Star he was so distracted chatting to nurses that he did not realise he was getting a different procedure until it was too late.”

At the bottom of the article about the circumcision error, there were links to related articles:

They're all shocking… but “wrong patient” and “blood sample mix up” are errors that I've sadly heard of before.

But “hospital posters”? What?

Again, while medical errors are NOT funny, I have often made this case when talking about the fallacy of “warning signs” being effective:

“If warning signs were really that effective, every operating room would have a sign like this…”

I didn't think signs like that existed. They'd be insulting to the surgeons, but that's not the point… errors are usually due to process problems or communication problems, not “bad surgeons.”

What does 2026 AI think these silly signs would look like in the real world?

A close-up photograph of a handmade sign taped to a white tiled wall in a hospital operating room. The sign, made of white paper, has large black text that reads, "ATTENTION SURGEONS: PLEASE DO NOT CUT THE WRONG PATIENT." Below it, in smaller handwritten text, it says, "(SERIOUSLY. WE'RE TIRED OF APOLOGIZING.)". In the blurred background, medical personnel in green scrubs are visible near surgical equipment and bright overhead lights.

If one sign is good, then why not more?

Then I saw the following link…

Oxfordshire NHS patient mix-ups prompt hospital posters

Again, wuh? What the what?

Oxford University Hospitals NHS trust introduced the signs in staff areas at the John Radcliffe and Churchill following four “never events”.

They included the wrong person receiving an endoscopy, where a camera is inserted into a patient's throat.

Four of them. That's why I generally call them “so-called never events” — and I'm being serious, not funny.

Why “Be More Careful” Is Not a Safety Strategy

Do “reminders” really work? Do signs that ask or demand that people “be more careful” really help? I have an entire blog series on this topic.

“The posters are part of a programme of work to ensure that patients receive the safest care possible.”

If there's truly an effective programme, then signs aren't necessary.

A spokesman added that while the current posters are internal, the trust is working on versions for public areas.

Do more signs mean more safety? Do BIGGER signs lead to fewer errors? Good grief.

I hope they don't put up public signs, because that would be a very embarrassing public admission that they don't know how to prevent errors… patients deserve better.

Can you imagine seeing a sign like that in a waiting room while you're there waiting for your loved one?

Poka-Yoke, Not Posters: Why Reminders Fail and System Design Works

If warning signs were enough to prevent harm, healthcare would already be error-free. The reality is that signs, reminders, and slogans all depend on perfect human attention in imperfect conditions. Operating rooms are complex, fast-paced environments filled with interruptions, handoffs, and time pressure. That's exactly where reliance on memory and vigilance is weakest.

Poka-yoke, or mistake-proofing, is the Lean alternative. Instead of telling people to be more careful, it asks a better question:

How can we design the process so the error is difficult or impossible to make?

That might mean forcing functions that prevent the wrong procedure from being selected, physical or digital constraints that require confirmation before proceeding, or standardized workflows that make deviations obvious immediately, not after harm occurs.
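A forcing function can be sketched in software terms: the next step simply cannot run until the verification step has completed. This is a minimal illustration of the concept (the class and method names are my own invention, not any real clinical system):

```python
# Minimal sketch of a "forcing function": starting the procedure is
# impossible by design until the surgical time-out has been confirmed.
# Purely illustrative; a real forcing function lives in the clinical
# workflow (checklists, locked trays, EHR hard stops), not in code.

class ProcedureWorkflow:
    def __init__(self):
        self._timeout_done = False

    def surgical_timeout(self, team_confirmed: bool) -> None:
        """The whole team confirms patient, procedure, and site."""
        if not team_confirmed:
            raise RuntimeError("Time-out not confirmed; case cannot proceed")
        self._timeout_done = True

    def start_procedure(self) -> str:
        # The forcing function: skipping the time-out is not an option.
        if not self._timeout_done:
            raise RuntimeError("Surgical time-out has not been performed")
        return "procedure started"
```

Contrast this with a poster on the wall: the poster hopes someone remembers the time-out; the forcing function makes the procedure literally unable to start without it.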

True poka-yoke respects clinicians by acknowledging human fallibility. It doesn't assume bad intent, carelessness, or incompetence. It assumes good people working in systems that need to be better designed. When organizations respond to “never events” with posters, they're effectively saying, “Try harder next time.” When they respond with poka-yoke, they're saying, “Let's fix the system so this can't happen again.”

If a process truly makes the right action the easy action, and the wrong action hard or impossible, then reminders become unnecessary. Safety isn't achieved by more signs on the wall. It's achieved by building reliability into the work itself.


If you’re working to build a culture where people feel safe to speak up, solve problems, and improve every day, I’d be glad to help. Let’s talk about how to strengthen Psychological Safety and Continuous Improvement in your organization.

Mark Graban
Mark Graban is an internationally recognized consultant, author, professional speaker, and podcaster with experience in healthcare, manufacturing, and startups. Mark's latest book is The Mistakes That Make Us: Cultivating a Culture of Learning and Innovation, a recipient of the Shingo Publication Award. He is also the author of Measures of Success: React Less, Lead Better, Improve More, Lean Hospitals, and Healthcare Kaizen, and the anthology Practicing Lean, previous Shingo recipients. Mark is also a Senior Advisor to the technology company KaiNexus.
