Preventable medical mistakes remain one of the most serious challenges in healthcare, yet many hospitals still struggle with learning from medical errors and preventing them from happening again.
From emergency departments to pathology labs, the same types of errors continue to harm patients, often not because of bad people, but because organizations fail to learn, improve, and redesign broken systems.
Why Preventable Medical Errors Still Happen in Hospitals
Working in healthcare, it's hard not to find stories about preventable errors. I keep hearing stories from friends and family – such as my relative who was recently mixed up with another hospital patient, something that was discovered AFTER she was given an X-ray and CT scan that she didn't need. A more distant relative died after knee replacement surgery: she was told, “Just keep taking your meds as you always do,” and her sleeping pill interacted badly with the remnants of her anesthesia, leading to heart failure, brain damage, and death in her 60s.
I have a queue of news stories from the past few months… I guess I'll go ahead and combine them into a post that focuses on the theme of hospitals learning from mistakes and preventing future occurrences (or, sadly, not doing so).
The Rory Staunton Case: Emergency Department and Lab Failures
Most recent in the news was the case of Rory Staunton, a 12-year-old boy who died after being sent home from the emergency department at NYU Langone Medical Center (as covered in the NY Times and this ProPublica story). He died from septic shock after:
The hospital's emergency room sent Rory Staunton home in March and then failed to notify his doctor or family of lab results showing he was suffering from a raging infection.
This is arguably a mistake in clinical judgment, if the docs didn't detect signs of dangerous sepsis. I mean, Rory had just cut his arm diving for a basketball in the school's gym. The E.D. thought he was just dehydrated and had an upset stomach, so they gave him fluids and a pain reliever and sent him home. In retrospect, some say, Rory's vital signs should have raised warnings that he was more seriously ill. Lab results came back hours after Rory and his family went home:
Three hours later, when the Stauntons were at home, the hospital's laboratory reported that Rory was producing vast quantities of cells that combat bacterial infection, a warning that sepsis could be on the horizon.
Again, Rory Staunton died. Some argue that this set of circumstances just couldn't have been avoided or that clinical care can be complex. The countermeasures, though, suggest there was more the hospital could have done. In the aftermath, the hospital promised changes, as discussed in this New York Times story.
- Emergency physicians and nurses will be “immediately notified of certain lab results suggestive of serious infection, such as elevated band counts” (I thought critical lab values were already immediately communicated to the MD)
- A new checklist will ensure that a doctor and nurse have conducted “a final review of all critical lab results and patient vital signs” before a patient leaves (as with any checklist, this will only be helpful if it is used)
- If a clinically relevant test result is only available after the patient is discharged from the E.D., the patient will be called, and the information will be shared with the referring physician
A hospital spokeswoman issued a statement:
“Keeping our patients safe is our first priority, and we want to prevent this situation from happening again.”
Easier said than done.
This tragedy wasn't caused by a single bad decision, but by breakdowns in communication, system design, and leadership responsibility: patterns that still define many preventable medical errors today.
Will Hospitals Actually Learn From Medical Mistakes — Or Repeat Preventable Errors?
OK, so how do we keep this from happening again anywhere else? Some are optimistic (from the Times):
Drawing lessons from what happened to Rory will require an exploration of all the factors that influenced his care, said Dr. Paul Spirn, a radiologist with Beth Israel Deaconess Medical Center in Boston. Among examples he mentioned were the extent of communication between the hospital and Rory's pediatrician, and the procedures for alerting doctors and patients to urgent lab results after a patient has left the emergency room. “The Staunton case has the potential to yield far-reaching improvements in patient care,” Dr. Spirn said.
Others, such as the writers of the ProPublica piece, question whether this spread of knowledge and widespread process improvement will really happen.
As veteran health reporters, we wish we could tell you that this case will spur changes in emergency rooms across the nation, that never again will a hospital make such an avoidable mistake. But, sadly, decades of experience covering such incidents suggest the medical system may prove resistant to change. Forget about every hospital rewriting its procedures. History suggests it would be a victory if NYU Langone manages to follow its own new rules as we all hope they will.
Will the NYU Langone checklists be mocked and cast aside by physicians and nurses? That might be understandable if the checklist were forced on them from on high. Toyota's Taiichi Ohno always taught that “standardized work” (in the Lean parlance… checklists are an example) must be developed by the people who do the work. Dr. Atul Gawande teaches the same idea in his book The Checklist Manifesto: How to Get Things Right.
Will the checklists and protocols be cast aside when things get really busy and hectic? Or if a particular doc thinks they know better? It remains to be seen.
The ProPublica writers are a bit cynical, but it's understandable.
When Healthcare Fails to Learn, Patients Pay the Price Again
Cedars-Sinai didn't learn from the series of preventable mistakes that killed three babies in an Indianapolis hospital in 2006, and the same error led to the 2007 overdose of the Quaid twins (which they survived, thankfully).
The ProPublica piece has examples of other organizations failing to learn and repeating the same mistakes.
Pathology Errors Keep Repeating — A Preventable Medical Error Problem
Many, many, many hospitals fail to learn from preventable errors in anatomic pathology labs, where specimens are mixed up due to poor processes, including the “batching” of work. I've written about this a lot:
- Being Careful Isn't Enough, Particularly in Pathology (I warned of this risk in 2006)
- This Will Happen Again, Unless… (a mix-up that made the news in 2007)
- Another Pathology Mishap (in 2008)
- Yet Again – A Patient Harmed as Hospital Lab Mixes Up Specimens (a case in 2009)
- Pathology Mistakes (Again) on Oprah and in the News (in 2009)
And it's not like I blog about every single error that occurs.
There was a recent batch of stories like this over the past few months in New Zealand and Canada (this isn't just an American problem).
And a flurry of articles about this case in Windsor, Ontario:
- CEO of Windsor hospital says human error to blame for unnecessary breast surgery
- Windsor Regional Hospital pathologist under supervision after error
- Windsor hospital CEO takes responsibility for surgery mistake
These stories aren't tales of poorly trained individuals. They reveal system issues, such as people being overworked (not enough staffing or time… or too much waste in their processes and days).
In New Zealand, a pathologist says they were “set up to fail”:
“It was one of those cases where I was rushed,” Beer said. “When you are interpreting pathology, you shouldn't be rushed.”… Engels' case, where her tissue samples were placed in the wrong processing cassette, was a classic example of rushed staff making mistakes, Beer said.
How Targets, Quotas, and Fear Create Unsafe Care and Medical Errors
A 5-day target, if held as a strict quota (as in “meet that target or else”), will inevitably lead to cutting corners, much as British hospitals cut corners and gamed the system to hit 4-hour A&E targets and 8-minute ambulance targets. Targets and fear-based management will lead to errors… but that management system is rarely blamed. We are far too quick to blame individuals rather than look at the system, leadership, and culture.
Leadership Responsibility: Who Is Accountable for Patient Harm?
In the Windsor case, the hospital CEO David Musyj (who butted heads with me here in the past) says he is “ultimately responsible” for the mistake, not the pathologist who switched the specimens nor the surgeon who did the surgery correctly (based on the bad information sent from the lab… the SECOND time she was burned by a path lab).
Conceptually, I appreciate what CEO Musyj is saying. He buys into the Paul O'Neill school of safety leadership that says everything that happens on the ship, in the factory, or in the company is the responsibility of leadership — as Dr. W. Edwards Deming said, top management is responsible for the system.
From Blame to Learning: What Must Change to Prevent the Next Error
So, realistically, what does it mean that Musyj takes responsibility? The pathologist involved took two weeks of paid leave. I'm not a big fan of firing people for making mistakes — but I don't think Musyj punished himself, took unpaid leave, etc. Has a CEO, COO, or CMO ever been fired for a preventable medical error that occurred under their watch? Frontline staff get fired all the time (or even jailed) — but again, I don't think that's the answer.
I believe strongly that punishment, blame, firings, and jailings won't lead to improvement that can prevent other errors. So what ARE we going to do? If I had a biopsy that was suspected of cancer, would I be able to trace that specimen through the process with my own eyes? I'd sure want to…
Why This Still Matters in 2026
More than a decade after many of these stories first made headlines, the underlying problems haven't gone away. In fact, they've become more dangerous. Hospitals in 2026 are operating under intense pressure: chronic staffing shortages, rising patient acuity, burnout among clinicians, tighter financial constraints, and growing regulatory and public scrutiny. In that environment, systems that rely on heroics, workarounds, or “being more careful” are far more likely to fail.
What still determines whether preventable harm occurs isn't technology, checklists, or policies alone; it's leadership and culture. When leaders treat safety as a priority only after a tragedy, when targets and quotas create fear, or when individuals are blamed instead of systems being improved, the same mistakes repeat. Conversely, organizations that consistently reduce harm are those where leaders take responsibility for the system, invest in better process design, eliminate fear, and make it safe for people to speak up before patients are harmed.
In 2026, respect for people is no longer just a Lean principle; it's a competitive necessity. Healthcare organizations are competing for scarce talent, public trust, and financial sustainability. Leaders who fail to learn from mistakes don't just risk the next adverse event; they risk losing their workforce, their credibility, and their ability to provide safe care at all.
If you’re working to build a culture where people feel safe to speak up, solve problems, and improve every day, I’d be glad to help. Let’s talk about how to strengthen Psychological Safety and Continuous Improvement in your organization.

Comments
Thanks for compiling the list. “Preventable” is the key.
Interesting blog post from a pediatrician on the Rory Staunton case:
LINK
We need to keep bringing awareness to medical errors. I was given heart surgery by mistake. Watch the Shannon Koob Story on YouTube:
http://www.youtube.com/watch?v=-AjnGowZH0A
More pathology mix-ups – specimen identification errors (this time in New Zealand).
These are preventable if you have a good process:
http://www.stuff.co.nz/taranaki-daily-news/news/7637978/Human-error-in-cancer-botch-ups