A medical error was in the news this week: a teacher was mistakenly given an insulin shot instead of a flu vaccine. Thankfully, the teacher didn't even need to be taken to the hospital.
We're human, we make mistakes. In the Lean mindset, we recognize that and try to design systems that make it harder for errors to occur.
But what happened here? Reading the opinion piece I linked to, the response was predictable:
- Nurse placed on administrative leave (we must punish someone to show we, as management, have control of the situation)
- People are told to “be careful.”
The newspaper that wrote the opinion column expresses a few unfortunate nuggets of conventional wisdom — that errors are “bound to happen” (no, we can work to prevent them) and we should “be careful” (no, we should work to prevent them through systems improvement).
I'm sure the nurse feels horrible. He (yes, it was a he) surely didn't mean to make that mistake. How would things turn out differently if, instead of placing the nurse on leave, he were involved in identifying the root cause of the error? The vials look alike — how can we error-proof the process? What if the nurse were involved in solving the problem, to prevent it from happening again? Even if the nurse is fired, are other nurses likely to do a better job at “being careful?” The next nurse hired is human, too.
Instead, we blame, we punish, and we say “be careful.” No wonder we have such problems. Being careful helps, but it is not enough. I wish the U.S. Department of Health and Human Services had better recommendations than to be careful and to constantly inspect the healthcare work being done on you.