Professor John Grout: A Deep Dive on Mistake Proofing and Lean


Scroll down for how to subscribe, transcript, and more

My guest for Episode #462 of the Lean Blog Interviews Podcast is Professor John Grout, the former dean of the Campbell School of Business at Berry College in Rome, Georgia.

He was recently a guest on “My Favorite Mistake” — Episode 186, so I encourage you to check that out.

He's the current Chair of the Technology, Entrepreneurship, and Data Analytics Department and the David C. Garrett Jr. Professor of Business Administration. John has overseen the development, approval and implementation of Berry College's Creative Technologies program and Berry's makerspace, HackBerry Lab. 

Dr. Grout has researched mistake-proofing extensively and published numerous articles on the subject. In 2004, John received the Shingo Prize for his paper, “The Human Side of Mistake-Proofing,” written with Douglas Stewart. John has also consulted with a large variety of firms to mistake-proof their processes.

He's also published “Mistake-Proofing the Design of Health Care Processes,” a book that's freely available online.

His Website:

Today, we discuss topics and questions including:

  • Your origin story – how did you first get introduced to TPS, Lean, etc? Context of discovering mistake proofing?
  • Shingo's book on Poka Yoke
  • “Shingo was not kind to Statistical Quality Control”… use SQC and/or mistake proofing?
  • Acceptance sampling… keeps bad product out… maybe?
  • Field goals — Conformity to specs vs. closer to center?
  • Successive checks and self checks
  • Source inspections – Shingo's gold standard
  • Why should you react when a part's out of control but still in spec??
  • Do you HAVE to stop the line? Don't be dogmatic??
  • Statistics don't do well with rare events
  • Do we have data on how universal the “universal protocol” is?
  • Doctor signing vs. you signing the surgical site?
  • ZERO – “the only way to go” in terms of goals
  • The goal of “zero defects” can be controversial… is it possible? Motivating? Demoralizing?
  • Possible research – optimal time to stop doing final inspection??
  • Why is it easier to error proof now? Technology
  • “People don't like to own up to mistakes”
  • Naida Grunden episode on aviation safety
  • Can't error proof everything??
  • Preventing execution errors is easier than preventing decision errors
  • The balance and benefits of examples to copy vs. developing thinking?? “Catalog or catalyst”?? BOTH

The podcast is sponsored by Stiles Associates, now in their 30th year of business. They are the go-to Lean recruiting firm serving the manufacturing, private equity, and healthcare industries. Learn more.

This podcast is part of the #LeanCommunicators network

Video of the Episode:

Thanks for listening or watching!


Automated Transcript (Not Guaranteed to be Defect Free)

Announcer (1s):
Welcome to the Lean Blog podcast. Visit our website. Now, here's your host, Mark Graban.

Mark Graban (12s):
Hi, it's Mark Graban here. Welcome to episode 462 of the podcast. It's November 9th, 2022. Our guest today is John Grout. You'll learn more about him in a minute. He is a professor and an expert on mistakes and mistake proofing. So that's really the deep dive that we're taking here today. We'll talk about what kinds of mistakes are easier to mistake-proof and which are more difficult. We're gonna talk about how to create a culture of admitting mistakes, detecting mistakes, learning from mistakes. We'll talk about that and more. So to learn more about John, and to get a free ebook from him about mistake proofing in healthcare and more, look in the show notes.

Mark Graban (53s):
Well, hi everybody. Welcome to the podcast. Our guest today is Professor John Grout. He is the former dean of the Campbell School of Business at Berry College in Rome, Georgia, who was recently a guest on My Favorite Mistake. That was episode 186, so I encourage you to go check that out. You can hear John's favorite mistake story and our conversation there. So if you're not already listening to that, find My Favorite Mistake wherever you're listening to this podcast.

Mark Graban (1m 34s):
And so I think it's gonna be, you know, a different conversation here than we had in the last episode. A little bit of overlap, but I think a different conversation for a different audience. John and I collaborated on a Lean Enterprise Institute webinar that I hosted; I think it was in 2010. I'm still trying to find out if the recording of that is still available online. There are some broken links that haven't been mistake proofed, if you will. So I'm trying to follow up with LEI folks, and if I get that sorted out, I'll put a link in the show notes. So before I tell you a little bit more about John, let me say first off, you know, thank you for joining us.

Mark Graban (2m 14s):
How are you?

John Grout (2m 15s):
Hi, Mark. I'm doing great. I hope you're doing well also.

Mark Graban (2m 18s):
Yeah, I'm excited about the conversation here. I guess, you know, gosh, everything from that webinar in 2010, I'm sure it holds up, even if you or I or both of us barely remember it. That was so long ago.

John Grout (2m 33s):
Yeah, I actually think it's more relevant today and, more importantly, easier to implement today than it was back then.

Mark Graban (2m 42s):
So let's leave that as a teaser. We will come back to that question of whether mistake proofing concepts are easier to implement today, and how. So keep listening for that. But a little bit more about John Grout. He's currently the chair of the Technology, Entrepreneurship, and Data Analytics Department, and he is the David C. Garrett Jr. Professor of Business Administration. He's overseen the development, approval, and implementation of Berry College's Creative Technologies program and their makerspace, called HackBerry Lab. John has researched mistake proofing extensively. He's published numerous articles on mistake proofing. In 2004, he received the Shingo Prize for his paper, The Human Side of Mistake Proofing, which was authored with Douglas Stewart.

Mark Graban (3m 29s):
He's also consulted with a large variety of businesses to help mistake-proof their processes. And he's published a free ebook that's available online through AHRQ: Mistake-Proofing the Design of Health Care Processes. So I encourage you, if you're working in healthcare or otherwise, go grab a copy of that; it's still available online. And John, that healthcare ebook is still available, to your knowledge, right?

John Grout (3m 58s):
Yes, it is. It is absolutely still out on the web. And I also have a PDF copy on my hard drive if anyone can't get it any other way.

Mark Graban (4m 9s):
So I hope people will go check that out. And you know, there, there's probably a lot to offer for a reader that's, that's not working in healthcare in terms of looking for different ideas that help them think about their own mistake proofing. Would, would you agree with that?

John Grout (4m 23s):
I would agree with that. The framework is set up to use a whole bunch of existing quality management and reliability modeling tools to help you think through how to create mistake proofing devices in environments where you're not sure exactly what you should be doing. So I think one of the fallacies that we sometimes see is that we assume, well, the saying is that a problem well defined is half solved, and that's probably true, but there are those cases where you see something happening and you don't really have a good vocabulary for how to fix it.

John Grout (5m 5s):
And this book will provide that vocabulary.

Mark Graban (5m 8s):
Yeah. So I hope people will check that out. And again, there's a link in the show notes. So John, there's one question it's become a habit, I guess, recently of asking different guests: your origin story, if you will. How did you first learn about mistake proofing? Was it in a broader context of learning about lean or the Toyota Production System? I'm curious, what was your initial introduction, and what sparked, you know, a really deep focus on this?

John Grout (5m 43s):
So back in 1991, I was teaching at Southern Methodist University, and they were in the middle of the TQM furor, and they asked me to teach a quality course. Now, I'd had lots of statistical quality control training, both in my undergrad and grad programs, but I wanted to do a thorough read-through of everything that was out there. And as I was looking through it, it seems like Schonberger had said something. He was one of the very early writers on the lean side, and he had mentioned poka yoke.

John Grout (6m 27s):
And so I was interested in, you know, what it was, just to know whether it should go in the course or not. And I found it very difficult to find anything about poka yoke. And I finally stumbled on Shingo's book, ordered it, had it sent in, and read it. And Shingo's book is not kind to statistical quality control. Yeah. He kind of says statistical quality control has all these problems, and here are the issues. And I was about to teach this stuff, and so I had to sort out who was right. And I have done that to my satisfaction and have published academic journal articles based on that.

John Grout (7m 13s):
And so I now do a lot of mistake proofing, but I still think Shingo might have gotten a few things wrong.

Mark Graban (7m 21s):
So I was gonna ask you to elaborate on that. I mean, you know, first off, for people who might not know the terminology, how would you summarize, first off, statistical quality control?

John Grout (7m 34s):
So my view of statistical quality control is you have acceptance sampling on one side. And I think by and large people have figured out that that doesn't really help the core of the problem. All it does is keep bad product out of your factory, maybe. The other is statistical process control, which involves control charts and three-standard-deviation limits above and below the mean. And when you have unusual products, things that go outside those limits, you need to find out what's going on and take action. Now, in terms of Shingo's mistake, I think his main mistake was he inferred that every time you had a defect, you were out of control, or that a remedial action was required every time you have a defect.

John Grout (8m 23s):
And I think that statistics, the whole notion of process capability, is the idea that if you can't consistently kick your field goal through the goal posts, then you have to work on the variance. You can't just take one-off things and adjust based on individual defects. You have to manage the entire process. And if you have a process that's in control and you make adjustments based on individual defects, you may be making the process worse.
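
The distinction John is drawing here, limits computed from the process itself versus limits imposed by engineering, can be sketched in a few lines of Python. This is a minimal illustration with invented hole-diameter data; the 9.0 to 11.0 mm specification limits and the sample values are assumptions, not anything from the episode.

```python
import statistics

def control_limits(samples):
    """Shewhart-style limits: the process mean plus or minus three
    standard deviations, computed from the data itself."""
    mean = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return mean - 3 * sigma, mean + 3 * sigma

# Hypothetical hole diameters in mm; spec limits are 9.0 to 11.0.
diameters = [10.0, 10.1, 9.9, 10.2, 9.8, 10.0, 10.1, 9.9, 10.0, 10.05]
lcl, ucl = control_limits(diameters)

new_part = 10.6  # within spec, but is it within the control limits?
in_spec = 9.0 <= new_part <= 11.0
out_of_control = not (lcl <= new_part <= ucl)
print(f"LCL={lcl:.2f}, UCL={ucl:.2f}, in spec: {in_spec}, out of control: {out_of_control}")
```

With these numbers the 10.6 mm part passes the spec check yet sits above the upper control limit, which is exactly the "out of control but still in spec" scenario raised later in the episode.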

Mark Graban (9m 0s):
Tampering, if you will, was the term Deming and others used for over-adjusting a process. Exactly. Yeah.

John Grout (9m 7s):
And so Shingo didn't really focus on that aspect, and so he kind of blurred control limits and process limits, that is, excuse me, control limits and tolerance limits, or specification limits. Right, right. And so he was conflating the engineering limit of "this will function, this won't" with the statistical limit of "this is usual, this is unusual."

Mark Graban (9m 31s):
And I think we can keep diving into this topic: is mistake proofing a binary, in-spec or out-of-spec sort of determination, versus the idea, which I think would come from SQC and SPC, that being closer to the center line of your specifications and tightening the variation would be better? I think of physical mistake proofing of a product flowing down a line: if the product is too big, it bumps some sort of barrier and gets kicked to the side. It's a go/no-go gauge in that case, but that might not really be the best we can do from a quality perspective.
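
Mark's point, that a go/no-go gauge treats every in-spec part as equally good, is the motivation for capability indices like Cpk, which reward being centered and tight. A small sketch with invented sample data and assumed spec limits of 9.0 to 11.0:

```python
import statistics

def cpk(samples, lsl, usl):
    """Process capability index: distance from the process mean to the
    nearer spec limit, measured in units of three standard deviations."""
    mean = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Two hypothetical processes, each with a 100% go/no-go pass rate.
centered = [10.0, 10.1, 9.9, 10.05, 9.95]    # near the 10.0 target
drifted  = [10.7, 10.8, 10.6, 10.75, 10.65]  # in spec, hugging the limit
print(cpk(centered, 9.0, 11.0), cpk(drifted, 9.0, 11.0))
```

Both data sets would pass a go/no-go check every time, but the drifted process has a much lower Cpk, signaling far less margin before defects start appearing.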

John Grout (10m 20s):
Right. And so I think it's worthwhile to have defects not make it to the customer, because the cost of that is almost always way higher. And so Shingo had a variety of different inspection techniques. He talked about successive checks and self-checks. Successive checks are when a downstream operation runs your product through a go/no-go gauge; self-checks are when you do it yourself. And of course it's always better if you do it than if someone else does it. But then he had something called source inspections. And I think that's really his gold-standard concept, which is that you need to inspect the conditions that will lead to high quality and make sure those exist before you produce the product.

John Grout (11m 9s):
And it's also where it fits very nicely with statistical process control, because if you have an out-of-control point, it's saying something unusual has happened, and your job is to go into the process, understand it, explore it, do experiments, whatever's required to find out why you had that out-of-control point. Once you find out why, you take that precondition for good product and test that precondition, and a source inspection is how you do that.

Mark Graban (11m 43s):
And source inspection would be, in the case of a manufacturer, back at the supplier or on receipt into your factory?

John Grout (11m 52s):
So I think you're focusing on the product, and what I recommend is that we focus on the process. So we're going to look at a process and say, is the temperature right, or is the machine setup correct? Or, you know, if something comes out of calibration, is there a way for the device itself to say, "Yo, I'm out of calibration"? We want the process to talk to us, and source inspection is how we get it to do that. And typically it's done with very simple kinds of means, to make it obvious that something's going wrong before it causes a problem.

Mark Graban (12m 34s):
Yeah. You're dredging up memories from my first year, especially at General Motors in 1995: executives confusing the difference between the control limits on the SPC chart, which are calculated based on the process and how much variation there was, and which would then be part of the input into the process capability calculation. They would confuse specification limits with control limits, I think to great detriment. So, a very real scenario here; I'm curious how you would react or coach somebody on it.

Mark Graban (13m 15s):
So you've got somebody, a frontline production worker, doing their SPC checks, and they notice, hey, something has drifted above the upper limit. We should stop and do something. Management says, well, it's in spec, the part's good. Why the hell would we stop production?

John Grout (13m 37s):
I think the answer there is that you would stop production, or, you know, I am not dogmatic about stopping production. What I think you would do is go figure out what's going on with the machine, why it went above the limits, and try to resolve that issue without ever making a defect. So, you know, at some level, I know maybe the orthodoxy is you stop the machine, but if you can figure it out without stopping the machine, then, you know, it's sort of like internal and external setup. You would love to figure it out while the machine is running, but if you can't, I think there's a benefit to be had by stopping the machine long enough to see if you can figure out what that special cause is.

John Grout (14m 33s):
And then work out a system for inspecting for that special cause, as a source inspection, using poka yoke. Yeah.

Mark Graban (14m 43s):
And thank you. Yeah, that's a good nuance there. If the point is protecting the customer, and, you know, doing that investigation versus not doing the investigation. The process we were working on, I'm thinking of a machining line where the memories and technical details are either fuzzy or I never fully understood them, but a lot of times the debate was being framed in terms of: do we keep producing or not? I don't think that middle ground was being explored necessarily, and that could be a different problem.

John Grout (15m 22s):
You know, having knee-jerk reactions is so easy in life, and in some ways that's what statistical process control is all about. You know, don't adjust the process unless you have something that's out of control. We all want to say, oh, that's a defect, let's fix it. And what SPC says is, if it's just generic, normal process variation, fixing it does not involve adjusting it based on a single defect.

Mark Graban (16m 0s):
Yeah. Or adjusting it based on, I mean, I think the worst form of tampering, and this goes back to Deming and the funnel experiment, is saying, well, this hole in the engine block is larger than the center line of the spec, so let's tweak the machine and try to make the next one a little bit smaller. And then, you know, that really amplifies variation in a really bad way.
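
Deming's funnel experiment can be simulated to show how this "tweak the machine after every part" policy inflates variation. A quick sketch of rule 2 (compensate each result by its deviation from target); the noise level, sample size, and seed are arbitrary choices for illustration:

```python
import random
import statistics

def simulate(tamper, n=100_000, sigma=1.0, seed=1):
    """Produce n parts aimed at target 0 with process noise sigma.
    With tamper=True, apply funnel rule 2: after each part, shift the
    machine setting by the negative of that part's deviation."""
    rng = random.Random(seed)
    setting, results = 0.0, []
    for _ in range(n):
        x = setting + rng.gauss(0.0, sigma)
        results.append(x)
        if tamper:
            setting -= x  # "correct" toward the target after every part
    return results

hands_off = statistics.pvariance(simulate(tamper=False))
tampered = statistics.pvariance(simulate(tamper=True))
print(hands_off, tampered)  # tampering roughly doubles the variance
```

Leaving a stable process alone yields variance near sigma squared; adjusting after every part roughly doubles it, which is the amplification Mark describes.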

Mark Graban (16m 27s):
But think back to field goals for a minute. I could bring in a friend of mine who does a lot of Lean Six Sigma work; he was a guest on My Favorite Mistake, Brion Hurley. He was a placekicker at the University of Iowa on their football team.

John Grout (16m 44s):
So he, he understands process capability very well.

Mark Graban (16m 48s):
Well this is why I would be curious to, to pull him into the discussion because a field goal anywhere between those goal posts, even if it hits the goal post and still goes through, is worth the same three points.

Mark Graban (17m 3s):
So there's that go/no-go, good/not-good kind of binary: the result of the kick was in spec or not. But I'd be curious, back to your point about source inspection, to see if this analogy is a way of thinking through it. Source inspection on the kicker might look at their mechanics, like the leg angle or different things, saying, well, if the leg angle is more consistent, then the kicks are more likely to be good. Is that...

John Grout (17m 34s):
Yeah. And so in some ways, I don't know, some of the things that you would call a source inspection don't look a lot like an inspection. So for example, if you have a kicker, and he's practiced and practiced, and what he knows is that he lines up his shot, he steps back three steps, but then he'll step sideways two steps. The two steps sideways are his effort to get the right angle of approach to get the ball to go where he wants it to go. And so he's built into the process a way of determining where the starting place ought to be.

John Grout (18m 17s):
From my perspective, that really borders on a source inspection. You know, if they said, "Oh, that's a source inspection," I would be hard pressed to say, "Oh no, it's not." I think it's really close. And so, you know, sometimes "inspection" is not a great word for it, because in some ways it's just good management, and it doesn't have to be sophisticated. It doesn't have to be based on anything computerized. It could be always taking two steps to the right. I put my parking ticket, when I go to the airport, in my wallet.

John Grout (18m 59s):
And the reason I do that is it means I can't leave my car without knowing I've got my wallet in my pocket. Right. And you know, you get into the airport, you don't have your wallet, you're not going anywhere.

Mark Graban (19m 9s):
Right. It's sort of like the passport in your bag. But no, I get your point, right? Yeah. Yeah.

John Grout (19m 16s):
It's procedural and so calling it an inspection probably is stretching things a little. And yet I think that's what Shingo has in mind.

Mark Graban (19m 23s):
It sounds more like maybe in TPS speak, going to the gemba, understanding your process and the connection between process and results.

John Grout (19m 33s):
Absolutely. And mistake proofing, and particularly source inspection, tends to be very, very idiosyncratic. It's specific to your particular process, which is why, you know, when people would pay me money to come and talk to them about mistake proofing, they would say, "Well, how do we mistake-proof this?" And I'd say, "Well, I don't know," because I don't have any of the detailed knowledge that's necessary to design a really good mistake proofing device. That's why I wrote a book like Mistake-Proofing the Design of Health Care Processes. It was because I wanted to give what I knew to them, because they had all the detailed knowledge to actually implement it.

Mark Graban (20m 17s):
Yeah. So let me come back to healthcare in a minute, because that's the more meaningful, life-changing application of this. But just to think about kicking for a minute: those two steps, there's probably this question of how consistent those steps are, right? Those two steps are not always the same. So how do you train yourself, even if the wind is blowing really hard, so that those two steps are the same size? That would probably be an indicator of quality.

John Grout (20m 53s):
Yeah. So that's kind of a meta question to me, you know, a question above the question. I think that, in their case, they don't get to take a measuring tape out on the field. No, no. Although I don't know that that's ever been tested, but...

Mark Graban (21m 13s):
Well, there was something that happened recently, I'm pretty sure it was an NFL game, about what was illegal and what was a penalty. It was a wet field, and they brought a towel out onto the field, and the holder was kind of trying to dry the part of the field where the kick was gonna be placed. And that was penalized. So a tape measure, if they even had the time, might also be considered illegal somehow. But they said the towel was interfering with the field; a tape measure isn't changing the field. So I don't know.

John Grout (21m 44s):
Yeah, that's a conjecture; I'm not sure it's actually been tried or not. But, you know, I think every time he practices, one of the things he's practicing is how wide those steps are. And hopefully he gets it to the place where it's muscle memory. And as long as he can get the game circumstances out of his head, which is the hard part of kicking anyway, he can take those two steps, he's ready to go, and he kicks it. Or it may be that the variability of his steps is small enough that it doesn't impact a kick, you know; his process capabilities are just fine, even if he takes a big step versus a small step, because the posts are quite a ways apart.

Mark Graban (22m 28s):
Yeah. Well, you talk about muscle memory. Boy, I've probably only tried kicking a football like that once, and it was not in a game setting; it was just being out on the field during a marching band rehearsal, because I was in the marching band for eight years in high school and college. So there's a point where you're talking about the distance of your steps. Traditional marching band, I won't spend too much time on this, I swear, is eight steps every five yards, but that's 22 and a half inches per step. And you want each of those steps to be the same because of the quality in the, you know, the dimension of quality of how the band looks when you're marching, how

John Grout (23m 13s):
Straight, Yeah. Everything lines up

Mark Graban (23m 15s):
are your lines. And the ideal would be that every step is 22 and a half inches, so a line is continually looking perfectly aligned. But that quality check is that you can kind of peek down, though you're not supposed to move your head down; there's another, you know, visual thing. But if the ball of your foot is hitting that yard line, which is thankfully marked off every five yards, you can gauge how you're doing. Now, you don't wanna be taking a bunch of small steps and make up for it with one big step at the end, but that muscle memory actually becomes quite good, where you can test it by closing your eyes and marching 10 yards. And boy, if your right foot ended up right on that yard line, then you've got it dialed in.

John Grout (23m 58s):
It's remarkable what the human body can do very consistently. And yet that's also the cause of the problem with mistake proofing. One of the reasons mistake proofing is hard is because people don't make mistakes very often, but when they do, it can be catastrophic. And so it puts it outside the realm of normal statistics, because statistics are not good at rare events. And that's what human mistakes are all about: the rare event. You know, human beings, as a process, are very reliable. One in a thousand, one in ten thousand; we get things right all the time.

John Grout (24m 40s):
It's just that one in a thousand times that's a problem. And so that's why, when you design your mistake proofing device, you are thinking about those kinds of issues, and triggering when those have happened, and stopping the process right then.
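
John's "one in a thousand" point compounds quickly across repeated opportunities, which is one reason rare errors are so hard to manage statistically. A back-of-the-envelope sketch, assuming independent opportunities with a constant error rate (the rates and counts are illustrative, not from the episode):

```python
# Chance of at least one error across n independent opportunities,
# each with a 1-in-1,000 error rate: 1 - (1 - p) ** n.
p = 1 / 1000
at_least_one = {n: 1 - (1 - p) ** n for n in (100, 1_000, 10_000)}
for n, prob in at_least_one.items():
    print(f"{n:>6} opportunities -> {prob:.1%} chance of at least one error")
```

At a 1-in-1,000 rate, a thousand opportunities already give roughly a 63% chance of at least one error, even though any single attempt is extremely reliable.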

Mark Graban (25m 1s):
Yeah. So, okay, I'm gonna put the football field behind us now. Let's think a little bit more about the operating room, even before we get into definitions here. So the thing about healthcare and rare events: let's say you're the patient, and the surgeon operates on the wrong side or the wrong site. I don't know the exact numbers, but that happens very rarely, and we're not gonna fall back on "well, on average they're getting it right," because the impact when they don't is huge.

John Grout (25m 29s):
Nor is it subject to any kind of normal statistical process control.

Mark Graban (25m 35s):
Right, right. There's not a bell curve of how wrong they are. It's either right or wrong.

Mark Graban (25m 44s):
So do you have thoughts or experiences around, and I realize this is probably a very complex thing to try to error-proof, but if the problem statement is that rare mistake of wrong side or wrong site, what would part of your thought process be about how to try to error-proof that?

John Grout (26m 6s):
Well, I think that a lot has been done using something that they call the universal protocol, which is to sign your site. So if you're having surgery on your left arm, they should be explaining the procedure to you well enough and then asking you to write your name where the cut's going to occur. I've also heard people say that they write "no" on the other arm. That doesn't help at all. And the reason it doesn't help at all is because then, if they drape you for an appendectomy, you're out of luck. Right? Because, you know, it presumes that you're on one of the two arms, and sometimes the wrong site is an entirely wrong

Mark Graban (26m 49s):
Surgery, right? Yeah. It's not a bilateral difference. It's the wrong surgery.

John Grout (26m 54s):
And over time, if the surgeon knows I'm only gonna cut if I'm cutting through the person's signature, that's a pretty decent signal. And when you add to that the timeout where the entire surgical team stops for a minute and says, Okay, let's all agree about what we're doing here. We're operating on the left arm below the elbow and we're doing this procedure, here's how it's gonna work, here's about how long it's gonna last. And you know, even things like, do they introduce each other around the room so that everybody knows everybody else or at least has some sense of who they are and then they all look at the site and there's a signature there.

John Grout (27m 40s):
You know, that's a step in the right direction. Is it a perfect mistake proofing device? I have yet to figure out a perfect mistake proofing device for that particular application. Yeah.

Mark Graban (27m 53s):
It seems like a lot of that is dependent on our ability, or our willingness, which is probably the better word, to follow the quote-unquote universal protocol.

Mark Graban (28m 7s):
Do we have data on how often that's really occurring or not occurring?

John Grout (28m 12s):
There probably is data, but I don't have that data.

Mark Graban (28m 15s):
I mean, my little bit of exposure to operating rooms is, I have a question: where's that data coming from? Yeah. It's certainly not being universally tracked in a really reliable-data sort of way.

John Grout (28m 28s):
In fact, my wife was in for some surgery, and I asked the nurse about a timeout procedure, and she says, "Oh, they always clean the room thoroughly between operations." And I'm like, that is not the answer, right? And the anesthesiologist: in this particular case it was internal surgery, female surgery, so there was no kind of external body incision going on. And so I said, "So how does the universal protocol work in a case like this?" And I forget what he said, but it was like he totally whiffed on the answer. He just came nowhere close to understanding that I was asking: in a case where the site isn't an external incision, what do you do then to make sure you're doing the right site, the right operation?

John Grout (29m 24s):
And he had no idea. One of my students in an evening MBA class was the chief operating officer of this hospital. So I sat down and typed out an email in the waiting room and sent it to him, and said, I just want this date- and time-stamped: you've got these issues, and it's making me nervous. Yeah. Now, the surgery went off fine, no problems. But, you know, my view was, I wanna be on record that, yeah, this is not what I had in mind. And I ended up on the quality board for that hospital years later, and I recounted the story.

John Grout (30m 7s):
And they're like, "Oh, you are the guy." And to their credit, their quality culture was such that that story still remained in active memory for the people I was talking to, like five or ten years later. Yeah. And that I admire.

Mark Graban (30m 24s):
And, I mean, you touch on culture and behavior, and there are, I'm sure, elements of psychology. This becomes a fairly complex thing, as opposed to the physical size of a part that we've been cutting back in a factory. So, you know, kind of a recap, and I've seen it go both ways. I had a friend of mine, somebody I used to work with, send a picture. She was a Six Sigma Black Belt. Her husband had a broken collarbone, and they sent a picture where they had written "no" on the non-broken collarbone. Well, it was painfully obvious, painful to him; like, you could obviously visually see that's the broken one.

Mark Graban (31m 5s):
And they thought, well, okay, thank you for following the protocol, because you don't wanna be in a situation of, like, "well, we don't need it this time." But to your point, you could write "no" on an almost infinite number of body parts instead of marking the one spot. Right?

John Grout (31m 22s):
Right, right. And at the start of all of this, one in four orthopedic surgeons would have a wrong-site surgery during their career. So that's rare, but not nearly rare enough.
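A quick back-of-the-envelope sketch of why "rare per procedure" can still mean one in four careers. The career volume below is an assumption for illustration, not a figure from the conversation:

```python
# Illustrative only: back out an implied per-procedure wrong-site rate
# from the one-in-four career risk John cites.
career_risk = 0.25      # "one in four ... during their career"
procedures = 10_000     # assumed procedures per career (hypothetical)

# P(at least one event in n tries) = 1 - (1 - p)^n
# so the implied per-procedure rate is p = 1 - (1 - risk)^(1/n).
per_procedure = 1 - (1 - career_risk) ** (1 / procedures)

print(f"{per_procedure:.2e}")  # on the order of a few per hundred thousand
```

A rate measured in parts per hundred thousand looks negligible on any one day, which is exactly why the career-level number surprises people.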

Mark Graban (31m 37s):
Right. And, you know, again, soapbox minute on the universal protocol: I think of it as the so-called universal protocol. Like, universal is the ideal, and,

John Grout (31m 51s):
And it ought to be done every time.

Mark Graban (31m 52s):
It ought to be, and hopefully the right form of it, because there would be variation around you signing the spot versus the doctor signing it. Now, if I'm having back surgery, I probably can't literally write on my lower back, but I could see where they would introduce opportunities for error if the doctor is signing it. And let's say you've already been given some medication that makes you drowsy, right? If the X-ray's been flipped and they marked the left arm and you're barely aware of it, that could lead to a

John Grout (32m 20s):
Mistake. Absolutely. And that would not be the universal protocol, right? And in some ways, far better to have a loved one. Like you would say to your wife, Okay, I can point to it, I just can't write my name there.

John Grout (32m 32s):
Put it right here. To me that would be better.

Mark Graban (32m 37s):
Yeah. So, you know, the universal protocol is supposed to help prevent one form. There's this terminology that, again, I think is sort of ideal or optimistic: never events. I definitely refer to these as so-called never events, because they happen. Never is not, I mean, that's the goal. It's not the reality,

John Grout (32m 58s):
Right? It's like zero defects.

Mark Graban (33m 2s):
So tell me, and for those who are listening and not watching on YouTube, I'm quite intentionally wearing a hat that says ZERO. That is a reference to zero harm from the Patient Safety Movement Foundation. John noticed this, of course, when we got on. So I was going to ask you your thoughts on this idea of aiming for zero, talking about zero.

John Grout (33m 23s):
I think it's the only way to go. I think it makes perfect sense to aim for zero, and in all of these cases, you know, if a customer gets a defect, that's a problem. One defect is a problem, and so zero should be the target. Now, getting there, it all depends what environment you're working in and what kind of knobs and tools and dials you can turn to make things better. Back in the old days, I guess it was Crosby who said that quality is free. And his argument was that the cost of preventing defects was always less than the cost of the defects.

John Grout (34m 10s):
And I think through the years people have just kind of stopped arguing about it.

Mark Graban (34m 17s):
I argued with, well, not just I, some of my classmates in grad school at MIT who had a background in manufacturing, we actually did end up arguing with a microeconomics professor who was still teaching this idea of, you know, optimal quality levels and these assumed tradeoffs, that better quality costs more and at some point it ain't worth it. And we're like, but that trade-off isn't true. And I remember he got to a point where he was like, Okay, you all need to shut up so I can finish my lecture. It was a little more polite than that. We did talk about it after the fact, because we were hoping to educate him from what we were seeing in industry.

Mark Graban (34m 58s):
And he was still pretty upset that we were kind of derailing the lecture.

John Grout (35m 2s):
Well, I think the issue is that if you look at it from a statistical perspective, if you are on a normal curve, and if it really is infinite tails, it's hard to get to zero when it's an infinite tail. It's also hard if you've got that infinite tail driving the prevention and appraisal part of the old concept of cost of quality. But then you look at mistake-proofing and you say, okay, can we virtually eliminate a defect at a finite cost, a cost that's reasonable, that's not too expensive?

John Grout (35m 43s):
And the answer is, of course you can. And we've got lots of examples where people have done things and the errors have gone away. Like, until you have a different process, that error is not going to occur again. And it's done using, you know, a pin or a piece of steel sticking up or a little sensor. Now, with the sensor you do add on some appraisal costs, because you've got to keep it calibrated, but as long as you keep it calibrated, it's going to do what it's supposed to do. And so, you know, I'm of a mixed mind on zero defects as a kind of theoretical matter, but as a practical everyday matter, it's clearly the right target.

Mark Graban (36m 28s):
And I think in relatively simple error-proofing applications, you really can reach zero defects. Like a simple go/no-go gauge as parts are flowing down a line: you could say, unless the error-proofing device failed somehow, that would be perfect mistake-proofing. But then you get into more complex systems. And coming back to Toyota, as much as they talk about quality at the source, and going back to Shingo and a long, long practice of mistake-proofing, in 2019, the last opportunity I had to go to a Toyota plant in Japan, they have final inspection. And that might be surprising to some people: well, wait, why? I thought inspection was waste.

Mark Graban (37m 15s):
Like if we're gonna be dogmatic and Toyota I think is decidedly not dogmatic on things like this.

John Grout (37m 21s):
Yeah, I've often thought that that would be an interesting research paper to write: when is the optimal time to stop inspecting? It seems to me a challenging question to answer. You have to make some presumptions about the process that are difficult to make.

Mark Graban (37m 43s):
Because you could keep doing final inspection well beyond the time it's necessary. But, and this is a story I've told before, I think, in other episodes: 1995, General Motors. There was a design intent for the engine factory that quality was going to be built in, and they decided to eliminate the hot test they used to do at the end of engine assembly: fire up the engine, make sure it's working fine. And there was sort of this dogmatic view of, well, we don't need that anymore because we're building quality in. So then, unfortunately, the first hot test was happening after the engine was installed in a Cadillac, at the end of the Cadillac assembly line. They would go to fire up the engine to drive it off the line, and, going back to the engine block machining, if there was some part where a hole was too big, now this engine has literally got black smoke coming out of it.

Mark Graban (38m 42s):
All the potentially defective engines between there and all the way back to machining, it could have been like 2,000 bad parts and hundreds of assembled engines, a hundred already installed in cars. That was the mistake of ending inspection too soon.

John Grout (39m 0s):
And it's a presumption that entropy doesn't happen, and entropy is one of those laws you can't get away from. I've got another example of that. Frito-Lay, this would've been back in the nineties, so it's a long time ago, but they had their tortilla chip production down to a science, and they were doing everything associated with the science of getting that chip right. And they started losing blind taste tests to Eagle brand, which was a brand back then. And Eagle brand was winning the blind taste tests.

John Grout (39m 47s):
Now, if it was branded, people still liked Frito-Lay better, but if it was unbranded and they couldn't tell, they liked the Eagle brand better. And Frito-Lay, to their credit, said, you know, we can't have this, this is not acceptable. And so they realized that no one was tasting the chips, and they found out that you need to have a person on the line producing chips who is tasting the chips. So it's that final inspection. And so they hired an artist, and they made a model chip that was a little raw, one that was just right, one that was cooked a little too much. They were interested in the amount of blister on the chip.

John Grout (40m 28s):
You know, it was amazing stuff. We would never care about that, but they cared about it, and that's why the chips turned out good. And so they would hold the chip up and look at it and say, which model does it look like? And then they would eat the chip, and then they had a bag of gold-standard chips, and they would eat a gold-standard chip and say, does it taste like that? And it's like, oh yeah, those tasted about the same. And they had a whole procedure to keep a gold-standard bag in stock.

John Grout (41m 3s):
Because you had to rotate it through, you couldn't let it sit there for two weeks because then it wasn't a gold standard chip.

Mark Graban (41m 8s):
I was gonna say it's stale now. Yeah, yeah. So someone deciding what's the gold standard and how accurate are they in deciding that?

John Grout (41m 15s):
Yeah. And so, through time, if that gold standard gets messed up, yeah, you could have problems. But, you know, they did their best, and at some level it still depends on someone eating the chip and going, oh yeah, that tastes okay. Yeah. And I'm sure there are people with more refined palates than mine who could do that well.

Mark Graban (41m 35s):
And that's where it comes back to how do we define quality? We could be looking at process measures or variables that really don't matter to the customer. We might think we're really dialing it in on quality in a way that is really pretty meaningless to the business.

John Grout (41m 51s):
Yeah. So Frito-Lay, they wanted 15% broken chips or less. You know, once my kid picks up the bag off the shelf and jams it in the cart, that metric is out the window.

Mark Graban (42m 4s):
I mean, it sounds like the old story, that may or may not be true, of the Japanese supplier getting an order from the American automaker, and the punchline of the story in this case would be: here's your bag full of broken chips that you wanted, you know?

John Grout (42m 17s):
Yeah, yeah, yeah. We separated out the 15%, they're here, the rest of 'em are fine. Yeah. Right. And that goes back to a name, that is really going back to Taguchi, and the whole idea that any variance is worse, you know, that any amount of variance should be reduced. And I haven't heard his name in forever, but I think his idea is still true: that if you can get variance as low as possible, you're better off, even if it functions just as well with higher variance.

Mark Graban (42m 57s):
This is the Taguchi loss function. And if Taguchi invented football, and I've actually drawn a chart like this once, a Taguchi football game nowadays would have some sort of laser measurement, and the points on a field goal would be determined based on how close it was to the center of the goalposts. You would get maybe four points for a kick that was perfectly centered, all the way down to like a half point if you barely missed the upright.
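The contrast being drawn here can be written down directly: conformance scoring is a step function (inside the posts or not), while a Taguchi-style quadratic loss grows continuously with distance from the target. The goalpost half-width and loss constant below are invented for illustration:

```python
def conformance_score(x: float, half_width: float = 10.0) -> int:
    """Goal-post scoring: full credit anywhere inside the uprights, zero outside."""
    return 3 if abs(x) <= half_width else 0

def taguchi_loss(x: float, k: float = 0.04, target: float = 0.0) -> float:
    """Quadratic loss L(x) = k * (x - target)^2: zero only at dead center."""
    return k * (x - target) ** 2

# A kick just inside the upright "conforms" but carries nearly the same
# loss as one just outside: the cliff at the spec limit is artificial.
print(conformance_score(9.9), round(taguchi_loss(9.9), 2))
print(conformance_score(10.1), round(taguchi_loss(10.1), 2))
```

The Taguchi view says the two kicks are nearly equally bad, which is exactly what the binary conformance score hides.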

John Grout (43m 22s):
Yeah. You could have little posts sticking up, and whichever one you knocked off as it went through would

Mark Graban (43m 28s):
Yeah. I jumped to technology. There could be a bunch of pool noodles or something going across,

John Grout (43m 35s):
Which is instructive in terms of mistake-proofing, because it's so easy to think you've got to have a laser to do it, and you don't. Sometimes it's a pin on a die that keeps you from putting the part in backwards. You know, some of the best mistake-proofing I've ever seen are these devices that are $25 or less. And in Shingo's book, of course, nowadays that would be a $50 part because of inflation. But about a quarter of his ideas were 25 bucks or less, and fully half of them were a hundred dollars or less back then. So that would be $200 now, $250. But would anybody pay $250 to eliminate a mode of defect in their factory?

John Grout (44m 18s):
I'm guessing lots of people would sign up for that.

Mark Graban (44m 22s):
So then, you mentioned earlier, let's come back to the thing that we teased early on, where you were saying, compared to 2010, it's easier to mistake-proof things today. Is that because of technology, or why is that?

John Grout (44m 34s):
Oh yeah, it's absolutely because of technology. And I was actually thinking more like, you know, the heyday for mistake-proofing was like '95 to 2000. That's when there were conferences about poka-yoke. There's no conference about poka-yoke anymore, and yet now more than ever it's the time to do it, because now you can get, so with the maker movement came this thing called an Arduino. It's a little programmable logic controller that costs 35 bucks, and once you've got it sorted out, you can buy an Arduino Mini for 10 bucks, and you can hook it to a limit switch, or to a light sensor, or to any number of different kinds of sensors, a Hall sensor that will do magnetic sensing.

John Grout (45m 24s):
And I've got undergrads in their first semester doing prototypes where they can do basic programmable-logic-controller mistake-proofing. And, you know, you couldn't do that in 2000. Back then, you had to figure out how to do it on an ABB or a Square D or one of those industrial programmable logic controllers that were hundreds of dollars. So now it's a 10-bucks thing. It's so easy.
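The Arduino pattern John describes is, logically, just an interlock: poll a sensor and gate the operation (or sound an alarm) on its state. Sketched here in Python rather than Arduino C++, with a boolean standing in for the limit-switch pin; all names are hypothetical:

```python
def part_seated(limit_switch_closed: bool) -> bool:
    """Interlock condition: the limit switch confirms the part is seated
    correctly (e.g. it cannot close if the part is in backwards)."""
    return limit_switch_closed

def cycle_machine(limit_switch_closed: bool) -> str:
    # Same structure as an Arduino loop(): read the input, gate the output.
    if part_seated(limit_switch_closed):
        return "CYCLE_OK"
    return "ALARM: part not seated"

print(cycle_machine(True))
print(cycle_machine(False))
```

On real hardware the boolean would come from something like a `digitalRead` of the switch pin, and the alarm branch would drive a buzzer or block the press from cycling.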

Mark Graban (45m 56s):
So it makes me wonder, back in 1999, 2000, when I was working at Dell Computer, one of the parts of the process is what they call the pick-to-light line, where, based on the order for those computers, people would pick the different parts, the hard drive and fan and different things, that would go into a kit that was then sent down to the assembly station. And I'm trying to remember exactly how it worked. Like, one version of pick-to-light would be lights that say, here are the ones you grab. I'm pretty sure they had some error-proofing where, if you tried to grab the wrong one, there would be some sort of indicator, a light or a buzzer or something.

John Grout (46m 34s):
Yeah. And there was a company called SpeasTech back in the day that had one that had a, kind of a, not a laser, a

Mark Graban (46m 45s):
A light curtain maybe.

John Grout (46m 46s):
Yeah, a light curtain, right?

John Grout (46m 48s):
And it would cycle around, and if you stuck your hand in the right spot, it would turn the light off. You stuck your hand in anywhere else, it would say that's the wrong spot, and a buzzer would go off. They had good luck with it. Their pick errors went from like 200 parts per million down to two. And by the way, that's the hardest mile in all of this: going from 200 parts per million, which is fantastic, to two, which is world class.
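To put those parts-per-million figures in everyday terms, here is a small sketch converting a mispick rate into "how often would a line see one?" The daily pick volume is an assumption for illustration:

```python
PICKS_PER_DAY = 10_000  # assumed daily line volume (hypothetical)

def chance_of_a_mispick(ppm: float, picks: int = PICKS_PER_DAY) -> float:
    """Probability of at least one mispick across a day's picks,
    given a per-pick error rate expressed in parts per million."""
    p = ppm / 1_000_000
    return 1 - (1 - p) ** picks

for rate in (200, 2):
    print(rate, "ppm ->", round(chance_of_a_mispick(rate), 3), "chance of >=1 mispick/day")
```

At 200 ppm a line this busy would almost certainly see a mispick most days; at 2 ppm it becomes a roughly once-in-several-weeks event, which is why that last mile matters so much.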

Mark Graban (47m 19s):
Yeah. So then I bet that technology would be a lot cheaper today is what you're saying.

John Grout (47m 24s):
Oh yeah. I've got undergrads who could do that all day long. Yeah.

Mark Graban (47m 28s):
But then I think of this, this is

John Grout (47m 30s):
So, to put this in perspective: yeah, I had an undergrad who, for his senior project, created a vision system to look at a box with fuses in it. You know, the automotive fuses with the number printed on top. He had five of those in a row that were going into, I think it was, don't quote me on this, a Kubota tractor. And so they had this wiring harness, and the fuses would get put into the wrong place. He created, on his own, a vision system to look at those, do optical character recognition, figure out whether they were in the right spot, and alert the operator if they were in the wrong place.

John Grout (48m 18s):
And it then took a picture, so that if the company ever came back to this vendor and said the fuses were in the wrong spot, you could go, here's the picture; they came out of the factory correct. And so he had both the quality at the source and nice audit documentation.
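Once OCR has produced strings, the checking step of a project like that reduces to comparing the recognized layout against the expected one, slot by slot. The fuse ratings and layout below are invented for illustration:

```python
def check_fuses(expected: list[str], recognized: list[str]) -> list[int]:
    """Return the slot indices where the recognized fuse rating does not
    match the harness spec, so the operator can be alerted per slot."""
    return [i for i, (want, got) in enumerate(zip(expected, recognized))
            if want != got]

spec = ["10", "15", "20", "10", "30"]   # hypothetical harness layout
seen = ["10", "20", "20", "10", "30"]   # hypothetical OCR output, one swap

bad_slots = check_fuses(spec, seen)
print(bad_slots)  # the mismatched slot(s); photo archived for the audit trail
```

The hard engineering is in the camera and OCR; the mistake-proofing logic itself is this trivial comparison, which is part of why the approach is now within reach of an undergrad.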

Mark Graban (48m 38s):
Yeah. And, you know, it's making me reflect on, going back to 1999 Dell Computer, the impact was pretty trivial if there was a mistake in the pick-to-light line, because then in the assembly station there was further error-proofing, where, if I remember right, pretty much everything was barcoded and everything was scanned against the order. If there was a part missing, or the wrong part, that would get caught, and that would lead to a little bit of inefficiency, but it would probably protect the customer. And sure, you would want to prevent the waste of having to go swap out, chase down the right part. Versus the lack of error-proofing in a process as critical as, let's say, gathering the right instruments for a surgery, where the problem might be discovered once the patient is already under anesthesia, when it's supposed to be detected in advance.

Mark Graban (49m 33s):
Right. The defect of, let's say, a missing instrument could be discovered at a point where it either delays a surgical procedure, which could have an impact on the patient depending on the situation; or they could find the error once the patient is under anesthesia, which is bad because now they're under anesthesia longer; or the error could be found once they've already been cut open, which is even worse. And, I hate to say it, there's a lack of error-proofing in that operation.

John Grout (50m 3s):
Oh, absolutely. And all of a sudden your inventory theory becomes part of your quality management system. And so of course the fix for that is to have prepackaged kits

Mark Graban (50m 20s):
And, and sometimes you, you can do that depending on, on the procedure,

John Grout (50m 24s):
But not all procedures are that way and not all doctors like what the kit has in

Mark Graban (50m 28s):
It. Well, and if the kit was perfect when it first comes in, the challenge then, at the end of the procedure, is: if you've opened up four trays, who's guaranteeing that the right instruments go back into the right tray? Part of the waste in the process of processing and then sterilizing those instruments is sorting out the mess that was created back upstream in the operating

John Grout (50m 49s):
Room. Well, yeah. And in many cases they're designed with disposable product.

Mark Graban (50m 56s):
And that's true sometimes too.

John Grout (50m 58s):
Of course. Then you've got the hazardous waste that comes from it. So, you know, all of these are tradeoffs and

Mark Graban (51m 6s):
There, and, you know, there are some other things that are hard to error-proof in that context, where one defect could be the instruments getting back to the operating room with what they politely call bioburden, a euphemism. I mean, it's supposed to be inspected; it's certainly not error-proofed. But, back to your idea of source inspection, it's kind of an interesting value stream in that it's circular. You could go back to look at the process, and I was part of a team that did this. Back in the operating room, if the instruments aren't properly sprayed with a foam, then the odds of properly cleaning and sterilizing them in the sterile processing department go down dramatically.

Mark Graban (51m 57s):
So it comes back to, like, hey, your process helps us help you. But how do you do those process checks, rather than just doing inspection at the end?

John Grout (52m 5s):
Yeah. Well, one of the things I want your listeners to understand is that mistake-proofing is not easy, and designing good mistake-proofing devices is, you know, the pinnacle of design: the idea that you make something that is essentially invisible until it's needed. You know, back in the day, when you had a three-and-a-half-inch disk and you put it in the machine correctly, no problem. You put it in upside down, it stops you halfway in. It only notifies you when something's wrong. The rest of the time, you may not even know it's there.

John Grout (52m 48s):
I have a table saw like that. It's a SawStop table saw, and until I touch it with my finger, it's going to be just fine. I'm not going to know that it's any different from any other saw. As soon as I touch the spinning blade with my finger, it snaps out of the way. I have a little cut on my finger, I put a bandage on it, and I'm done. And people said, oh, that's no good, particularly the other manufacturers; they didn't want to use the technology. And now it's the top-selling saw in the market.

Mark Graban (53m 22s):
I wonder if they were thinking, well, clearly that's user error, and you shouldn't put your finger or any body part anywhere near the saw blade, right? There's, back to some psychology, blaming the user, or blaming them for being human, because we all get inattentive or sloppy at times.

John Grout (53m 38s):
Well I think what was really going on was that the owner of SawStop, or the inventor of SawStop is a patent lawyer.

John Grout (53m 47s):
He had patented it and he was trying to license it. He didn't want to build saws at all, but no one else would license it. They could see that they were going to be paying him royalties for the rest of their lives, and they didn't want to go down that road. He didn't want to build saws, but he was essentially forced to. So he went and found a contract manufacturer, they built a saw, it sold like hotcakes, and to this day all of the other power-tool makers are like, yeah, no, don't do that. Yeah.

Mark Graban (54m 19s):
Is it like a competitive disadvantage to them, or,

John Grout (54m 22s):
And, you know, because he knows how to play the game, he went to the Consumer Product Safety Commission and tried to get rulings that would force everyone else to buy his module for their saws. And as far as I know, that didn't end up succeeding.

Mark Graban (54m 38s):
So, a quick detour, but you mentioned this product safety commission. We have a National Transportation Safety Board. There are people advocating that we need something they're calling a National Patient Safety Board. It's interesting that there's a societal or legal role, a place, for commissions or boards like that, but we don't have that in healthcare.

John Grout (55m 5s):
Yeah. And so one of the things that we have found out about mistakes is that people don't like to own up to them. And in particular they don't like to own up to them if they're going to be fired as a result or sanctioned or punished or what have you. And so there's this whole other realm called just culture. And just culture is where you figure out a way to do a full and thorough investigation without people feeling threatened. And yet you want accountability if someone's been egregiously

Mark Graban (55m 42s):
Willfully harmful.

John Grout (55m 43s):
Right. You know, those people should be held accountable. And so sorting out how to draw that line is something there's been a lot of talk about. Now, the National Transportation Safety Board does that, because if you see another plane doing something unsafe, or you are in a plane as a pilot who's doing something unsafe, you can report it. I'm trying to remember how that works, but there's an offshoot of NASA, and I think it may be, I think it may be the NTSB.

Mark Graban (56m 20s):
And, but there's this non-punitive reporting,

John Grout (56m 22s):
Right. So there's no-fault reporting and there's fault reporting. So the FAA will come after you, but the NTSB won't, or something like that. I'm not sure I've got the agencies correct there. Sure. But the idea is that you can get a full report of near misses and actual events, and learn from those, without having people take actions that are against their own self-interest. Sure.

Mark Graban (56m 50s):
And for people who are interested: actually, back in Episode 112, I had to look this up, I talked to Naida Grunden, who had done a lot of lean work and written a book about lean in healthcare. Her husband was a pilot; they were personal friends with Captain Sullenberger. And she was a big advocate for the need for the kind of reporting mechanisms and learning and safety culture that are sometimes brought from aviation. But think back to Sully in that plane, and his co-pilot and all the passengers: you can't really error-proof against a bird strike, unless you somehow have a design that's, I don't know, magically robust against that.

John Grout (57m 33s):
Right. And so the moral of the story there is that you cannot mistake-proof everything. And that goes back to the psychology of errors. Things that are slips, where your intent is correct and your execution is flawed, you can mistake-proof against those. But anything that takes deliberation or judgment becomes very difficult to mistake-proof. I'm constantly having people say, oh, we need to mistake-proof that. And it's like, yeah, I can't help you. Yeah.

Mark Graban (58m 3s):
And I think, yeah, when you talk about near misses and that opportunity: Paul O'Neill, when he was CEO at Alcoa, certainly advocated for zero employee harm. And part of that was the recognition, and creating the culture, that made it safe, if not required, for people to speak up about risks, near misses, minor injuries. That was the pathway to preventing fatalities.

John Grout (58m 31s):
Yeah. And that's pretty much all cultural kinds of things. And what we see is that any place where you have that culture, it really helps. So, you know, all of the work on crew resource management in aviation has been very helpful in terms of reducing problems. You know, once a warning light goes off or an alarm sounds in a plane cockpit, the chances of fixing it properly were about 50-50, or at least that's what it was when they started the process. And a lot of times, everyone was so busy working the problem, no one flew the plane.

Mark Graban (59m 7s):
Cognitive and cultural issues there.

John Grout (59m 11s):
Yeah. And so nowadays, when a warning light goes off, someone flies the plane, and that's all they do. The other people work the issues. And that's why Jeff Skiles was busy, yes, going through checklists on the Miracle on the Hudson

Mark Graban (59m 30s):
As the co-pilot

John Grout (59m 31s):
And Sully was flying the plane.

Mark Graban (59m 33s):
And I'm pretty sure in the audio recordings you hear this procedural, you know, "My airplane," "Your airplane,"

John Grout (59m 41s):
My plane,

Mark Graban (59m 42s):
My plane. So, yeah, it comes back to culture. And, you know, there are maybe the technical aspects of mistake-proofing, but a lot of it is cultural. Just culture, I think, is an amazing framework. I think of it as a healthcare framework that others could adopt in other industries. But it comes back to this question. I think there are two different situations. There's the mistake of giving the patient the wrong medication or the wrong dose that leads to a death, and unfortunately society still blames, punishes, or even sometimes prosecutes.

Mark Graban (1h 0m 20s):
Versus a situation, and there are cases where I would say this is clearly something you should prosecute, where people are intentionally murdering patients, right, putting them out of their misery. Like, how would you mistake-proof that? What checks and balances could you have to make sure somebody's not subtly murdering patients in a way that's hard to detect? I don't know.

John Grout (1h 0m 43s):
Well, I'm not gonna be, you know, Perry Mason on this, or Murder She Wrote or

Mark Graban (1h 0m 48s):
Whatever. That got really dark. Sorry.

John Grout (1h 0m 53s):
Yeah. So I think that you use every tool in the toolbox to try and sort this stuff out. And if sleuthing is required, you know, that is something that we've done for a long time, and I think, if someone is doing that, they should be found and punished and perhaps prosecuted. I think that the just-culture stuff really works for me, in the sense that if you didn't intend to do harm, you look at it differently than if you had reason to believe you were doing harm. So my example is: if you drive to work and you're driving the speed limit and you get in a wreck, you know, at some level it's not really your fault.

John Grout (1h 1m 42s):
You know, you can drive 10 or 15 miles over the speed limit and you know you're breaking the law, but you didn't think you were gonna do harm. Probably understanding what was going on in your life that made you wanna do that, or the fact that everybody does that kind of changes the equation for me. Now, if you drive a hundred miles an hour to work, that's not a reasonable thing to do. Right.

Mark Graban (1h 2m 13s):
Or if you know somebody is drunk and you let them drive anyway, or you're a bar that overserved somebody knowing they were going to be driving, there are different levels of responsibility there. But I think a lot of times we're quick to oversimplify it, and in a lot of these healthcare error cases, find a scapegoat: well, simple, we found the problem, we found the person who screwed up, they've been fired, it's not a problem anymore. But even back to instances of intentional harm, or, let's say, the occasional, and this is pretty rare, the surgeon who's known to be so incompetent that they are really harming people, and others aren't speaking up about it, that becomes a cultural issue.

Mark Graban (1h 3m 6s):
And you say, Well now there starts to be some culpability on leaders of the organization. If you knew there was a problem and you didn't address it, or you stifled the communication channels that would've informed you, it becomes more than just an individual problem.

John Grout (1h 3m 21s):
Yeah. It would be nice if we could just talk to the local surgical nurses and when they say, don't use that doctor, you know, they know what they're talking about, you know, they're right there with 'em. They know what's going on. I tend to trust them a whole lot. And so, you know, I really think it'd be great if, if all the surgical nurses say, Yeah, I'd rather not work with that doctor, that's useful information. I don't know how you culturally allow that to get out, but it'd be great if we could.

Mark Graban (1h 4m 0s):
And sometimes it comes out after the fact

Mark Graban (1h 4m 3s):
The reporting about something. It's like, it was known, we spoke up, you didn't listen.

John Grout (1h 4m 8s):
And so often organizations will need really compelling evidence to end someone's career in their hospital. And I kind of understand that. But in those cases where the evidence is all there and no one took action, it's troubling. It's really troubling.

Mark Graban (1h 4m 28s):
Yeah. There's an expression that's been written about and even dramatized a little bit: Dr. HODAD. Have you heard this?

John Grout (1h 4m 38s):
Had not heard that

Mark Graban (1h 4m 39s):
This slang. So HODAD is an acronym that stands for Hands of Death and Destruction. And there's this balance, and I think Just Culture helps us figure out where to draw the line. Look, I'm all about saying most errors are caused by the system, but there are some outliers who are either just so grossly incompetent or intentionally causing harm. There's a whole podcast and TV series, which I can't bring myself to watch or listen to, called Dr. Death, about a surgeon in Dallas who was really just maiming people. And it went on for a really long time. Now, if that's the rare exception, we need to make sure that's addressed differently than quote-unquote systemic errors.

Mark Graban (1h 5m 23s):
It could happen to any surgeon.

John Grout (1h 5m 25s):
So I think it's important when you do root cause analysis and you're looking at what's going wrong, that you almost never take a negative descriptor of a person as the root cause.

Mark Graban (1h 5m 39s):
Right. Yeah.

John Grout (1h 5m 42s):
I also think that when you've ruled everything else out and you've got the data to show it, it's unconscionable not to act on it.

Mark Graban (1h 5m 53s):
So one thing that frustrates me, and I'd love to hear your reaction to it, is that we might be making baby steps toward not blaming an individual and throwing them under the bus. But then I've heard this progression that I think only goes part way, where an organization will say, okay, it was a systemic problem, we're not firing someone, it was human error, and there's nothing you can do about human error. I'm like, oh, that shouldn't be the end of the story. Right?

John Grout (1h 6m 19s):
Yeah. You're talking to the wrong person about that. Mistake-proofing is all about that. And if it's a judgment call, you should have other people looking at that judgment, and you should be rewarding differing opinions. You know, one of the problems is, if everyone's responsible, nobody's responsible. And so the more people you have look at it, the less responsibility any one of them takes. What you need to do is see if you can figure out some structure that will keep the accountability for each and every individual. So if you can make it blind, so they don't know what other people have said, and all kinds of other things that will lead people to take it very seriously, that's worthwhile.

John Grout (1h 7m 12s):
You know, having five nurses check something is not a recipe for success, but with judgment, redundancy is really the only answer, because the normal mistake-proofing process is not very good there.

Mark Graban (1h 7m 29s):
Well, so there are two levels of possible mistake here. And I will point people back to episode 186 of My Favorite Mistake, where I get John's thoughts on definitions: mistakes as bad decisions versus slip-ups. So if I'm going in for lower back surgery and they cut in the wrong place, that would be a slip-up; they didn't intend to cut there. But then there are diagnostic decisions. Two and a half, almost three years ago, I went to a surgeon when I had a really badly extruded disc that was pressing against a nerve.

Mark Graban (1h 8m 11s):
It was awful. And the first surgeon said, in his judgment, in his professional experience, you need surgery immediately or your foot and leg are gonna be numb for the rest of your life. Not because I distrusted him, but I ended up going for a second opinion, because, long story short, he said, well, if you have the surgery, which you need, you can't travel for six weeks. I'm like, okay, well, if I can't travel for six weeks, I need to have that surgery in Orlando, because I need to be there with my wife if I can't travel for six weeks. So the surgeon in Orlando had different judgment. He was younger, he had newer education. He showed me the journal articles that said, guess what, outcomes are better if you wait and let the body try to heal itself first.

Mark Graban (1h 8m 52s):
So, the quote-unquote mistake. I mean, there's a difference between a decision that, in the other surgeon's judgment, isn't grounded in science, versus doing the right thing the wrong way. That was a really long-winded way of trying to compare those.

John Grout (1h 9m 10s):
Helping people not make errors in surgery is easier than diagnosing the right surgery. Yeah, absolutely.

Mark Graban (1h 9m 18s):
Preventing execution errors versus diagnosing disease.

John Grout (1h 9m 21s):
Right? Execution errors are where mistake-proofing thrives; deliberation and decision-making in unstructured problems is where it's more difficult. You know, we're seeing this move toward evidence-based medicine, and I think that's in the right direction. But even there, you're just codifying statistics to have them supplant personal judgment. I think a lot of times that makes sense, in terms of kind of mistake-proofing personal judgment, but you'll always have those counterexamples.

Mark Graban (1h 10m 4s):
Yeah. I mean, even about 10 years ago, Dr. Brent James, who is considered one of the leaders in the modern quality and patient safety movement, gave a talk I saw, and he said, for all the talk of evidence-based medicine, it probably applies in about 35% of cases across medicine. Like, there are some really common, really well-known things, like a child with certain illness symptoms, okay, clear, you know, possible ear infection. There is absolutely evidence-based best practice for how to treat that. Me having this weird, mysterious lower back pain? Not so straightforward.

John Grout (1h 10m 38s):
Yeah. Right. And of course the Apgar score has changed medicine more than any other one thing.

Mark Graban (1h 10m 44s):
What is that score?

John Grout (1h 10m 45s):
The Apgar score is the score that you give a child when they're born. There's a rubric, if you will, I don't know the clinical term, but a list of things where you look at different aspects of how the child was born, and they get a rating. The better the rating, in some sense, the healthier the child. And so the practice of giving birth, and the medicine around it, has changed dramatically since, you know, the sixties. And it's all because we had good metrics of what the outcome looks like. It probably has some side effects, though, and almost any mistake-proofing you do will have some side effects.

John Grout (1h 11m 31s):
So, for example, the side effect in giving birth is there are so many more C-sections than there were before.

Mark Graban (1h 11m 43s):
Yeah. And some hospitals are really working to reduce that.

John Grout (1h 11m 45s):
Right. But that was driven by the fact that if you used a C-section, the Apgar score tended to be higher. So any chance that it was gonna impact the baby negatively, there was that much more impetus to not have a natural, you know, vaginal birth. And so, yeah, all this stuff is mixed together, and we're trying to port things over from the Toyota Production System and Shingo to an environment that, in a lot of ways, is very different.

Mark Graban (1h 12m 24s):
And part of the point too, I think, is the side effects of metrics and targets and rewards, where, you know, if cardiologists are being ranked, rated, and compensated based on post-op mortality rates, there's this dynamic where they might choose not to take on the sickest patients.

John Grout (1h 12m 44s):
Yeah.

Mark Graban (1h 12m 46s):
Which is kind of a bad distortion of what the care should have been.

John Grout (1h 12m 49s):
Which is why I'm a little skeptical of, you know, those health-ratings sites and things like that, where you look up your doctor and say, is this doctor great or not great? Some of the best doctors may be those who only do the hardest cases, and lots of their patients die, but far fewer than expected. So I'm not sure, in trying to create transparency, that we've got it right yet.

Mark Graban (1h 13m 15s):
Well, as another professor from the statistics and quality fields, Dr. Donald Wheeler, would say, statistics without context have no meaning. Right? So looking at mortality rates across hospitals could be very misleading. There are these mortality ratios of actual versus expected, based on, you know, smarter people than me figuring that out, and we have to be careful with that. So, as we wrap up here, again, we've been talking to John Grout from Berry College. You can check out his website, and again, that PDF book, Mistake-Proofing the Design of Health Care Processes, which you can either get online or contact John to get.

Mark Graban (1h 13m 56s):
And one last question I wanted to ask you, John. Looking at a book like that, or getting education on mistake-proofing, what's the benefit of seeing examples to, maybe in some cases, copy, versus developing a way of thinking and a process for developing mistake-proofing?

John Grout (1h 14m 15s):
So there was a guy at the VA; we were talking about this book, and he asked, so is it a catalog or is it a catalyst? And it's both. On the one hand, I think the examples are a catalog, where you say, I'll take one of those. But there are a lot of other cases where you'll say, oh, they did that in that industry; in my industry it would look like this, and it would be something entirely different. So I hope that it is both a catalog and a catalyst, and that you design it carefully and vet it thoroughly. It may have side effects, but if it does, those side effects may be far less than the side effects of not improving the system.

Mark Graban (1h 15m 13s):
That's very well said, so I think we'll leave it at that as a final note. John, this has been a lot of fun. I hope people also enjoyed it, and if you haven't, go check out episode 186 of My Favorite Mistake. There's lots to learn from John; I feel like we've just scratched the surface. But thank you for a great discussion and for sharing some of your knowledge here today.

John Grout (1h 15m 33s):
I've enjoyed it very much.

Mark Graban (1h 15m 34s):
Well, thanks again to John Grout. For a link to the free ebook on mistake-proofing in healthcare, his website, and more, look for links in the show notes, or again, you can go to

Announcer (1h 15m 47s):
Thanks for listening. This has been the Lean Blog Podcast. For lean news and commentary updated daily, visit If you have any questions or comments about this podcast, email Mark at
