Steve Montague on Patient Safety, Checklists, and Lean Lessons from Aviation


Episode #246 is my second episode in recognition of Patient Safety Awareness Week.

My guest is Steve Montague, who talked about Lean and Crew Resource Management with me in episode #195 in 2014. He's a retired Navy fighter pilot, a commercial pilot, and a consultant for hospitals and health systems… and a fellow Texan and a near-neighbor of mine. See his full bio here.

Today, we're talking about a number of topics, including patient safety and checklists… what's the difference between good checklist systems and bad ones (and what are the parallels to Lean done well and Lean done badly)? We talk about a number of articles and recent events: how NHS employees are afraid to speak up, an Iowa hospital that had four wrong-site surgeries in 40 days, and the recent NEJM brouhaha.

See this recent BBC article that touches on how healthcare can learn from aviation: “Hospitals and airlines – on the same safety journey”


For a link to this episode, refer people to www.leanblog.org/246.

For earlier episodes of my podcast, visit the main Podcast page, which includes information on how to subscribe via RSS, through Android apps, or via Apple Podcasts. You can also subscribe and listen via Stitcher.

Thanks for listening!

Transcript

Mark Graban: Hi, this is Mark Graban. Welcome to episode 246 of the podcast for March 17, 2016. Today's episode is my second this week in recognition of Patient Safety Awareness Week, and my guest is Steve Montague. He's a returning guest. We talked about Lean and Crew Resource Management back in episode 195 just over two years ago.

He's done a lot of things. He'll introduce himself. He's a fellow Texan and a near neighbor of mine in the DFW area. Today we're talking about a number of topics including patient safety and checklists. What's the difference between, if you will, good checklist systems and bad? And what are the parallels to Lean done well and Lean done badly? We'll talk about a number of articles and recent events, including how an article said NHS employees in England are afraid to speak up and report errors. An Iowa hospital had four wrong-site surgeries in 40 days. And we'll talk about the recent brouhaha about Lean, Toyota, and Taylorism in the New England Journal of Medicine. I hope you enjoy the discussion; Steve has a lot to offer. To learn more and to see links to all of these articles, you can go to leanblog.org/246. Steve, it's great to talk to you again. Thanks for coming back and joining us on the podcast.

Steve Montague: I'm really excited to be here, Mark.

Mark Graban: I do want to invite listeners, they may want to click pause and go back to episode 195, or just listen to that some other time. Back in 2014, we talked about Lean and Crew Resource Management, and Steve is uniquely positioned to talk about Lean, patient safety, healthcare, and lessons from the aviation space. But for people who didn't listen to the last episode, can you give a synopsis of your background and career for us?

Steve Montague: Yeah, the 30-second synopsis, I guess. I'm a retired Navy fighter pilot, and when I left active duty I went to work for American Airlines, so I continue to fly as an international pilot for American. And then about 13 years ago, I was asked to start working with some hospitals on helping them incorporate and adapt best practices from high reliability organizations into their standard of care. It's really become quite a calling for me. It's what gets me out of bed in the morning: trying to figure out ways to help great people do even better work.

Mark Graban: Yeah. And as we've talked about before, that's one of the parallels: the recognition that systems and processes and culture matter, that these problems we see are not the fault of bad apples. It's great that you're working with these good apples and helping them do better work. Part of the reason we're doing the podcast this week is the annual Patient Safety Awareness Week that's promoted by the National Patient Safety Foundation and other organizations. So on this topic of awareness, Steve, I was wondering if you could share some of your thoughts on the level of awareness, both in healthcare organizations about the need to improve patient safety, and in the general public.

Steve Montague: I think they are both on the rise. When I started working with hospitals 13 years ago, we often felt like a voice crying in the wilderness. And clearly that has all changed. When I begin to talk about the work that I've done, people have a knowing look in their eyes and they recognize, “Ah, okay, yep, safety systems, I get it.” So I think there is a greater reception in healthcare. There is growing concern, and that's driven partly by CMS. They have slowly, over the years, been raising the bar on what are acceptable outcomes and what are unacceptable outcomes. So it's partly financial prodding, but also everybody in healthcare wants to do the right thing. I think it's just greater awareness of, “Boy, we sometimes get it wrong, and that is many times preventable.” So I don't generally get as much resistance as I used to. And then when it comes to the general public: I'm working with some folks in Indiana, and next week I'll be up there doing some workshops up and down the state. One of the topics we're talking about is sepsis and the Surviving Sepsis Campaign. My point of contact at the Indiana Hospital Association told me that during halftime of the Bears and Packers game, there was a one-minute commercial about sepsis, about how it's a big deal and it's a patient safety topic. So I think they are both on the rise. I'm not sure there is a public outcry for it yet, I don't think it's gotten to that point of awareness, but I think it is rising.

Mark Graban: Yeah. Even with that awareness, there's still a lot of work to be done. If you look at the data, there seems to be consensus that the problem is still a big one. And I think that points to the need to make this a focus and a front-of-mind discussion every single week of the year. Is that part of the work you're doing with operating rooms and executives? How much of the solution is just making sure that people give daily priority to safety instead of just talking about it occasionally?

Steve Montague: Yeah, I think that building it into work practices, exactly like we do with standard work, makes it easier to do the right thing, to practice safely, just because that's how we always do it. I don't have to light a candle and say, “Okay, now I'm going to be safe for the patient.” You build the safety into the work processes. And we do that by working with them and building standard work. How do we start a case? How do we start the shift? How do I properly relieve the CRNA so that she can go and get a bite to eat? If it's built in and it's just part of the care processes, then we don't have to stop what we're doing, stop providing care for patients, in order to be safe. Safety done well is transparent but not invisible. You can point to specific behaviors, how they hand off the patient, how they brief the case, and say that right there is a safety statement. That statement right there is intended to open lines of communication and ensure that we are cross-checking one another's behaviors. So I can see that it's happening, but nobody is saying, “Okay, now it's time to be safe.” It's just there, running like antivirus software does on a computer.

Mark Graban: Yeah, it's always there. And I think the phrase “safety culture,” as opposed to thinking of safety as a program, is important language. Even back in manufacturing, when I used to work there, the best-performing companies in terms of employee safety didn't depend on a safety department. It was just built into the way everyone did things. It was a priority, and it was clear that you didn't cut corners on safety. Bad manufacturers, and I've seen some of this too, do cut corners on safety, because they make the daily production quota the priority and they tolerate people doing things in an unsafe way. So this isn't just a healthcare problem, and it's not just a problem in aviation. Some of this is just interesting human nature: on one level, nobody would ever truly want people to get hurt, but it seems like people in different industries make excuses for why these things are bound to happen, or they blame individuals. I'm curious about your thoughts on that.

Steve Montague: Well, I agree with you. Nobody gets up in the morning and says, “I'm going to be unsafe today.” And what's interesting is that you almost have to sell safety. What I mean by that, for example: we're struggling with a hospital where we're working, and the surgeons are not interested in doing a more structured timeout that includes them providing a briefing on the case. “Here's what we're doing, here's why we're doing it. I expect it to last this long. This is my expected outcome. If this happens, here are a couple of contingencies.” Those types of things. And they're resisting it. Fundamentally, I understand why. I mean, I don't think I'm an unsafe pilot, and yet I know I'm an unsafe pilot, which is to say that on any particular day I may do something unsafe, unintentionally or without being aware of it. I'm certain that every surgeon in that organization believes that they are safe, and they are. And yet a safety culture, as you say, begins with the notion that on any given day, I can be part of a team that begins to practice unsafely. Going right to your point about production pressure and financial incentives, though, there is also a payoff in being more organized. I don't have to tell you: Lean practices not only provide a higher-quality product, but they become more efficient as well, just because it's a continuous iteration of what is the best way to do this. So what we do quite often is point out that the surgical briefing, starting the case, yes, is an investment of 30 seconds of time. The return on that investment is at least 30 seconds, and probably several minutes, of reduced intraoperative time, simply because our team is working in a more coordinated fashion. And if you tell a surgeon they can spend less time doing their cases, that's something they do believe. “Yeah, my cases take too long.” And it's because of the system and frustrations. That does resonate with them. So you can provide that as an initial incentive. And then you follow up and show that, “Oh, by the way, your post-op infections are down, your length of stay is down, your post-op medication order compliance is up.” We are raising the standard of care simply because we're willing to invest that time to coordinate our efforts. So yeah, safety isn't sexy. It just isn't.

Mark Graban: But, you know, back to surgeons and the general topic of resistance to change. You would hate to think there's resistance to safety or resistance to good outcomes. But what I hear you saying is that there's either some sense of denial, of saying, “Well, I haven't made these mistakes in the past, so I'm not going to make one now. I don't need these checklists.” Or maybe people think, “Well, yeah, a mistake might happen, but what can you do? It's human error.” And what I heard you saying was that within a safety culture, we recognize that we're all capable of error. That reminds me of Lean and the Toyota notion of respect for people, which respects the fact that we're human and we're fallible. It can be hard to find a point of alignment with the physician. I guess you stated one of them: “Investing this time saves you time.” But even without that, how do you try to make the case around safety and outcomes alone? Because again, I think there's alignment there in terms of what they want to happen.

Steve Montague: Well, fortunately, and this wasn't true even 10 years ago, we now have very good evidence. Back then, it was moving forward on face validity and comparisons between accident rates before and after, that kind of thing. But at this point, I really don't have to worry about it. There's so much really solid evidence that these practices improve patient outcomes that anybody who is willing to take the time to look at the evidence really doesn't have a credible argument anymore. But I think you're exactly right when it comes to the resistance of, “Wait a minute, I'm very safe without these things. Why do I need to do them? I'm not unsafe.”

I was talking to Marjorie Stiegler, an anesthesiologist up at the University of North Carolina. I worked with her many years ago at UCLA, where she was helping with the simulation program. We talked about this very issue, and I said, “Okay, clearly part of it is an outcome bias,” which is to say that the patient emerged from the hospital without any injury or harm, so therefore what I was doing was safe. And she made a really good comparison to drunk driving. When somebody gets pulled over for a DWI or gets in an accident, how many times before that had they driven drunk and yet gotten home safely, with everything fine? It very much is human nature to point to that and say, “I don't need to change anything.”

Mark Graban: Well, driving while drunk, I think, would be an example of an unsafe practice that doesn't always lead to harm. Alcoa, and I've blogged about this, has always taught what they call the safety pyramid, a hierarchy where you have unsafe practices, then, at a smaller ratio, near misses, then, at a smaller ratio still, incidents that cause harm. And then there's the tip of the pyramid: that one in a large number of occurrences that actually causes death. What they teach, which I think is similar to the patient safety movement, is that you need to act on not just the near misses, but the unsafe practices. And keep in mind that past performance doesn't guarantee future results, I guess, when it comes to unsafe practices.

Steve Montague: Right. And you know, that's really important. I mean, on the one hand it's sort of pathetic that we have a Patient Safety Awareness Week. Why isn't this week Patient Safety Awareness Week, or next week? I think you go back to recognizing, first of all, stating very clearly in job descriptions and performance expectations, that you will engage in specific, very clearly outlined and defined safety practices. You build it into job descriptions, evaluations, credentialing, hiring and firing decisions, performance reviews, what have you. And if you focus on safe practices, then you stay out of the bottom of that pyramid altogether. Intermountain has done that for years. I can't remember what year it was, but they came within, I don't know, I think it was two decimal points of the stated performance goal, they came up short, and nobody in the system got their bonus that year. Wow, that's a real commitment to expectations. When you build it into expectations and have the accountability there in all those different ways, I think that makes it just part of the safe practices. Certainly I'm evaluated that way. Frankly, every time I fly, I'm evaluated by my peers, and they'll call me on it if I am doing something that's unsafe. That right there, peer-to-peer accountability, is critical. But then I'm also evaluated at least every nine months, sometimes more frequently, by FAA evaluators or designated evaluators. So those safe practices are just built into the job expectations, and I'm not going to be able to continue to practice if I'm not using them.

Mark Graban: So let's bring that back to the healthcare realm. There was a news story I shared with you from England that said, “Only 5% of hospital mistakes ever get reported. Staff are too scared to blow the whistle for fear of repercussions.” That strikes a contrast to what you were describing in aviation: a culture where people are expected and encouraged to speak up, versus people who feel intimidated or scared, or who think it's just not going to lead to anything. What are some of your thoughts on that fear culture? How can you take steps to actually change that in healthcare, so that people do speak up?

Steve Montague: And for the right reasons. I think that really goes to just culture. But even beyond that, it goes back one step further, to an expectation. When I begin a trip, I tell my crew, “Look, if at any time you see something that doesn't look right, I expect you to speak up. I am fallible. I want you to challenge me.” So I create the expectation that they will challenge me and that I will challenge them. And we continuously say that to each other, because we know we've got to; there is a natural reluctance to challenge one another and to speak up.

Mark Graban: Even if that means a delay to the on-time departure, which is such an important metric, and bonuses are tied to that, I'm sure.

Steve Montague: Yes, absolutely. That is a higher level of being willing to speak up. But I do think that doing is becoming. If I am, on a consistent basis, telling people that I expect them to speak up, and they're telling me they expect me to speak up, then I think it becomes easier within the organization to speak up. Now, that requires a lot of trust. And what you're seeing in that article is distrust of the system. Commercial aviation is not immune to that wariness or that fear. Several years ago, and I should remember what year it was, the just culture system that operates in commercial aviation had essentially been shut down by most of the major airlines in the US. There was a brief period of less than a week when the just culture agreements were suspended. And it came down to one thing: a desire by some of the airline management teams to use the self-disclosure information punitively. And boy, that's what you're seeing here: if I speak up, I'm going to be punished for speaking up. Whereas in my profession here in the US, I'm actually rewarded for speaking up, for self-reporting my errors. And that's just critical.

Mark Graban: Or there might be the fear, if I'm a nurse in the operating room, that if I speak up, the surgeon will be treated in a punitive way when they were about to make an honest mistake and I caught it. There might be that fear as well, because we see people getting blamed and punished, and sometimes prosecuted and jailed, for making what seem like honest mistakes.

Steve Montague: Yeah, absolutely. And it's just built into us, I guess, because we feel like, “Well, the only way to make sure something doesn't happen is to punish the individual.” And don't get me wrong, I'm not saying there shouldn't be accountability, but I had a commanding officer many years ago who would say, “You know, Monty, when you point your finger at somebody else, you're pointing three back at yourself.” And I use that as a good rule of thumb: for everything I want to blame on the individual, there are probably three institutional things that we should take accountability for. How was this person hired? You talked earlier about unsafe acts; if this person is acting unsafely, what's our hiring process that brought this person in? What are our daily practices? Are they unique in these unsafe practices, or is this pretty much how things get done around here? It's a really fascinating deal. But I think it's about moving away from the punitive aspect and going more to a sort of five whys, “Well, why did this happen?”, and really being humble and honest with ourselves as an institution about how we aided and abetted, or set up the system that allowed this person to make this error, or what have you.

Mark Graban: Yeah, and you can even play five whys with that, if it's true. If leadership says, “Well, we've got some bad apples,” then, to your point: Who hired those bad apples? Why are we not better at screening out the bad apples? You could ask those questions, even though that would be, I think, an application of root cause analysis pointed in the wrong direction, perhaps.

Steve Montague: Yeah. And where was our radar to identify this person, remediate the behaviors, reach out to them, and try to keep this employee? I mean, I really think there are very, very few bad apples. A lot of it goes to reaching out to them and figuring out what's going on. Solving the problem and not firing the person.

Mark Graban: Yeah. And I would encourage listeners who don't know about the Just Culture methodology to do a Google search for that. It gives a bit of an algorithm, and some mindsets, to help determine if something was a systemic error or if it was the type of situation where personal accountability would be most appropriate. But I want to delve into one other headline and story, and I'll link to this on the blog post for the episode too. A health system in Iowa had reported four wrong-site surgeries within a 40-day period. And there were two things that jumped out at me in the comments. One was the hospital spokesperson, who I assume is still employed there, saying the mishaps were due to the improper execution of the timeouts. There's a couple of different questions combined here, but can you talk about proper versus improper execution of timeouts? And if it's happening improperly, whose responsibility is that? The physicians or their leaders? What do you think?

Steve Montague: Oh, boy. There's a lot to that. The first thing I thought when I read that was the old line that minor surgery is surgery that somebody else is having. “No serious consequences reached the patient”? I suppose the patient didn't die as a result. But still, let's not go patting ourselves on the back. So who's responsible for a good timeout? Everyone. But we want to avoid the myth of social redundancy, the old saw about how you starve a horse: you ask two people to feed it. So who is most responsible? In my world, it's the captain, the person who has the highest level of authority or licensure or title or power. They are the person who is most responsible. So yes, the surgeon and the anesthesiologist are the two people with the highest level of licensure, so they have the highest responsibility for it. However, clearly the hospital leadership has got to be involved too. Four wrong-site surgeries in 40 days. If you recall, Mark, there was one surgical team that was undergoing disciplinary procedures, and you kind of have to laugh. Seriously, you had four wrong-site surgeries in 40 days and you think it's one team's fault? You've got serious problems with your systems. And I guarantee that if you had watched the timeouts prior to those events, you would have seen a compulsory compliance that was, “Well, we met the standard,” which is that somebody used the word “timeout” and there was some sort of announcement by one person, typically the circulator, because they are the ones, generally speaking, who are actually held accountable for the timeout. So while they don't have any power, or at least they don't have sufficient power, they are held to a higher level of accountability. Typically you will see them basically just call out to the room and say, “Okay, everybody, here's our timeout. This is Mrs. Jones and this is what we're going to do,” while the surgical techs continue to prepare their instruments and the surgeon is talking to somebody about something else. There is no engagement as a team. By contrast, I was down in Houston last week observing teams, and I was absolutely blown away; they did the finest timeouts I've ever seen. Everybody stopped and turned. It really was interdisciplinary. Everybody was talking about their role. The anesthesiologist said, “Okay, these are the antibiotics that were given at this time, and the person has no allergies.” And then the surgeon discussed, “I don't anticipate any blood loss, or if there's a possibility, we have it typed and screened,” and so on and so forth. It was truly interactive. And that's what the timeout was always meant to be: “Let's create an effective team.” That's what we help organizations do. These folks had gone a long way toward doing that on their own, and that's not easy. I really admire the work they've done.

Mark Graban: Yeah. Well, when you talk about timeouts being done in the most cursory way, “Okay, yeah, we checked that box,” if that's happening, that seems like one of those unsafe practices that isn't necessarily going to lead immediately to harm, but it's one that, in my mind, leaders have a responsibility for detecting. Who is there, as the equivalent of the FAA observing you as a highly skilled professional pilot, observing these highly skilled professional surgeons and their teams to say, “Wait a minute, here's an unsafe behavior. We have a responsibility to do something in advance instead of just reacting after harm occurs”?

Steve Montague: Right. And this is actually a place that's quite promising, as simulation becomes more and more routine. It really doesn't have to be high-fidelity simulation; you can use some very low-fidelity simulators that still invoke and require the team dynamics, and you can evaluate. I can walk into an operating theater, and we're talking about surgery here, but I could also walk into an ICU and watch team rounding, or so many different venues. The point is that I can tell if somebody's faking it, because they're just not comfortable with it. It's clearly not what they always do, and people are kind of looking around, confused. So you can tell what's going on, what people really do in day-in, day-out practice. I think the future is more use of simulation and peer-to-peer evaluation. Dr. Atul Gawande talks about how it occurred to him that the world's greatest golfers and tennis players all get coaching every week, and he said, “Well, maybe I could use some coaching.” That's what I think is the future: more peer-to-peer coaching, evaluation, and assessment. And the simulator is a great place to do that. So we've come a long way, and we've got a long way to go. Frankly, I can say that about aviation too. I like the fact that researchers no longer talk about “high reliability organizations”; they talk about “high reliability organizing,” because it describes a journey rather than a destination. I do believe, and I know, commercial aviation is very, very safe. And yet we have a long way to go.

Mark Graban: Well, as a final topic, let's talk about checklists and Lean again; I think there are clear parallels. You talk about this being a journey. We've talked about good checklist implementation and bad. We have really effective Lean instances out there, and then we have organizations that do some really almost embarrassing things. And there are studies that will show, “Simple checklists save lives in the operating room, study finds,” and a different study from the same time frame that says, “Surgical checklists may not be effective at improving safety, study finds.” There are lots of studies about Lean being effective, and there are people who write journal articles about Lean not being effective. So it's kind of puzzling. Where is the truth here? Or is it really possible to have the simultaneous existence of effectiveness and ineffectiveness? We were also going to talk, along these lines, about an article I blogged about from the New England Journal of Medicine. Pamela Hartzband and Jerome Groopman, a couple of doctors from Harvard-affiliated institutions, were decrying Lean and “efficiency experts” and saying that this is not appropriate for healthcare. And they weren't even trying to point to data; it was really more of an editorial. So, in my long-winded way: I was curious, first, about your thoughts on these conflicting studies, the simultaneous existence of bad and good examples, and then the New England Journal of Medicine article. Maybe first, the good and the bad simultaneously. How is that?

Steve Montague: Well, first of all, I think the two issues are very closely related. I really respect Dr. Groopman. When I read through that article, there was one line that I thought actually reflected their real point, and that is, “Good medical care takes time, and there is no one best way to treat many disorders.” Okay, I get that. But what they're really decrying is cookbook medicine, and that's not what I think anybody's advocating for. Even in internal medicine, Dr. Hartzband's field, clearly there's a lot of complexity in some of those diagnoses. But there are also some common elements that all doctors share: they are humans, and therefore, as Dr. Groopman himself discusses in How Doctors Think, they are subject to human cognitive errors. So a diagnostic pause is a protocol to avoid Type 1 diagnostic errors. It is a standard way for us to bias out the things that we know go wrong with the way our brains work. There is a protocol, a standard work, that we can put into place to make sure we are using our higher-order thinking to avoid these types of errors. And when you read the rest of the article, it's clear that these two have absolutely no idea what Lean is.

Mark Graban: I don't think they were even exposed to Lean done badly. It just seems like they're pontificating.

Steve Montague: Yeah. I mean, they're very frustrated. And I get it. I would be too.

Mark Graban: And they have valid complaints. But I think they're kind of pointing at the wrong cause of those things that are frustrating to them.

Steve Montague: And other doctors agreed. So yes, there is Lean done well and Lean done poorly. I'm not qualified to really do Lean like you do, but I love it when I go somewhere and hear somebody say, “Yeah, we did Lean.” And I say, “Well, then, no, you didn't.” It's the same kind of thing with checklists: there are checklists done well and checklists done poorly. Peter Pronovost at Hopkins, and some researchers in Sweden, have all said, “Look, putting checklists into place without really grounding them in a system is arguably exposing patients to greater risk.” So look, 5S or spaghetti diagrams or any of that, those are tools. And very similarly, a checklist is a tool. But consider a carpenter's tool belt. It's a standard set of tools, but the carpenter must be trained on how to use it. The same is true with standard work. The carpenter has to integrate with other trades; likewise, a checklist will not work on its own. It's got to be embedded in teamwork behaviors, a just culture algorithm, and a management system that supports it. The carpenter follows a structural blueprint; well, gemba walks, metrics, publishing successes, visual management boards, all of these are the milieu, the structural blueprint within which the carpenter functions. Then there is the builder who's overseeing the carpenter, and that's really the Lean management team. And the last piece, and this goes back to “we did Lean,” is that the house still needs maintenance after it's been built. Checklists are the same: they've got to be designed or modified by the end users, with respect for people; they should be routinely maintained; they should be updated to reflect the latest evidence base; and everybody needs to be trained on how to use them properly. We use TWI principles to do that.

So I think the biggest takeaway is that whether it's a checklist or standard work or Lean tools or any of that, they've got to be embedded in a system. And for the checklist, it's got to be a system with the willingness, and the knowledge of how, to challenge one another and say, “Wait a minute, no, we're going to use the checklist.” In the studies that have shown failure, the way they evaluated whether people were using the checklist was self-reported completion. And yet, in a study at the University of Texas of a 13-item timeout checklist, the reported compliance was 100%. When they went and did observations, in only 2.3% of the cases were more than 7 of the 13 items completed. So when you look at the studies that say, “Look, it didn't change the morbidity and mortality,” well, really, you're assessing based on self-reporting. And there was no training on how to implement the checklist. Wrong. I think Dr. Pronovost and the folks in Sweden are correct: simply saying, “Here's a checklist, use it,” probably increases the risk to the patient.
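[A note for readers: to make that self-reported vs. observed gap concrete, here is a minimal sketch in Python. The item-level case data is entirely hypothetical; the episode only cites the 100% self-reported figure and the 2.3% observed figure for the 13-item checklist study.]

```python
# Hypothetical illustration only: the case data below is invented, not taken
# from the University of Texas study discussed in the episode.

TOTAL_ITEMS = 13  # items on the timeout checklist
THRESHOLD = 7     # study criterion: "more than 7 of 13 items completed"

# Each observed case: did the team self-report the checklist as complete,
# and how many of the 13 items did an independent observer see completed?
observed_cases = [
    {"self_reported_complete": True, "items_seen": 5},
    {"self_reported_complete": True, "items_seen": 8},
    {"self_reported_complete": True, "items_seen": 6},
    {"self_reported_complete": True, "items_seen": 7},
]

n = len(observed_cases)
self_reported_rate = sum(c["self_reported_complete"] for c in observed_cases) / n
observed_rate = sum(c["items_seen"] > THRESHOLD for c in observed_cases) / n

# With data like this, self-reporting claims 100% compliance while direct
# observation shows only a fraction of cases cleared the 7-of-13 bar.
print(f"Self-reported compliance: {self_reported_rate:.0%}")
print(f"Observed (>{THRESHOLD} of {TOTAL_ITEMS} items): {observed_rate:.0%}")
```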

Mark Graban: And you bring up a lot of points that remind me of the way Toyota people describe Lean today. I've shared this framework in earlier editions of my book and on the blog: the idea that Lean, the Toyota Production System, is not just the technical tools, where you could say, “Okay, hey, send me your checklist.” The checklist is the tool, but there are also the underlying philosophies and the managerial approach. How do we make sure the checklist is actually being used in the right spirit? How do we update it? How do we improve it over time? It seems like, whether it's with checklists or with Lean, there's enough of a body of evidence out there about what needs to be done. But we still hear horror stories, whether it's the clunky 5S initiative where employees are told, “You can't have a family photo on your desk and you can't put a cardigan on the back of your chair.” I don't know what that has to do with the results or the goals of the company. I had somebody send me a tweet earlier today from a healthcare setting, I won't say where, but it seems believable: they're complaining that, because of Lean, managers are timing how long staff smile when they're greeting patients, that the smile has to be 7 seconds long. I can believe that may have happened somewhere, but that's not Lean. That's somebody misapplying a tool called standard work without understanding the purpose of it or the philosophy behind it. And if it's making people that upset: “Wait a minute, timeout.” A different type of timeout. Something's wrong here, right?

Steve Montague: Well, yeah. And then you begin to understand why folks like Dr. Groopman get upset, and they're right. If it's as lame as you described, then yeah. It really hurts folks like you, the people who are really trying to be diligent and serious about using Lean to improve care. It makes it hard for you to do it.

Mark Graban: Well, ultimately it doesn't hurt me as much as it hurts the organizations and the patients that deserve better. They deserve real Lean. Otherwise, like I said, doing lame or fake Lean, or whatever term we use, causes maybe more problems than if they had done nothing. And so in the spirit of Patient Safety Awareness Week, that's really what drives me and gets me up in the morning. It's not about implementing Lean. It's about improving patient safety. And I appreciate that you share that passion as well.

Steve Montague: I do, Mark. And also, imagine the frustration of the nurse at the bedside, the doc who's saying, “Why are you walking around and timing me?” They all go together. You talked about Alcoa; it all ties together.

Mark Graban: And I've read complaints from residents that they were being followed around by engineering interns who were timing how long they were in the bathroom. And it's like, “Oh, come on.” That's the stuff that gives Lean and improvement a bad name. Well, let's end on a positive note. As we wrap up here, can you tell a positive patient safety improvement story?

Steve Montague: Every once in a while we see an article published that says, “We saved…” Ohio State just published a really nice article talking about how much they've invested in implementing CRM, which I got to work with them on, and how much it saved them. And really, those savings are in patient outcomes too, so that's huge and wonderful. But there was a hospital system in Ohio that I hadn't spoken with for several years, and I was afraid it was going to be one of those “Yeah, we did Lean” places, a flash in the pan that went away. Then I heard from their chief learning officer. She called and said, “Oh yeah, we do this all the time, and it is such a successful program. We've caught so many mistakes and possible accidents and untoward outcomes simply because we implemented this one word.” They chose the word “cardinal,” and I don't know why, but it was a way to call the team's attention to, “Hey, wait a minute, this is a safety issue.” They could use that word, and it was psychologically safe for them to say it and for people to stop and listen. Hearing that they had prevented a couple of OR fires, and potentially wrong surgeries and such, boy, that makes your day. Sometimes we don't know how far things are rippling out and how they're doing.

Mark Graban: Yeah. Well, for all the things that are frustrating, it's good to see the positive and the progress that's being made, even if there's still a lot of progress to be made. I appreciate that you're helping so many people in that process. So, Steve, thank you for being a guest here on the podcast. We'll have to do this again. There's so much we could talk about, and it's always a pleasure to chat with you and hear your perspectives.

Steve Montague: Mark, I thoroughly enjoyed it. Looking forward to the next time.






2 COMMENTS

  1. Excellent podcast on the effectiveness of checklists on patient safety.
    I worked for a large oilfield service company back in the '80s and '90s, and checklists were used anytime we were about to embark on pre-determined high-risk operations, notably well treatments and truck convoys to and from well sites. Some suggestions about “surgical timeouts”:
    1. This may sound somewhat irrational, but I would change the name of this event, as the connotation is childish and you are about to start a high-risk operation. In the oilfield we called them Operational Safety Meetings.
    2. Our meetings were conducted by the highest-ranking person.
    3. It was mandatory that everyone on the well site attended, not only the employees from our company.
    4. Most job positions had checks they had to make before the meeting, and reviewing the outcomes was part of the main checklist.
    5. All key measures were discussed, as to the intended levels we would see during the operation.
    6. “What ifs” were discussed: if something went wrong, what were the countermeasures?
    7. Each person's name was read out loud, along with their role in executing the operation.

  2. Good morning from sunny Santiago, Chile, Robert.

    I think the ship has sailed on the name (“timeout”), although some organizations have opted for “surgical pause.” As to the rest of your comments, they're exactly what we recommend to our partner hospitals when it comes to this event. Specifically:

    Surgeon-led
    Everyone who will be a part of the surgery is present and attentive
    It's interactive, with everyone commenting on their readiness (safety checks complete)
    Milestones and outcome expectations
    Most likely contingencies
    Introductions of everyone on the team

    There are a few more elements that we often see, such as a comment on fire risk, anticipated blood loss and availability, and a request that all team members speak up with concerns at any time. Some of our suggestions made it into the final cut of the WHO Surgical Safety Checklist… others didn’t :-(

    Thanks for your kind comments, and for your dedication to helping healthcare get better.

    Have a good weekend.
