Interview with Amy C. Edmondson On Psychological Safety And “The Fearless Organization”


My guest for episode #356 of the podcast is Amy C. Edmondson, PhD, the Novartis Professor of Leadership and Management at the Harvard Business School. She is the author of three books on teaming and her most recent book is the topic of conversation today: The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth.

In the episode, we explore the incredibly important concept of “psychological safety,” which means, as Edmondson defines it:

“…a climate in which people are comfortable expressing and being themselves.”

This is necessary for Kaizen (continuous improvement) and it's also a huge contributor to people being able to speak up about patient safety risks (or other problems in the workplace).

One thing I love about her book is that she doesn't just diagnose the problem (that fear of speaking up is bad), but she also lays out a plan for how leaders can create a more psychologically-safe environment and culture.

From her bio: “Edmondson received her PhD in organizational behavior, AM in psychology, and AB in engineering and design from Harvard University.”

  • LinkedIn and Twitter pages
  • HBS Faculty Profile
  • About your background… what inspired you to get into academia to study organizational behavior?
  • How have previous degrees in engineering and psychology affected your views on workplaces and leadership? (that's a rare combination?)
  • Why is fear a problem in organizations?
    • “Driving fear out is mission critical”
  • Some famous startups sound like fear-driven environments from news reports…
    • Are some organizations successful in spite of fear?
  • Has any of your work been influenced by W. Edwards Deming, who famously wrote about eliminating fear in organizations?
    • She was aware of his work 25 years ago
  • How do you define “psychological safety” in a workplace? Has that evolved since 1999?
    • “Psychological safety is broadly defined as a climate in which people are comfortable expressing and being themselves.”
    • What about arguments that nurses and others have a “professional obligation to speak up”?
  • In terms of creating a safe environment, does this necessarily start at the top of the organization?
  • You write about humility… can leaders actually become more humble?
    • “I think of it as a skill, not a trait”
  • What are some misunderstandings about psychological safety?
    • What does it mean to “sanction clear violations”?
  • Why is it important to make it “safe to fail”?
  • Is there a difference in response when teaching seasoned execs vs. younger MBA students?
    • Younger students are more worried about “yeah, but I'm not the boss”
    • Older execs – aha moment as a risk factor for their firm
  • What's the most interesting or surprising thing you've learned since the book was published, on this subject?

Listen to the podcast here

Amy C. Edmondson On Psychological Safety And “The Fearless Organization”

LBI Amy C. Edmondson | Psychological Safety

Welcome to episode 356. It is January 22nd, 2020. My guest for this episode is Amy C. Edmondson, PhD. She is the Novartis Professor of Leadership and Management at the Harvard Business School. She's the author of three books on teaming, and her book titled, The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth is the topic of our conversation for this episode.

We explore what is an incredibly important concept: psychological safety. That's a phrase that means, as Edmondson defines it, “A climate in which people are comfortable expressing and being themselves.” In my view and experience, psychological safety is necessary for kaizen, or continuous improvement, and it's also a huge contributor to speaking up: people need to feel psychologically safe to be able to speak up about patient safety risks or other problems in the workplace.

I'm enjoying her book. One thing I love about the book is that she doesn't just diagnose the problem. We're not blaming the people who are afraid of speaking up; this is about leadership and culture. She also lays out a plan for how leaders can create a more psychologically safe environment and culture. That's where it starts. Edmondson received her PhD in Organizational Behavior, AM in Psychology, and AB in Engineering and Design from Harvard University.

We are joined by Amy Edmondson. Amy, thank you so much for taking time with us. How are you?

I'm well, thank you. I'm glad to be here.

I'm excited to talk about your research, work, and book, and the context for people who are doing work in the realm of lean and continuous improvement. Talk a little bit about your background. From reading your bio, it sounds like you went from industry into academia. I'm curious about that shift and your interest in organizational behavior.

I started out life as an engineer. Right out of college, I worked as an engineer, but I worked in a very peculiar setting. I worked as the Chief Engineer for Architect/Inventor Buckminster Fuller. He was a quasi-academic. He was best known for inventing the geodesic dome which was what I spent most of my time doing. I was perfecting and developing some of the mathematics for geodesic dome design, but Bucky Fuller was all about making a better world. His overarching theme was, “How do we use our minds to design things and solve problems that need solving to make life better for everyone?”

That's a little bit overarching, but nonetheless, it was quite inspiring for a young person. He was exactly four times my age when I started working for him. I don't think too many people have their first boss four times their age. I was 21 and he was 85. He was a remarkably inspiring, generous, and inclusive person. That spoiled me.

When he died, which he did very suddenly at the age of 88, I decided to write a book about his work, called A Fuller Explanation. In the process of writing that book, I registered in the back of my mind that the rhythm of writing, thinking, and trying to make the material as clear as it can be, and then periodically coming out to give a presentation, a workshop, or a course, was right for me. I liked spending time alone with ideas and I also liked coming out and trying them.

When I finished that book, I knew I was a teacher-thinker, but I didn't have a field. I didn't think engineering was my field. I got a job with a boutique consulting firm that worked in the organizational development and organizational change space. This is what got me into organizations and businesses, which had not been a natural interest or place for me. I found that the people working in most large companies where we did our work were incredibly curious, generous, and eager to make their organizations better, and got frustrated by bureaucracy and the pace of change.

A Fuller Explanation: The Synergetic Geometry of R Buckminster Fuller

I became interested in those kinds of problems or in the problems of well-meaning people trying to make things better and getting stuck, and not necessarily knowing how to get unstuck. Somehow, I got it into my head that I should go get a PhD. I'm not sure why that made sense to me at the time, but I thought I'd get smarter and able to be helpful. It took a long time before I was either smarter or more helpful, but it turns out to be the right choice for me.

I would similarly put the organizational challenges you've researched and helped people with in a category of problems that matter for making workplaces and people's lives better. That's a high-minded but important goal.

Yes, especially since so many of us spend so much of our waking time at work.

I don't know if you've seen studies or other people's feelings about it, but I've seen things that suggest a dysfunctional if not toxic workplace can have serious physical health effects on people. It's not just connections to happiness.

Jeff Pfeffer wrote a whole book on how work is killing us through the stress, anxiety, and emotional toxicity of many workplaces.

I saw in your bio that between your Engineering and your PhD, you have a Master's in Psychology, is that correct?

Yes, that turns out to be something that happens to you in most PhD programs. I don't get any extra credit for that. As a PhD student in Organizational Behavior, when you satisfy your master's level coursework and master's level thesis, it's not called that, but you automatically get a master's degree in the discipline.

That wasn't a separate stop in your academic journey. It's somewhat of a rare combination of Engineering and Psychology together in someone's head.

It is, yet it goes together in my mind because you're drawn to engineering because you want to make things work. You like the math and the tangibility of it. Organizational Behavior may seem very different, but researchers have to use math. Fundamentally, at least in the field of Management and Organizational Behavior, we're interested in figuring out what works.

I would agree with that characterization. I want to give thanks and a shout-out to our mutual friend, Tom Ehrenfeld, who's probably reading this episode, for introducing us. You might not know much about my background, but I'm also an engineer, an industrial and mechanical engineer. From my exposure to the work of W. Edwards Deming, who's sometimes labeled a statistician, I know he says, “The most important thing for a leader is to understand psychology.” That's something I've tried studying, even informally.

For me, it was about shifting my interest from structures to people. It turned out I was far more interested in people. People are far more complex as well. We sometimes are banging our heads against the wall. Deming is right. It's about people and having the emotional intelligence and the self-awareness to be able to thoughtfully influence people to engage in the right work.

Maybe this is partly the flip side of psychological safety, but Dr. Deming famously recommended that organizations eliminate fear. There are maybe different interpretations of what that means, and you also write about the idea of fear. How would you summarize for people why fear is a problem in organizations, or why fear holds organizations back?

First of all, I can't help saying that I was aware of Deming's eighth point years ago. Maybe Larry Wilson, who I worked with before graduate school, pointed it out to me. The eighth point, driving fear out of the organization, is stuck right there in the middle of a long list of more technical-sounding things. That jumped out at me. I didn't give it direct thought very often, but periodically I come back to it and think, “That's where it came from.” I would give him credit.

To me, now even far more than when Deming was writing, driving fear out of the organization is mission-critical because more and more of the work is fast-paced, shifting, and complex. Anyone can have something to say or might notice something that could matter greatly. The vital input could come from anyone. Driving the fear out so that vital input will be heard is absolutely crucial. Secondly, we have more of an emphasis on innovation. Innovation requires being comfortable raising wild ideas that might not be right and being comfortable experimenting in ways that are smart but might end in failure, and all of those things just don't flourish when there's a high level of fear.

When you talk about innovation, I find it fascinating that there are some famous, now-defunct startups that, at least from books or different accounts, sounded like extremely fear-driven environments, without naming names. It's ironic, or troubling, that an organization, whether in Silicon Valley or elsewhere, could preach innovation while at the same time creating an environment that would seem to stifle innovation or, at least, limit potential.

I agree. It requires us to theorize under what conditions will an organization do well when it's being led in a very fear-invoking way. A few possibilities come to mind. Possibility one, the genius at the top is a genius and has such a clear vision of everything that needs to happen. He or she can divide and conquer, and then tell people, “You better do this or else.” They'll do it and it will happen. It doesn't sound like the norm in most organizations that we know now.

The other possibility is we don't know what the upside would have been had there been more diverse voices in the mix. Sometimes we point to the success of some company ruled by fear and say, “They did fine,” but we don't know how they would have done with a more inclusive, more generative, and less fear-based climate. The third thing that comes to mind is oftentimes there isn't an obvious competitor. They're unique in a new space. No one else is there so they can almost do what they want and still succeed. In a competitive market, especially for labor or employees, that mode won't work for long.

To your second point, it seems like there are maybe some organizations that have been successful or still exist in spite of that culture of fear. It sounds like you're describing this unwinnable thought experiment of, “Would they have been more successful?” It's unproven.

It's unprovable. We don't have the counterfactual.

Can you share how you drew some of your conclusions about the criticality of psychological safety? Were there some fairly direct compare-and-contrast studies, or looking within Google, or was it primarily Google?

Yes. The great thing about Google, at least great for me, is that I didn't do that study. For me, it was great because they used my variable, which came from an academic article published in 1999 in Administrative Science Quarterly. My article was titled “Psychological Safety and Learning Behavior in Work Teams.” It showed that, in a single Midwestern manufacturing company, teams differed phenomenally in psychological safety, and that difference was associated with more learning behavior and with higher performance as judged by outsiders, either the recipients of the team's work or the managers of the team's work, depending on what team it was.

Google, in its famous Project Aristotle several years later, was trying to figure out what accounted for persistent performance differences across teams. They had a sample of 180 teams, and no variable was popping up to explain the variance until they looked at psychological safety and found that it was the single factor that explained the performance differences in teams. That's what put this concept on the broader map as opposed to the academic map.

It was enormously reassuring to me because my priors about a place like Google would be that these team members are so smart that they're just not going to be holding back. They're not going to have the experience that most of us have of, “I'm not sure if I should speak up now. Maybe that won't be welcomed.” They're not going to be reading the tea leaves the way many other people might, because they've gone to top schools, got hired by Google, which is famously hard, and so on.

I was thankfully dead wrong in my priors because, in fact, even at Google there were pronounced differences in psychological safety, and indeed, they were predictive of performance. It was quite a stunning confirmation of something I'd been working on for a long time. Originally, my very early work, and stumbling into this concept, was done in healthcare delivery in the hospital setting. I didn't set out to look at psychological safety. I set out to look at differences in learning and performance. The narrow focus of the study was medication errors, which are not a good thing. We don't want them.

It's the same thing we want in all of Lean and all of quality improvement: we always want to catch and correct errors early so that they don't continue down the process. What I found, without having set out to look for it, was that there were profound differences in reporting climate across units, even in the same hospital. In one unit, people were speaking up and saying, “Let's check that. Is that the right dose?” In another, you had people hiding and sweeping things under the rug if they could get away with it, because it was so threatening interpersonally.

Reading your description of the book, that jumped out at me as somebody who's worked in healthcare a lot and cares about these issues of patient safety and reducing, preventing, and eliminating errors and harm. I had an opportunity to visit two hospitals in Japan, two different academic medical centers. One of them has gotten very direct coaching from Toyota executives because the hospital is not that far from Toyota City. They've been learning technical problem-solving methods, but they're also focusing on creating a culture of safety. The one physician champion for their whole effort very proudly showed a chart on screen of a dramatic increase in reported incidents.


I figured you'd react that way.

It's wonderful. Peter Senge called that general phenomenon, or at least I'll describe it as, the “worse before better” effect. If you're trying to tackle a thorny problem like patient safety, the first thing that has to happen is that people become more willing and more open to talking about what's not working. If you pull that first part off, then the data will look worse. Things aren't worse, but they will look worse. The fact that (a) that was happening, and (b) they recognized it, were good signs in their own right.

The other part of the story is that we can't apply these problem-solving skills we're learning if we're not openly identifying the problems to work on. When I started my career in manufacturing, I was in workplaces that were generally not psychologically safe for frontline workers or even myself as an engineer or an improvement person. That's one reason this is all personally interesting to me, but I'm curious, what's your definition of psychological safety? Has that definition evolved since your publication in 1999?

In '99, I defined it as a belief that the workplace is safe for interpersonal risk-taking like speaking up about a mistake, asking a question, asking for help, or offering an idea. Now, it's simpler and maybe it communicates more clearly to define psychological safety as a sense of felt permission for candor. That other definition sounds a little academic. Most people don't think a lot about interpersonal risk.

We intuitively are aware of it and will hold back if we feel threatened or anxious about what others might think of us. We're always going to err on the side of silence. Nobody got fired for silence, for example. Reframing it, or at least re-explaining it, as permission for candor speaks more directly to what I'm talking about. It's about believing that not only can I say what I'm thinking or worrying about, but it will be valued, and that's what people want me to do.

Leaders in an organization might tell people, “We want you to feel safe to speak up,” but that doesn't mean people feel it.

Leaders in organizations have good intentions, and I don't think they're lying when they say things like that. They may be unaware of, or blind to, the impact that aspects of their behavior or role have on others. They can also be blind to the residual beliefs that people bring in from other companies, other roles, or other jobs.

The past wounds and scars.

The problem is it is such an asymmetry. If I speak up and it's welcome, then I'll do it again, but if I speak up and I'm humiliated publicly in front of my colleagues, it could be a while before I do it again.

We can all picture, and have suffered through, examples where the humiliation was obvious and overt. Sometimes, it's more subtle. I'm curious about your reaction to a situation where a well-intended, kind, and thoughtful high-ranking executive in a healthcare organization, who's not known to the frontline staff, comes around to see a review of either a kaizen improvement or a project.

The frontline staff have spoken up and helped test and implement something. They're sharing what they've done, and the executives are asking questions that they might think of as humble inquiry, trying to learn about it, but it comes out as something like, “Why didn't you do such and such?” That ends up feeling humiliating.

That might be an okay question after three others, which would be, “Help me understand why you did this. How did you think about that?” That's asking about what they did do, expressing genuine interest in what you hear, and then, “Did you think about doing X?” Most of the time, none of us are particularly aware of why we didn't do the things we didn't do. The “Why didn't you?” focuses on the gap. The frame is, “Here's a gap I see. What's the matter with you?”

There are softer ways. You used slightly different language, like former Toyota mentors of mine would use when reviewing problem-solving or coaching. A favorite question of Toyota leaders would be, “Tell me three other things that you considered.” I could see that question, in a certain environment, as not threatening or dangerous. A lot of it has to do with context.

It's such a respectful question if you think about it because it assumes you considered other things which is proper practice. It expresses curiosity about what they were.

I'm thinking specifically of healthcare. I've seen organizations, whether they're drawing on lessons from aviation cockpits or other high-reliability organizations, teach, and not just lecture, people that they should speak up, and try to create an environment that supports it. They're trying to teach people constructive ways of speaking up. But there are sometimes events where somebody doesn't speak up and errors occur, and let's say a nurse gets blamed for some role in it. I've seen people get on a soapbox about how nurses have a “professional obligation to speak up,” and I don't think it's that easy.

This drives me crazy. To be generous, let's put it this way. You can agree with the statement, and in fact, I suspect both of us do agree, that one has a professional obligation to do such and such. Having that professional obligation doesn't mean it's possible to do it. A professional obligation is, in a sense, a values statement, but let's look at the efficacy of it. The efficacy question would be: to what extent is it possible, realistic, easy, and enabled for people to do it? In Lean, and in healthcare in general, where you get good outcomes is when there is both a felt obligation and a felt permission. One without the other is not enough.

“Why aren't people speaking up? Why aren't people coming forward with ideas?” That's a blaming, judgmental way of framing it, instead of pointing back at leadership and asking: what can leadership do to make it safe? That's what you're looking at.

The problem with blaming the nurse is that those with higher responsibility, and in some cases higher pay, should have a greater obligation to ensure that the climate is genuinely enabling voice. It's not right to blame those who didn't speak up. One has to look inward first and say, “What did I do to make it difficult?”

That's an uncomfortable thing for people to reflect on.

We're talking about work. I'm not talking about your next dinner party. You're at work and you have an obligation to have yourself be a little bit uncomfortable when you're in a leadership position. Otherwise, it's everyone else who has to be uncomfortable. That's not fair.

In terms of working toward creating a safer, or safe, environment, does this necessarily start right at the top of the organization? Otherwise, maybe middle managers are caught in the same trap as frontline staff, not feeling safe themselves.

It helps when all of us can look to the very top of our organization and see a role model who is curious and passionate about improvement, acknowledging his or her own failures and shortcomings, and owning them. When the company has a failure, which we all do at various times, hopefully not a big one, they take responsibility for it. That is good role modeling, and it has a wide and magnificent effect on others. That said, anyone anywhere who's leading a team, heading up a plant, or in charge of a restaurant in a chain can do things that make their little part of the world, their pocket, as psychologically safe as possible.

If you don't have one of those fantastic role models at the top, that doesn't mean your hands are tied in terms of making your proximal work environment as good and learning-oriented as it can be. If you do, you can still get pockets of tyrants here and there, either unbeknownst or known to executives. How the top acts is not going to explain all the variance, but it may explain some of it.

One other thing you write about, and there are parallels to language that Toyota uses, is humility and leading with humility as an important part of this type of culture. Can leaders become humbler, or is some of that ingrained? Once humility is lost, can it ever be regained?

It can. Maybe I'm an optimist on this point, but I like to think of humility not as a trait, but as a skill. If you pause to think about the pace of change now, the remarkable amount of knowledge and expertise that exists, of which each and every one of us can only master a tiny bit, it's a rational stance to be humble. It's a terribly irrational stance to be arrogant because none of us know everything. None of us has a crystal ball.

When one is confronted with reality in that way, even if I have a lot of confidence about what I know about this situation right here and right now, still stuff is coming at us. I've got to always be humble in the face of that uncertainty and that can be learned. People can be reminded to be humble in a way that isn't self-effacing and isn't to say, “I'm useless.” It's to say, “I have knowledge and experience. I am almost certainly missing something.” With that recognition comes a deep desire to keep filling in the gaps. This can be taught and it must be taught.

It seems like there's an opportunity, then, for follow-up coaching: for an executive coach to pull someone aside and point out slip-ups in their attempt to be humbler, if they've made a commitment to that.

Even just to let them know because oftentimes, coaching that says, “You're this and you're that. You messed up,” might be informative, but perhaps it's more powerful to say, “You couldn't possibly have seen the impact of that move in the same way I did because I'm sitting on the sidelines. It makes it easy. When you're in the thick of it, it's hard to see the impact.”

I can't see the impact I'm having right now on you or the readers, whereas someone who's on the sidelines reading can see it better. It's a gift to share with me the impact that I'm having that I didn't know I was having. For better or for worse, I can learn from the things that went well and I can learn from the things that didn't go well.

Maybe along the lines of learning from things that don't go well: in any field or approach, there are misunderstandings, and this happens in the context of Lean. Go into hospitals and you'll find people who have somehow developed misunderstandings about Lean. You write about this in the book, but what's the first or most important misunderstanding about psychological safety that comes to mind?

In retrospect, the most important misconception is that psychological safety is about being nice. I saw an online article claiming that the Google study said, “Better teams are ones where people are nice to each other.” No. I'm not against being nice, but the problem with that statement is that in organizational life, nice often means, “We will say to each other's face what we think each other wants to hear, and in the hallways or with others, I will say what I think.” In many ways, nice therefore means tiptoeing. Psychological safety is the opposite of tiptoeing.

It's being willing, and feeling that it's okay, and in fact expected, to err on the side of speaking up when I'm not sure. That's the biggest one. The second biggest is that psychological safety means we have to dial back on ambition, that we can't expect people to stretch and accomplish great things. Nothing could be further from the truth. In fact, I'm suggesting that only with psychological safety can we stretch and accomplish great things together.

You sketched that out very nicely in the book in a classic 2x2 matrix of aiming high and having psychological safety, as opposed to having psychological safety but not aiming high, and you called that comfort.

It's the comfort zone. I don't want to work in my comfort zone. There are days when I would love to work in my comfort zone, but most of the time, I feel much better about myself and my colleagues when we're getting challenging things done together.

You talked about this misunderstanding about being nice. I hear similar things when people talk about Toyota's principle of respect for people. What's been taught and emphasized to me is that being respectful means sometimes challenging people because you believe in them. That might not feel like being nice.

I wouldn't challenge you if I didn't respect you because I wouldn't expect to get much from it. The very act of challenging someone is an act of respect.

You said something that brought me back to some other language from the book, where you talked, within the context of psychological safety and respect, about there being a time and a place to sanction clear violations. I was wondering if you could talk about that a little bit.

It's somewhere between a misconception and something that we don't think about enough, which is odd. I will argue that it makes the workplace more psychologically safe, not less, when misconduct or way-outside-the-bounds behavior is treated harshly. If there are no negative repercussions for doing the wrong things, then people don't know where the boundaries are. If I know where the boundaries are, I can feel safer within those boundaries. If anything goes, then it's a bit of a walking-on-eggshells type of environment.

I think of Bob Sutton's work, The No Asshole Rule. Psychological safety shouldn't mean I feel empowered to be a raving asshole.

A total jerk. That's a stance that basically says, “The world revolves around me.” It's a stance of being inadequately aware of other people's feelings and experiences.

This comes back to the questions around not just humility, but empathy, a leader not being a narcissist.

It couldn't be more important. If you think about leadership as the activity of influencing others to do hard things for the greater good and you're a narcissistic jerk, that doesn't influence others to do anything but hide or the bare minimum.

I was wondering if you could talk a little bit about why it's important to make it safe to fail and in what context failure is acceptable, if not beneficial to an organization.

Failure is an encompassing term because failure technically includes such things as small mistakes. I failed to set my alarm clock the right way and I was late for work. That's a failure, but it's a mistake, a human error. All things being equal, we'd like to find ways to prevent those small preventable errors from happening. That's one category.

Way over on the other side of the spectrum are failures that happen when we engage in thoughtful experimentation, where we had a good hypothesis based on existing knowledge at the time, one we hoped would work, but it didn't. It failed. There was no way to get the new knowledge that we now have without doing it. That's what scientists have to do all the time.

Clearly, those are two very opposing meanings of the same term. The latter, the failures that happen when we are in new territory with hypothesis-driven experimentation, are mission critical to innovation and to creating new value. Organizations need more of that, and when people are reluctant to engage in any failure because the consequences seem so severe, they won't take those risks. Sometimes you get a management style that says, “If we're tough on people when things go wrong, then things won't go wrong.” But things won't go right either, in terms of innovation and new knowledge.

The first task here is to be discerning about the kinds of failures we want more of and then the kinds of failures that we'd all love to work together to avoid or the preventable failures that are in relatively known territory. Neither kind of failure should be subject to shame or blame. Both kinds of failure are worthy of diagnosis and learning as much as we possibly can from them. I'm still anti-shame and blame no matter what, but it's not proper to call preventable failures or mistakes good news for the organization. They're not.


There's a lot of talk about failure, and you're connecting it to innovation. In the Lean startup community, it's fail early and fail often before you try to scale. I've heard some people say this gets taken to an extreme, that people are fetishizing failure, and that there's a happy medium in that spectrum.

I don't think it's a happy medium. It's more a matter of discernment. What you want in the Lean startup space is a smart failure. It's plain stupid to have a failure that one could have done a tiny bit of research and known in advance would not work. I would call that a preventable failure, but the person might not feel it was preventable because they didn't even do their homework.

Number 1) Make sure it's a good hypothesis. Number 2) It needs to be the right size. What does the right size mean? It means it's just big enough to learn from, but no bigger, because then you get into wasteful territory. You don't bet the farm on some product that you don't have any real idea of whether it's going to work. You test it out with a smart pilot that's designed not to succeed but to break in the right places so that we can figure out how to make it better.

Is there a happy medium?

That's an important bit. It's not a happy medium. It's about discernment. It's about being smart and thoughtful in advance. It's not, “Is more failure good or bad? Find the right sweet spot on that spectrum.” It's about what kind of failure it is, what homework you did, and what size you made it.

That broader message came through. You're talking about failure to do your research. There was a phase where American Airlines redesigned the look of their signs at the gate, and it was all very fashionable. The text was very thin and much harder to read. It looked prettier, but it was harder to read compared to the old version. Somebody must have eventually given enough feedback and said, “I can't read the signs at the gates anymore.” I wrote about it and said, “The ability to iterate isn't an excuse not to do a good sign upfront,” which is what I hear you saying.

Spot on. I don't know how many hundreds of thousands of dollars that mistake cost, to roll out all that new font and then roll it back. Ten people in a laboratory setting could have told you, “That's not going to work.”

They might not have been in an environment high in psychological safety.

That's true. We hired this expensive designer who says, “Here's the best new thing.” “I'm not a designer, so I'll lay low.”

Two other questions before we wrap up. I'm curious if there's a difference in the response that you get if you're teaching more seasoned executives versus a classroom with younger MBA students, in terms of their reaction to the need for psychological safety and the role of leaders in creating that.

Not really. The one difference I do see is that the MBA students, the younger students, are immediately worrying about, “Yes, but I'm not the boss. Now that you've told me this matters, how do I get it for myself?” They aren't thinking about the people who will be reporting to them as much as they're thinking about the layers above them, whereas the senior executives are a bimodal group. Some of them are just, “This is interesting.”

Some of them are experiencing an a-ha moment. There's this awareness that this is a risk factor for them and for their firms. There's a wow. There's this assumption that they know what's going on, and then they're confronted with the thought, not a surprising thought, that they might not know what's going on, because the very power of the office might lead people to hold back.

It's interesting they frame it as a risk to the firm as opposed to an epiphany. “This is how I should be treating people.” They're framing it in terms of, “We could fail or we're not going to be as successful.”

That's okay. It's not about them as people. It's about their ability to lead an organization to create value, which is a pretty challenging activity if you think about it.

It's probably helpful to frame it in those terms instead of saying, “This is how it should be,” and pointing to examples like Google or other companies to show, “Here's the data or the evidence.” Last question: since The Fearless Organization was published, what's the most interesting or surprising thing that you've learned about all of this, something you could go back and magically insert into the book?

It's been such a journey. When I wrote the book, I felt the book was overdue. It was time to get this out there in an accessible format. Since it has come out, I've had so many conversations with people like you that help me make connections that enrich my own thinking. The thing about a book is it ends. You have to finish writing it and it has to get out there, but the ideas keep moving forward.


At the end of the book, in the last two chapters, I give practical suggestions for what you can do to make your organization or your team better. Now, my idea of practical suggestions needs to be made even more concrete and maybe more context-specific and industry-specific. As a result of this journey, I now see an opportunity or a need for more workbooks and playbooks like, “Where do you begin? How do you get started? What do you do next?”


Thank you for sharing your reflections on that. I like the book very much and I'm excited we had the chance to talk. I encourage everyone to go find it. It is The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth. These are important challenges and important things to be working on. Any final thoughts that you'd like to share, Amy?

We've been having such a wonderful conceptual conversation. The book is full of stories, and that's what makes it readable as opposed to merely academic. There are 25 case studies in it of companies that have either tragically lacked psychological safety or worked hard to put it in place. It's the stories that make it come alive.

Thank you for those stories and for the book. Amy, thank you so much for taking the time to be a guest here.

It's my pleasure. Thanks for having me.



Mark Graban
Mark Graban is an internationally-recognized consultant, author, and professional speaker, and podcaster with experience in healthcare, manufacturing, and startups. Mark's new book is The Mistakes That Make Us: Cultivating a Culture of Learning and Innovation. He is also the author of Measures of Success: React Less, Lead Better, Improve More, the Shingo Award-winning books Lean Hospitals and Healthcare Kaizen, and the anthology Practicing Lean. Mark is also a Senior Advisor to the technology company KaiNexus.

