Failure is the secret to success. Our failures are agents of inspiration, iteration, and innovation.
Dr. Amy Edmondson is a Professor of Leadership and Management at Harvard Business School, and her newest book, "Right Kind of Wrong," is all about how we can change our relationship with failure and start "failing well."
This is…A Bit of Optimism.
For more on Dr. Amy Edmondson and her work check out:
https://www.amazon.com/Right-Kind-Wrong-Science-Failing/dp/1982195061
https://www.hbs.edu/faculty/Pages/profile.aspx?facId=6451
https://www.linkedin.com/in/amycedmondson/
Dr. Amy Edmondson wants you to fail well... kind of. She wants you to learn how to fail well. That's the subject of her new book, Right Kind of Wrong: The Science of Failing Well. Amy is a professor of leadership at Harvard Business School, and her groundbreaking work on the concept of team psychological safety has changed the way we look at workplace culture. So we sat down to talk about why we fear failure so much, what it means to fail, and how failing might actually be the secret ingredient to success. This is A Bit of Optimism. Amy Edmondson, thank you for joining me. I'm actually very, very excited to talk to you because you study a subject that I think plagues many people, in fact, I dare say everyone, which is the fear of failure. What I find fascinating about your work, and I see this in my work as well, is that we are constantly bombarded with conflicting instructions about failure, like we must never fail, versus the rise, especially in the tech world, of failing fast. So how do I reconcile, how do we reconcile, the fact that I'm never supposed to do something that I'm also supposed to do often?
First of all, thank you, thank you so much for having me, and thank you for being interested in this subject. It's a very human subject. As you said, I think this is something that plagues all of us, and oddly, I think there is an answer. I mean, I think there is an answer to helping us reconcile those competing narratives and those very real tensions. And the answer lies in the recognition that not all failures are alike. Indeed, there are failures we should absolutely work hard not to have. Some of them are even embarrassing, others even shameful. Let's do our very best to not have so many of those in our lives. But there are also failures that are genuinely productive, genuinely good, because they bring us new information. They allow us to progress into new domains in ways that we just couldn't without them. So when we make those distinctions, then I think navigating this fraught territory becomes a little more possible.
One of the struggles I always had with the word failure is semantic. It's like cancer. If you have a mild melanoma or you have stage four liver cancer, those things are both called cancer, but they're clearly not the same thing. The word failure is the same: we have one word for things that are very, very different, and it seems we need different words so we can distinguish what we have to avoid from what we want more of.
Some words that we could play with in terms of making distinctions, for example: mistake, right? Mistake is a word that I think has a very specific meaning. It means there was a right way and we didn't do it, you know, by mistake. Sometimes that's just a slip; sometimes we don't yet have the full training to do it well. It's a kind of failure, but a particular subset of failure. At the very other end of the spectrum is discovery, which is another great word. Like, who wouldn't want a discovery? And some discoveries are successes, but many discoveries are failures. We discover the path that didn't work in our efforts to discover the path that will work.
I had the opportunity to visit 3M in Minneapolis, and 3M does everything, I mean everything from medical equipment to the coating that goes on the covers of the headlamps on your car so they don't get scratched when a rock hits them. And they invented the Post-it. They have an amazing view on failure, which is they don't believe in it. Failure is something that goes in the trash. What they talk about instead is something that didn't meet the brief. So they were told, for example, to make a strong adhesive, but they accidentally made a weak one. That's not a failure; it just didn't meet the brief. And the way they deal with it is they're very public about it. They talk about what they made by accident, because somebody else might need what they have. And this idea of sharing the accidents, the mistakes, the things that, as you pointed out, didn't go according to plan, makes them perhaps one of the most innovative companies in the world.
Absolutely, and the Post-it note, of course, is an oft-told story and a great one, and maybe not everyone knows the fuller story. One of the things that I think is so surprising about that story is how many years it unfolded over. This was not, ooh, here's a thing that didn't work the way I wanted, and then suddenly a week later I got it all sorted out. This happened over decades, well, a decade anyway. And this is what makes 3M so special, right, and what I think so many organizations should emulate: there was no stigma to walking around and telling everybody about this failed experiment. Like, I've got this thing, I don't know what to do with it, but it strikes me that it must have potential for something. You know, there must be a pony in here somewhere. And then along comes Arthur Fry, who has his famous eureka moment when his bookmark falls out of his choir book in choir practice, and he's like, gosh, wouldn't it be nice to have a vaguely sticky bookmark that could easily be peeled off without any harm to the pages? And you know, that's not the end of the story. That's sort of the beginning of chapter two of the story, ultimately the development of this billion-dollar product. But it's their very enthusiasm about the wrong turns that allows them to successfully take it through to the ultimate, happy ending.
So what is it that prevents other organizations that claim they're innovative? Because I've never met an organization that says they're not innovative. They all think they're innovative, but the reality is they're not. What is it systemically, what is it in their culture, that makes them unable to embrace the failures that Spencer embraced?
Well, I'm going to start with human nature. We are more or less hardwired to prefer success and looking good to failure and looking like we missed the boat. We have this very strong desire to get approval from our colleagues, and we've probably evolved to get that more from successful moves than failed moves. That's reinforced in our schooling and our upbringing, where, again, getting the right answer is rewarded in school and getting the wrong answer isn't. And then, finally, and maybe most directly, organizations are usually set up to reinforce both of those factors. But what 3M has realized is that human nature, socialization, and most organizational systems are not set up to enable innovation. So what do they do? They go out of their way to design something weird and different. They go out of their way to say, we're going to create systems and structures and rewards that override that default state.
Well, as I think out loud about this, you know, I think confidence plays in here. When I think about the people I admire for their innovative spirit, they make plenty of mistakes, but they seem to own them with confidence. The confidence shines through. You know, when something goes wrong, they're sort of like, yep, screwed that up, and it's sort of inspiring that they're relaxed about the screw-up. And that confidence, I think, is contagious, and good leadership encourages us to be confident in our failure. I've had this conversation with other people before, which is, it's about owning our insecurities rather than avoiding them or hiding them. So, for example, I'm chronically disorganized, and I don't sort of go, oh, yeah, really sorry, I'm really disorganized. I go, yeah, I'm just completely disorganized. I'm going to need somebody else's help here, because if you leave it to me, I'm going to completely screw it up. And just by the confidence of owning it, people come and help.
You know, I have to say, you and I would not be a good team in that sense, because I have the exact same shortcoming. I mean, I love the ideas, I love the concepts, but then, when it comes to organizing, I don't think I could organize my way out of this room to make my way downstairs and get some lunch. Right? I could, and I will.
Fortunately, the fear of dying of starvation will drive you to find it, I'm confident.
Let me back up and say I love the point about confidence, because it's a kind of inner confidence. Maybe it's an inner confidence to say, okay, I made a mistake; I am not a mistake. I have shortcomings, but I am not coming up short. You know, I'm a fallible human being on this planet. And in a way, that's part of what I try to convey and write about: each and every one of us, like it or not, is a fallible human being. Whether or not we can admit it, acknowledge it, be okay with it, that's where the variance comes in. And I think you're absolutely right. The people who are able to say, in various ways, I'm a fallible human being, but I'm going to try again, and it's going to be okay, and thanks for coming along with me. Those are the people we want to follow.
It goes back to your previous work, which is the concept of psychological safety, which I believe you coined. May we all bow down to you.
Please don't. I didn't actually coin it, you know; I get too much credit for this. The term psychological safety was in the literature already. Psychological safety is that belief that when I take a risk around here, it won't prove fatal. What I did coin, in an academic paper, was the term team psychological safety, because my little discovery was that teams varied in terms of their psychological safety, right? And that was kind of interesting, because these were teams in the same company, a company, let's say, with a famously strong corporate culture, and yet still there's this real difference in interpersonal climate across them. So I thought, this isn't just an individual thing or just an organizational thing. It's very much a group thing, an emergent property of a group.
Our work is symbiotic. I was looking to understand trust, and we usually think of people as trustworthy or not trustworthy, and my work led me down the path where I called it a circle of safety. It's that when there is psychological safety in the group, people are more trustworthy, and when there is a lack of psychological safety in the group, people are less trustworthy, because they're forced to think about themselves; they can't trust the group or their leaders to look out for them.
And that's a more uncomfortable, more painful, less happy place to be, when you're sort of stuck in that self mode rather than the us mode. It's so much more fun to be part of a we that is trying to do cool, exciting stuff.
But it goes back to what we were saying a few minutes ago, which is that team psychological safety has to be offered, and that is primarily the responsibility of leaders. It doesn't have to come from the person with the authority, but there has to be somebody who offers team psychological safety. And I think they are the ones, kind of like a parent, who when something goes wrong, instead of saying, what have you done, say, what has happened? As you said, it's not about you or the problem; there is a problem, we have a problem.
Yeah. And the question, what happened, is such a powerful question, such an important question, and it's foundational to good safety practices, meaning worker safety, patient safety practices, where, in order to learn as much as you can from anything that goes wrong, the right question you're taught to ask is what happened. You know, not who did it, which is the instinctive question, who's to blame. It's what happened.
I visited one of Barry-Wehmiller's factories. It's a company I've written about a lot, a remarkable organization, and I visited one of their factories in Phillips, Wisconsin. It's a factory; they make stuff, and like many factories, they're obsessed with safety, because it is a dangerous environment where people can lose fingers and limbs. And what I found astonishing was that nowhere in the factory were there huge signs that said safety first, because I've been to other factories and there are signs everywhere. The data is sort of amazing: when you put a sign up, preoccupation with safety goes up about five percent. But when it's cultural, and people are looking out for each other, and there's strong psychological safety, it goes up to something like seventy percent. In other words, the sign doesn't make people safer. The sign simply covers your ass if there's a lawsuit because something went wrong. But this idea that we look after each other not only produces great safety; I have to believe it also helps me overcome the fear of failure, the fear of trying something new, because the danger doesn't only come from a machine. There's fear of being humiliated, fear of being fired, all of these things. Psychological safety has far and wide-reaching impact, not just keeping our people safe in a dangerous environment, but also feeding that desire to raise your hand and say, I think we have a better way of doing this. I have an idea.
That's all music to my ears. And you're bringing up something that is really important, and we should probably talk about it more: context. Context matters. The more we get into a production setting, the more we really do want people to kind of stay within the lines. And the big interpersonal risk we want them to take is that risk of speaking up quickly and immediately when they see or hear or notice someone doing something that, for example, might not be safe for them or others.
You're raising a nuance that I think is extremely important to double-click on, and I've seen this happen in teams that I've led, where I encourage risk taking or encourage experimentation, but then somebody experiments or does something without telling anybody, and it goes haywire, and it was easily preventable. And when you say, we could have prevented that, they say, well, you told me to experiment; you told me this is a good thing to do. And I think the nuance there is, absolutely, but we are a team. The psychological safety is team psychological safety. You have a responsible freedom to experiment, which means you should probably let somebody know. I mean, I do this. I never make decisions, big or small, without at least bouncing the idea off somebody, even if it just takes a couple of seconds, a couple of minutes. Hey, I'm about to do this. What do you think? In case they see something or know something that I don't. So the idea is: absolutely encourage experimenting and risk taking, but not by yourself in a vacuum. That's actually irresponsible.
Exactly, exactly. I mean, decision making with respect to anything consequential is a team sport, because there's always going to be some perspective or a bit of information you just didn't have access to that could change the decision or change your point of view about it.
I think that's right, and I would add one more thing. You talked about uncertainty, you talked about potential impact, but I think the other one is when you're dealing with human emotions. And I think the nuance here is about process, which is, you don't always need to seek authority from somebody higher up in the pecking order before you do anything; that would be disastrous for an organization. I'm talking about just leaning over to the person next to you and saying, can I just read you this email I'm about to send to somebody? I want to make sure I'm not being insensitive. It's those little check-ins, which opens a whole Pandora's box here about the value of in-person work environments versus distributed work environments: you have those people around you for checking in. When we're a distributed workforce, you have to call someone or set up a Zoom. It becomes more formal, and so we don't do it.
You're not going to do it. You're just not going to do it, right? I mean, I think that phenomenon you just identified has been so underplayed or under-emphasized the last couple of years.
Do you have any data, or maybe it's just instinct based on your work, on the value of an in-person office environment versus a distributed office environment when it comes to fear, failure, failure tolerance, mistake making, all of the things we've been discussing? Have you found any data that shows a difference?
There are preliminary data that suggest psychological safety takes a hit when we're remote. I don't think the story is fully told or described; this is sort of a work in progress. But intuitively, logically, we have put up a higher bar. I mean, you just mentioned that in an in-person situation, I can turn to my left and turn to my right and just say, what do you think, you know? It's just so easy. We will do it naturally. And as you also said, if I had to set up a Zoom to say, hey, Simon, how was your weekend? And oh, by the way... you know, I'm not going to do it. That would feel intrusive and wrong. And so that means I have less ability to just easily speak up and connect with you and learn from you and help you learn from me. It's just harder. That doesn't mean we can't do it, but it means we're going to have to be, oddly, more heavy-handed. We're going to have to put in systems or routines or rituals, ways that we deliberately lower the hurdle that just got put higher.
If we follow that logic, by being at home I'm not seeking little reinforcements for the decisions I'm making, which could lead to increased levels of fear to experiment or make decisions, and increased levels of fear lead to increased levels of insecurity, anxiety, and all the other mental fitness challenges that we're dealing with. We're making an argument here that those little blocks and tackles that help alleviate fear and promote mistake making and promote good failure have been diluted in a distributed workforce. So personally, we're feeling the impact of increased fear of failure, but organizations are suffering because there's less experimentation. Is that a fair logic to describe it?
I think it is, you know, and it's consistent with, I don't know if you saw, that study that Microsoft did partway through the pandemic, where they had just such an enormous data set on their own employees. What they found was that we're still collaborating. Of course, you and I are doing that right now from two different locations. But the nature of the collaborative relationships had really shifted. The likelihood of reaching out to people, what the sociologists would call weak ties, people that you know less well, people from other departments, divisions, locations, et cetera, went way down. So we were still teaming up with people, but we tended to team up more with people who are like us, you know, who had more similar backgrounds and jobs to ours, than before. And so, you know, I think that's consistent with what we're talking about, which is the hurdle to horizontal teaming went up.
So in a distributed workforce, strong ties remain the same or maybe even get better because you talk to them all the time and they're strong ties.
Hey, you're lonely, you want to reach out to those people you already know. Like, right, hey, how are you doing over there, right?
Exactly, exactly. And the weak ties either remain the same or decline, which means they get worse, maybe even disappear. And what creates innovation is diverse thought and diverse perspective. It's ironic, right? When we talk about diversity and inclusion, a distributed workforce actually restricts diversity and inclusion, because we're looking for people like us, people we know well, people we've worked with for a while.
Exactly. We are at far more risk of going to like-minded, like-backgrounded others than before.
Can you give a specific example of a company that you have researched or learned about that you admire, that does failure well that you wish your own organization could be more like?
Well, you already mentioned 3M, and I really do love the 3M model, particularly because it's so overdetermined. It's not one thing; it's the many things that create this environment of learning and innovation.
Just so people know what we're talking about: one of the things that 3M does is they have, I think it's a biannual event, I can't remember exactly, where it's not required, it's voluntary, but almost all the engineers go, and they basically stand on the stage and say, this is what I'm working on and it worked; this is what I'm working on and it didn't work. So they all share their successes and failures with each other in case somebody needs what they're working on, and it's just celebrated. It's just normal and part of the culture. It's kind of incredible and magical. It really is.
And it's, you know, it's that, plus you get some portion of your time to just play around and experiment with things, and even resources for that. It's the facilities they have that encourage interaction, and all the rest. So that's one. On the other end of the spectrum, in that it's not all about innovation, it's more about efficiency and excellence, is the Toyota Production System, which I'm also a genuine fan of, because they have engineered learning into production. Instead of just, do the thing, here's how you do it, do it perfectly, and, you know, shut up, keep your head down, it's: here's what we're thinking, here's our hypothesis about how to make the best Corolla, but you know what, we really need you to tell us where it's falling short, or even where it might be coming close to potentially someday falling short. Right? We want your ideas, and in fact, we don't want to just say that; we want to engineer into the system a way to operationalize that. We'll call it the andon cord. We want you to pull it, right, anytime you have a hypothesis about a concern or an improvement opportunity. And like 3M, in such a different context, they're overriding human instinct, socialization, habits, and saying, can you just be a learner? Can we do this together?
And Americans... and no offense to Harvard, where I know you have a deep association, where you teach; it was a Harvard professor, but we will leave that aside... who brought the Toyota Way to the United States made the single biggest mistake an American could make: they called it lean. Because, to your point, it's not about efficiency, and when we Americans took the Toyota Way, we made it all about efficiency. But it was not about efficiency. It was about constant improvement; as you said, it was about a learning behavior, and it was about seeking ideas from the front line, the people who are actually doing the work, to inform systems, not to enforce systems.
That's right. Again, that's where sociology, I guess, and habits and culture can override even the best possible design. I really do believe the essence of the Toyota Production System was about empowering everyone to think more like scientists. And it shows such a deep respect for the humans that they hire. Here you are, come on in; yes, they're doing a production job in a factory, but there's deep respect, right, deep and abiding respect for your potential to help us be better.
And I know these are hypothetical examples, but they could have happened, because it's so detailed, the constant improvement, the desire for constant improvement, that somebody who installs the bumper on the front of a car can make the request: if you could please put the rack of bumpers just one foot to the left, I can grab them without having to take a step, and I can save one second for each bumper. And Toyota will do that. They'll move the whole thing one foot to the left so that somebody doesn't have to take a step. And what they discover is one second plus one second plus one second plus one second, all these micro-improvements that everyone's empowered to make, and before you know it, at the end of a couple of years, you can produce a car fifteen to twenty minutes quicker than your competition.
So astonishing, and then at the end of a decade you've got you know, millions more in the bank as a result.
We've talked about what fear looks like in a group and how psychological safety is imperative for people to feel confident to try new things. What if you're the person responsible for creating psychological safety and you have fear? Like, how do we overcome our own personal fear of humiliation, fear of getting fired? By the way, sometimes it's imagined and sometimes it's real, you know, depending on the culture of the company.
You might not like this response, but I think it's deeper than it might first appear. So you own it. You own it out loud, because that levels the playing field again. Right? You sort of say, gosh, I might not get this right; I'm a little concerned about what I'm about to do. And suddenly you've got supporters, you know, helpers rather than antagonists. You just have to recognize that the relationship is inherently an antagonistic one until you make it otherwise. Until you make them part of your team, and you're a part of their team, it won't be thus.
What you're talking about is leading by example, right? Which is, if the leader expresses their fear, then the likelihood that the people on the team express their fear goes up, and we can all help each other.
Yeah. Ed Catmull, at another company I admire, Pixar, says, you know, if we as leaders can go first in acknowledging our own mistakes, then it makes it easier for others to do likewise.
I heard Lionel Richie tell a fantastic story. As a young performer, he was crippled by stage fright, crippling stage fright, and yet look at him today, you know, this remarkable performer. And he says, the difference between the hero and the coward is not the presence or absence of fear. They both feel fear. The difference is one steps forward and the other steps back.
It's exactly right. It's exactly right. I often say that psychological safety and courage are two halves of the same coin.
So, a practical question. A smaller organization that, unlike 3M, is not able to spend the time or resources to allow people to, quote unquote, play: what are some practical ways that smaller organizations can learn to be more failure tolerant and allow failure to become an essential part of their process?
I think the answer to that is in the essence of a smart failure, an intelligent failure: you do smart experiments, not stupid ones. So you think carefully, you know, deeply, about what you've talked about a lot, the why. Why does our company exist? What does it exist to do? What's the value proposition? What value are we offering to our customers, clients, society, what have you? And then, okay, what do we have to do really well today, tomorrow, and into the future to keep offering that value? What are we missing right now? And then, well, we don't know how to do that. What, you know, smart, right-sized experiment can we run to test a possibility?
I mean, I want to push back a little bit there, though, because it goes right back to the beginning of this conversation, which is that the fear is born out of uncertainty. And you know, the devil we know is better than the devil we don't, and uncertainty is much scarier than just staying the course, even if we know we could do better if we took the risk, and even if doing nothing might result in our demise. At least there's some predictability to staying the course.
It's true.
It's true, and this happens in big companies as well, with somebody's personal career. I've been doing the same thing for thirty years, I'm really good at this, and now you're telling me, because of technology, I have to completely change the way I think about my business? It's not going to happen. It goes right back to uncertainty.
Yeah. And so what do we do with uncertainty? First of all, let's name it, right? Let's put it out there so it's front and center, it's in the room. It's not something we don't talk about; it's the opposite, it's something we talk about all the time. And then, how do we put our minds and backgrounds and expertise together to come up with the most promising tests to run, experiments to run? It's not a spray-and-pray idea; it's, what is really a worthy thing to pilot to figure out whether it could work? And let's keep in mind that we might be wrong, but if we're going to be wrong, let's be wrong at the right size.
Thank you for the plug for the why. The whole idea behind the why was that it serves as a filter and a focusing device: when we know why the company exists, and we know the pattern that exists when we operate at our natural best, it basically creates edges to a sandbox. Do anything you want to advance this thing, but it's got to be within the sandbox. In other words, it restricts shiny object syndrome, which is, I read an article, I saw another company doing this, we're going to try that. If it advances the greater good, advances the cause that you've established, try it. But anything outside of that, no matter how good the idea, you don't do it. So what the why does is put edges on the sandbox of experimentation.
Yeah, I love shiny object syndrome. That's exactly right. Put the edges on the sandbox, which has two meanings. One is, you know, here's where we experiment because it's in our wheelhouse, and here's where we don't. But it can also be the boundaries beyond which you don't go, because it could be unsafe or what have you. So yes, we humans are much happier to experiment within a domain, because if anything goes, then nothing goes; I'm not even going to try.
Two more thoughts before we wind down. One is, what happens to people as they become more senior in an organization, as they work their way up through the organization? Does fear go up or down?
Well, you know, it's not an easy question to answer, but there is a way in which it goes up, or at least, you know, the fear of looking bad goes up. The more you've achieved, the more people are looking at you, the more the stakes can feel truly high, and the more risk-averse you can become. And that is ironic and problematic in so many ways. The higher you go, the more people know you; in theory, the more power you have; in practice, the more limitations you may feel.
So how do we combat that?
A, name it, and then B, make it a collaborative problem to solve. Name it as a risk that could harm our company and its success into the future, and then invite it to be one of the things that the team will wrestle with. It's almost like having spotters at the ready: yeah, since I recognize that could be a risk for you, I'm here for you, and you are here for me, to kind of encourage us to keep on the path of learning and growing and risk taking for the greater good.
Can you tell me something that you absolutely loved being a part of, something that, if every project or thing you worked on were exactly like it, you'd be the happiest person alive?
Yes, you know, I'm lucky enough to say I can think of several.
So, this is embarrassing, but in a conversation about failure, right in the middle of the conversation, our whole system just shut down. So this is actually the end of the episode. But we felt it was quite poetic, so we decided, instead of editing out the failure, we would just leave it in, because that's what Amy told us: we have to be honest about our shortcomings. If you'd like to learn about being honest and other human skills, check out simonsinek.com for more. Until then, take care of yourself, take care of each other.