Punishing mistakes can be risky business

When you realize you’ve made a mistake or that you are heading toward one, how do you react?

My work with organizations often incorporates concepts and tools with roots in action research and positive psychology. Action research is a methodology for bringing the power of the scientific method and collaborative learning to the workplace. Positive psychology explores how we can strengthen the psychological foundations of success rather than how we can repair those foundations when they have failed. Both disciplines are brought to life through cycles of proactive experimentation and reflection. Experimentation within these disciplines entails tolerance for mistakes. In fact, mistakes are often welcomed because they can be analyzed and converted into insights pointing to growth opportunities.

If you are a leader, are you creating an environment in which people below and around you can make mistakes without fear of punishment?

Executives frequently lament that their people don’t take the initiative to solve problems, make decisions or take risks on their own. When I hear this, I usually try to find out more about how mistakes are handled in the organization. In any organization, some people are more open to proactive experimentation, while others prefer to stick to perfecting the execution of things they already know how to do. Still, while it is true that experimentation doesn’t come naturally to all people, it is also true that some environments are more conducive to experimentation than others.

“Mono wa tameshi” 「物は試し」

This is one of my favorite Japanese phrases and the equivalent phrase in English might be:  “You never know until you try.”  In fact, experimental action is often the quickest path to learning.  In a world where change is the only constant, it is impossible to predict and control all factors before you start doing something.  This means that nearly all proactive decision-making and problem solving entails some degree of experimentation. The treatment of those who do choose to proactively experiment can have profound implications for the growth prospects of an organization.

‘Mistake’ is just one way to label the outcome of an experiment that did not produce the expected result.

If we allow people to proactively experiment, some of those experiments will lead to disappointing results. Often we don’t know if something will work until we give it a try. There are even times when we can’t figure out what problem we are working on until after we have started taking action to deal with the symptoms.

There are bad mistakes, but there are also good ones.

Mistakes can be costly or even disastrous, so it is only natural that we generally prefer to avoid them. It goes without saying that when someone does something with the conscious intent to do harm, this is not the kind of mistake we want to welcome. I’m also not saying that we should celebrate if someone working in a dangerous environment deliberately violates safety protocols or compliance rules. I think we can probably agree, though, that the examples above are not mistakes arising from proactive experimentation, but rather from ill intent, carelessness, systemic failure, or some other contextual cause. Without question, we should proactively look for ways to ensure that these kinds of mistakes don’t happen in the first place.

Somewhat paradoxically, though, the solution to the above problems is probably found through some form of proactive experimentation, and that proactive experimentation might also produce some mistakes on the way to finding a solution. The mistakes coming from this kind of proactive experimentation fit into a totally different category, and they should not be viewed in the same way as those resulting from carelessness. It makes sense to limit the scale and impact of mistakes even when they occur as part of the search for better ways to do things, but we shouldn’t conflate errors resulting from proactive experimentation with those coming from the lack of it.

One of the main factors determining whether or not people take the risk of trying something new seems to be how the organization treats the experimenters when their experiments don’t go as planned. We need to make sure the steps we take to mitigate risk do not inadvertently cause people to shy away from experimentation. If those who make mistakes are ostracized, fired or punished in more subtle ways, employees with even a modicum of self-interest will probably stop taking the initiative to try new things. In fact, in extreme cases, if we handle mistakes in a heavy-handed manner, we might even end up creating stress that leads to more rather than fewer mistakes. The pressure to avoid mistakes can induce a strange variation on a psychological phenomenon known as the “scarcity effect.” It is possible to become so obsessed with avoiding mistakes that you develop a form of tunnel vision that robs you of your ability to anticipate factors that might actually lead to even bigger mistakes.

Mistakes + Reflection = Insights for Long-Term Growth

Rather than blaming those who make mistakes, we might consider adopting something like Martin Seligman’s ABCDE model (or another reflective model) to understand the mistake itself. (ABCDE stands for “Adversity, Beliefs, Consequences, Disputation, Energization.”) The ABCDE model was originally developed to help people apply critical thinking to break out of irrational cycles of negative thinking when they experience adversity; however, a similar approach can be used to shift our focus away from the person who has made a mistake to the context in which the mistake occurred. ABCDE renders mistakes (which can definitely feel like a kind of adversity) less emotionally threatening.

In many cases, once we have developed a better description of the context in which a mistake occurred, we can use what we have learned to formulate a new experiment that avoids the same error next time around. A few years ago, my colleagues and I came up with a very simple (and not particularly original) model that we use to help clients develop a habit of relatively painless reflection and scenario planning. In a CIAO exercise, you map out Contextual Factors, Intended Outcomes, Action Options, and Actual Outcomes. You can use CIAO analysis for collaborative reflection to confirm whether you got the outcomes you intended to get and to analyze what led to your success or failure. You can also use CIAO as a guide for collaborative planning to be sure you and your colleagues have aligned on intended outcomes, key contextual factors and action options before you move forward with any particular action. CIAO allows you to consider mistakes holistically rather than falling into the trap of just figuring out who should be held accountable for them. Reflective models such as ABCDE and CIAO serve as a constructive alternative to the natural instinct to hide mistakes, wallow in a sense of failure or search for someone else to blame.

Welcoming mistakes is not natural or easy.

Engendering fear of mistakes may be more natural than creating environments in which people can learn from them. In an HBR article, Edgar Schein once remarked that people learn based on two forms of anxiety: learning anxiety and survival anxiety. The two levers that induce people to learn are 1) increasing fear for one’s own survival and 2) decreasing fear in relation to learning/change. He noted that most organizations unfortunately focus much more on increasing fear for survival when they might actually get more impact out of decreasing fear of learning through experimentation with new ideas and approaches.

This is unfortunate because when we focus too much on avoiding mistakes, we miss out on a great source of insights that could drive future performance. We might even be inadvertently increasing the long-term risk of our own obsolescence and eventual extinction.

Intelligent mistakes are the grist for the mill of reflection that fuels long-term growth. 

You, your organization and your clients can benefit in unexpected ways from the insights gained from the intelligent mistakes that come with proactive experimentation.

What might you and your organization gain from more intelligent experimentation?

What could you do to mitigate the risks that come from intelligent experimentation?  

How could you alter your work environment to reduce people’s fear of the inevitable mistakes they’ll make for the sake of learning, change and increased performance at a later date?

Additional Resources:

On mistakes and experimentation:

For an exploration of the surprising efficiency of learning through trial and error, see: TED Talk – Tim Harford: “Trial and Error and the God Complex”

For a presentation of how X, a world leader in innovation, integrates experimentation and learning from mistakes into its development process, see: TED Talk – Astro Teller: “The Unexpected Benefits of Celebrating Failure”

TED Talk – Kathryn Schulz: “On Being Wrong”

On the relationship between the scarcity effect and fear of mistakes:

Hidden Brain, August 5, 2019:  Tunnel Vision, npr.org, hosted by Shankar Vedantam

For an exploration of how excessive emphasis on immediate results can lead to a form of perfection that undermines growth, see interview with Thomas Curran on Ted Radio Hour: https://www.npr.org/transcripts/825910251

For more information on ABCDE and explanatory style see:

“Can optimism be learned?”, Short Wave on npr.org: https://www.npr.org/transcripts/847562952

Karen Reivich, PhD and Andrew Shatte, PhD:  The Resilience Factor

Martin E. P. Seligman, PhD:  The Optimistic Child

For a series of interviews on Ted Radio Hour exploring risk, risk mitigation, and the weighing of risk against other factors, see the November 8, 2019 edition of Ted Radio Hour:

https://www.npr.org/programs/ted-radio-hour/archive

Psychologist Gabriele Oettingen has developed a model called WOOP (Wish, Outcome, Obstacle, Plan) that can be used to make consideration of potential reasons for failure a part of one’s formula for success. For information on WOOP, see:

https://www.npr.org/2020/08/21/904680577/you-2-0-woop-woop

For more information on CIAO, just send me an email. You can also find information on similar models by reading almost anything written by scholars in the field of organizational learning, such as Chris Argyris.

For recent research on how reflection plus repetition bridges the gap between failure and success, see:

https://news.uchicago.edu/story/data-science-predicts-which-failures-will-ultimately-lead-success

For an interview with Sabrina Cohen-Hatton, Chief Fire Officer of West Sussex Fire and Rescue Service (and author of “The Heat of the Moment: Life and Death Decision-Making from a Firefighter”), in which she explains the use of a model similar to CIAO for better field decision-making by firefighters, see: Freakonomics Radio, How to Save $32 Million in One Hour (Ep. 397)

The following article by Mirjam Neelen & Paul A. Kirschner outlines how using cognitive disciplines such as attribution theory and self-efficacy theory can support the maintenance of a growth mindset and willingness to experiment:

Goodbye Growth Mindset, Hello Efficacy and Attribution Theory

For information on natural experiments, an approach to economics research that emphasizes collecting data from real-world experimentation, see: “What Coronavirus Researchers Can Learn from Economists,” by Anupam B. Jena, nytimes.com, 6/30/2020

Gary Hamel provides a nice introduction to the principles of experimentation at:

https://www.garyhamel.com/video/why-experimentation-beats-prediction

In 2023, Freakonomics Radio did a four-part series on how to make the most of mistakes. Here is a link to the first episode. One of the main guests in the series is Amy Edmondson, who is credited with popularizing the concept of psychological safety and is a strong proponent of creating cultures in which people are able to take risks, openly acknowledge failure and convert that failure into learning that can be used to improve future performance.

How to Succeed at Failing, Part 1: The Chain of Events

For research on the power of “conversational receptiveness” (a variation on maintaining curiosity, or at least the appearance of curiosity, about a partner’s perspective during conflict), listen to this interview with Julia Minson on Hidden Brain. The techniques she describes could also be useful when reflecting on errors in which more than one person has been involved:

https://hidden-brain.simplecast.com/episodes/relationships-20-how-to-keep-conflict-from-spiraling

© Dana Cogan 2009, 2019-2020 and 2024 all rights reserved.
