How to avoid catastrophe

What are near misses? Near misses are small, often unremarked failures that permeate day-to-day business but cause no immediate harm. People are hard-wired to misinterpret or ignore the warnings embedded in these failures, so the failures often go unexamined.

If conditions were to shift, these near misses could erupt into chaos and crisis.

When disaster strikes, numerous poor decisions and dangerous conditions have usually contributed to it. With near misses, we overlook the warning signs: rather than raising alarms and prompting action, each near miss lets us move on through the process because nothing happened. We accept the fact that nothing went wrong as a good indicator that we are making the correct decisions.

Multiple near misses normally precede every disaster and business crisis, yet most of these misses are ignored or misread. Our cognitive biases conspire to blind us to them, and two in particular cloud our judgment. The first, normalization of deviance, is the tendency over time to accept anomalies, particularly risky ones, as normal: things we become too comfortable with become normalized, so what should be seen as dangerous can feel safe in our minds because no harmful event has yet occurred. The second is outcome bias: when people observe successful outcomes, they tend to focus on the results more than on the often unseen, complex processes that led to them.

Recognizing and learning from near misses isn't simply a matter of paying attention. Near misses should be treated as instructive failures whose lessons leaders can apply to improve and ward off catastrophe.

Roots of crisis

When people observe a successful outcome, their natural tendency is to assume that the process that led to it was fundamentally sound, even when it was not. Organizational disasters rarely have a single cause. They are initiated by the unexpected interaction of multiple small, seemingly unimportant human errors, technical failures, or bad business decisions.

These latent errors align with enabling conditions to produce a significant failure. Enabling conditions are factors in the environment that allow an event to happen. Latent errors often exist for long periods before they combine with enabling conditions to produce a significant failure.

Whether an enabling condition transforms a near miss into a crisis normally depends on chance. Thus, it makes little sense to try to predict or control enabling conditions. Instead, companies should focus on identifying and fixing latent errors before circumstances allow them to create a crisis.

Latent errors underlie a crisis long before the crisis is apparent, and they surface as the crisis gains momentum. When coupled with the right enabling conditions, the crisis erupts. Because latent errors are normalized by bias, near misses become increasingly acceptable.

Further, near misses produce deviances which, in turn, are normalized. These deviances are cognitively ignored because of our outcome bias. Only when enabling conditions occur does a latent error trigger a crisis.

Recognizing and preventing near misses 

Research suggests there are seven strategies that can help organizations recognize near misses and root out the latent errors behind them.

  1. Heed high pressure

The greater the pressure to meet performance goals, the more likely people are to discount near-miss signals or to misread them. A classic case of normalization of deviance is one exacerbated by enormous political pressure: pressure can create an atmosphere that increasingly accepts below-specification performance. When people make decisions under pressure, research shows, they tend to rely on heuristics, or rules of thumb, and are thus more easily influenced by biases. In high-pressure work environments, people are more easily swayed by outcome bias, more likely to normalize deviance, and more apt to believe that their decisions are sound.

  2. Learn from deviation

Research shows that decision makers may clearly understand the statistical risk represented by a deviation yet become increasingly less concerned about it. It is important that leaders seek out operational deviations from the norm and examine whether their reasons for accepting or tolerating the associated risk have merit. The questions to ask are: have we always been comfortable with this level of risk? Has our policy toward this risk changed over time?

  3. Uncover root causes

When leaders identify deviations, their reflex is to correct the symptom rather than its cause. Instead, leaders should create an intentional model for reporting near misses, and people should be encouraged to report mistakes and near misses so that the lessons can be teased out and applied.

  4. Demand accountability

Even when people are aware of near misses, they tend to downgrade their importance. One way to combat this is to hold leaders responsible for their assessments of near misses and to require them to justify those assessments.

  5. Consider worst-case scenarios

People tend not to think through the possible negative consequences of near misses unless they are expressly advised to do so. Research shows that examining events closely helps people distinguish between near misses and successes, and it suggests that they will often adjust their decision making accordingly.

  6. Evaluate projects at every stage

When things go badly, managers conduct postmortems to determine causes and prevent recurrence; research suggests this is too late. When things go well, however, few managers conduct a formal review of the success. Because near misses can look like successes, they often escape review.

By critically examining projects while they are under way, leaders can avoid bias and are more likely to spot near misses. A technique called the pause-and-learn process typically uncovers near misses that would otherwise go undetected.

  7. Reward owning up

Seeing and attending to near misses requires people who are motivated to expose them. In many organizations, employees have good reason to keep quiet about failures. Organizations should therefore publicly reward staff for uncovering near misses, including their own.

Conclusion

Two forces conspire to make learning from near misses difficult: normalization of deviance and outcome bias. When leaders do not recognize these biases, they fail to grasp the significance of near misses, and organizations fail to expose and correct latent errors even when the cost of doing so is small. They miss the opportunity to improve and learn from these small mistakes. Bringing near misses to light and correcting their root causes is one of the soundest investments an organization can make.