“Experience is a hard teacher because she gives the test first, the lesson afterward.”  — Vernon Law

When hit with the first winter storm of the season, everyone has to learn how to drive all over again.  In our community, freezing rain coats everything, including the roads, which makes driving especially treacherous.  A few years ago, while gingerly making our way home after the first winter storm of the season, we were passed by a Jeep.  The driver, overconfident in his four-wheel drive, then spun out in front of us.  Fortunately, traffic was light; he didn’t hit anyone, and neither did we.  He recovered control and continued on, now driving quite conservatively.  He had experienced a near miss and had been chastened by the lesson.

A Shot Across the Bows

Near misses have been described as “a shot across the bows,” originally a nautical term referring to a cannon shot across the bow of an enemy ship to warn that battle is imminent unless the ship surrenders.

Near misses rarely announce themselves so unambiguously.  A shot across the bows is intentional and unmistakable; most near misses are neither.

My friend, Elizabeth Bertram, has spoken about near miss reporting.  Her concern is that near misses are underreported.  She has concluded that there are three difficulties with near misses, all interrelated.  The first is recognizing them for what they are.  The second is getting them reported. The third is convincing others to take them seriously.

Recognizing a Near Miss

A near miss is a harbinger of a far more serious event to come.  You may consider it bad luck, in that the near miss happened, or good luck, in that it was only a near miss and not something more catastrophic.  Regardless, a near miss is evidence that the existing system is capable of producing situations where catastrophes can occur.

Incident investigations invariably discover in hindsight that there were several near misses that preceded the catastrophic event itself.  In March 2016 there was a catastrophic incident at the University of Hawaii at Manoa, where a portable pressure tank containing a flammable mixture of hydrogen and oxygen exploded and dismembered the post-doc working on the experiment.  The incident investigation reported that there was a near miss the day before the incident, where the mixture ignited, but the near miss was ignored.

The Center for Chemical Process Safety defines a near miss as “an unplanned sequence of events that could have caused harm or loss if conditions were different or were allowed to progress, but actually did not.”  It is hard to argue with the definition, but it is not especially helpful in distinguishing near misses from annoyances.  Things go wrong all the time.  We are all expected to be resilient, to adapt to imperfection, and to move on.  More often than not, a near miss is interpreted, not as a harbinger of worse things to come, but as proof of the robustness of the system.  How is one to distinguish between the dozens or hundreds of things that go wrong all the time, things with which we cope routinely to no ill effect, and the near misses that should be flagged as warning shots?

If a bolt head came whistling out of a unit, narrowly grazing your ear as it flew past, you would recognize that as a near miss.  If you simply found the same bolt head lying on the ground several meters from the unit, would you recognize it as a near miss?  If a relief valve on your steam distribution system lifted, would you recognize that as a near miss?  Or would it take a catastrophic steam system failure to convince you that the release from the relief valve was a near miss, warning you that your steam pressure control system wasn’t working properly?  Too often a near miss is only recognized long after it is of any use as a warning.

Reporting a Near Miss

When Bertram speaks about near miss reporting, her primary concern is underreporting.  Part of the problem with underreporting is in recognizing a near miss for what it is.  Even when a near miss is recognized, however, there can be reluctance to report. “Frequently, safety reporting results in finger-pointing, rather than improvement…It makes people wary of bringing up problems; they don’t want to get in trouble, and they also don’t want to get anyone else in trouble.”

Some organizations turn to mandatory near miss reporting programs, which they see as an easy way to overcome this reluctance.  Mandatory reporting programs have problems, though.  First, they imply that management doesn’t trust workers to report safety problems appropriately.  If management trusted workers, a mandate wouldn’t be necessary.  Second, they encourage misinformation.

Consider a program that requires reporting one near miss per week.  If the near miss doesn’t happen, a worker is tempted to make something up, just to meet the mandatory reporting requirement.  On the other hand, if more than one near miss happens, there is little incentive to take the time and effort to report more than the necessary quota.  In either case, the program collects bad data, which undoes the benefit of collecting the data in the first place.

Responding to a Near Miss

One of the problems with near misses is that they reinforce a misplaced faith in the resilience of the system.  After all, something was wrong, yet still nothing bad happened.  At some point, a near miss loses all resemblance to a shot across the bows and simply becomes “normal”.  “Well, that’s how it’s always been.”  This “normalization of deviance” becomes its own problem.  Then, when random chance and misfortune finally catch up with the near misses, and something catastrophic happens, the investigators point back to the near misses and note that “clearly, there were warnings.”

When a near miss is reported, it is incredibly important that it be taken seriously.  Angrily dismissing a near miss as “whining” has the immediate effect of discouraging reporting in the future.  Even when a near miss report is taken at face value, doing nothing about it has the same effect: reporting comes to seem futile.

A Matter of Trust

An effective near miss reporting system ultimately depends on trust.  Trust that there is a common understanding of what a near miss is.  Trust that reporting near misses won’t result in blame and recriminations.  Trust that the near miss will prompt a response that actually makes the facility safer.  In the absence of trust, a near miss reporting system is doomed to fail.

Experience as Teacher

We learn from experience.  Near misses are experiences with incredibly cheap lessons, at least when compared to catastrophic events.  The lessons are of no value, however, if we do not learn from them.

Experience is even more important, though, in that it teaches us to recognize near misses.  The benefit of identifying near misses in an incident investigation is not that it allows us to assign blame for an event that has already happened, but that it allows us to recognize warning shots for what they are in the future.  We have a saying in our office: “You only learn from experience, but it doesn’t have to be your experience.”

Driving on Ice

As we followed the Jeep, the driver gradually sped up as he grew more confident in his ability to negotiate road conditions.  Eventually he left us behind, his taillights disappearing over a hill.  We continued our drive home, expecting to see the Jeep splattered in a ditch.  We encountered several other cars that had slid off the road, their drivers heedless of the near misses.  It would be a tidy ending to say that we eventually came across the Jeep, nose deep in mud, with the driver flagging us down. That’s not the way it happened, though. We’ll never know if the Jeep driver got the lesson, but we did.


This blog is based on an earlier version, “Learning from Experience: Near Misses”, posted on 10-Jan-2017 by Elsevier in Chemicals & Materials Now!