Key alarms and detectors that could have provided an early warning of the Deepwater Horizon explosion were disabled because false alarms were interrupting rig leaders’ beauty sleep, the LA Times is reporting today.

Have a read of the testimony.  It’s gutting.  One of my committee members does a lot of work on safety, and I recall him arguing that one of the most dangerous words in the English language is ‘accident’.  Accident implies random chance.  Accident implies the unforeseen, the unexpected, and the unpreventable.

The Deepwater Horizon blowout wasn’t an ‘accident’ in those ways.  It was a direct causal consequence of a host of deliberate decisions:  On the rig itself, choosing to disable alarms and deciding to stop testing key emergency systems.  And at the levels of the organization and the state, choosing to subcontract in ways that encouraged risk and allowing liability caps that created moral hazard.

Reading the testimony immediately made me think of Organization At The Limit (which I reviewed here), a volume edited by Starbuck and Farjoun about the Columbia space shuttle disaster.  There, the shuttle kept shedding foam – one of the causes of the eventual disaster.  At first, it raised alarms and engineers took notice.  But after the same problem surfaced and resurfaced on subsequent flights, NASA started thinking of it as ‘in family’, their lingo for well-understood and expected issues.  Pressured by time and operating in an organizational culture focused on delivery rather than safety, they stopped attending to the problem, treating each near-miss disaster as if it were a success.  The macro-level culture contributed to unsafe decisions and practices at a micro level.

The risk here is that the blame will fall on the folks who made the bad micro-level decisions, as if a massive structural problem could have been avoided just by removing one or two bad apples at the lowest levels.  Firing a few irresponsible flunkies is to fixing deeply embedded structural problems as disabling an alarm is to preventing a blowout.  It just doesn’t cut it.

Photo via There, I Fixed It, diligent chroniclers of exceedingly unsafe workarounds.