Jan Hayes & Andrew Hopkins
The worst nightmares of the oil and gas pipeline industry are coming true in the United States.
High-pressure natural gas pipelines run underground through many suburban areas as part of the network providing fuel to homes and businesses. This infrastructure poses an immense, but insufficiently recognised, threat to the general public. In 2010, one of these pipelines ruptured in San Bruno, a suburb of San Francisco adjacent to the international airport. The result was a massive explosion and fire in which eight people died, many were injured, and 38 homes were destroyed. This possibility haunts many cities around the world.
Coincidentally, in the same year another worst-case scenario came true near Marshall, in the state of Michigan. A pipeline rupture released vast quantities of oily sludge into a local river system. The smell was so offensive that many nearby residents were forced to sell their homes and move away. The clean-up cost the pipeline owner more than a billion dollars, making it the most expensive onshore oil spill in US history.
This book examines the causes of these two events. It argues that, although they were profoundly surprising to the companies concerned, from a broader perspective they were no surprise at all, stemming as they did from well-known human, organisational and regulatory failures. In particular, we emphasise two contrasting but equally flawed approaches to the prevention of rare but catastrophic events.
Fantasy planning
Companies often try to convince themselves, regulators and members of the public that they have the relevant hazards under control because they have elaborate plans to deal with them. When put to the test, these plans turn out to be wildly optimistic, full of unjustified assumptions and based on inaccurate data. Their function is symbolic rather than instrumental – that is, they serve as statements that the hazard is under control, rather than as real instruments of control. Fantasy planning was very evident in both accidents.
Black swans
The second approach adopts the currently fashionable “black swan” metaphor. In Europe, historically, all known swans were white, and Europeans could not conceive of a black swan – until they discovered Australia. In the 21st century, the concept of a black swan has taken on new meaning – a rare event with major impact, quite unpredictable at the time, although possibly explicable in hindsight. Nowadays, major industrial accidents, such as the blowout in the Gulf of Mexico in 2010, are sometimes referred to as black swans. But here the analogy breaks down. Black swans were unforeseeable to Europeans. Major accidents are not unforeseeable to risk analysts. In fact, it is their responsibility to foresee them and to put in place barriers against them. Accidents occur when those barriers fail. The metaphor is therefore misleading. In fact, it seems to be nothing more than a contemporary version of the idea that major accidents are inevitable – the ‘stuff happens’ view of risk management.
Integrity management
These two concepts shed new light on why integrity management is so difficult to get right and also how it can be improved. We hope that those in positions of responsibility in companies operating hazardous facilities will feel the need to scrutinise their own integrity management systems with these flawed approaches in mind. The major failings we have identified provide valuable lessons for all organisations that use risk assessments to manage and prioritise routine activities.
Read the SafetyAtWorkBlog review.