Episode 14 – Three Mile Island and Normal Accidents

DisasterCast Safety Podcast - A podcast by Drew Rae


This episode of DisasterCast covers the Three Mile Island nuclear accident and "Normal Accidents", one possible explanation for why disasters like Three Mile Island occur. Normal Accidents is the brainchild of the sociologist Charles Perrow. If you haven’t explicitly heard of him or of Normal Accidents, you’ve probably still encountered the basic ideas, which often appear in the press when major accidents are discussed. If you read or hear someone saying that we need to think “possibilistically” instead of “probabilistically”, it’s likely that they’ve been influenced, at least in part, by Normal Accidents. In particular, a number of news articles written after Fukushima invoked Normal Accidents as an explanation.

Risk assessment is not a science. Whilst we can study risk assessment using scientific methods, just as we can study any human activity, risk assessment itself doesn’t make testable predictions. This may seem counterintuitive. Consider nuclear power. We’ve had lots of nuclear reactors for a long time - isn’t that enough to tell us how safe they are? Put simply, no. The probabilities that the reactor safety studies predict are so low that we would need tens of thousands of years of operational evidence to actually test those predictions (a rough back-of-envelope illustration is sketched at the end of this description). None of this is controversial.

Perrow goes a step further, though. He says that the reason we have not had more accidents is simply that nuclear reactors haven’t been around long enough to have those accidents. In other words, he goes beyond believing that the risk assessments are unreliable, to claiming that they significantly underestimate the risk. The theory of Normal Accidents is essentially Perrow’s explanation of where that extra risk is coming from.

His starting point is not something we should consider controversial. Blaming the operators for an accident such as Three Mile Island misses the point. Sure, the operators made mistakes, but we need to work out what it was about the system and the environment that caused those mistakes. Blaming the operators for stopping the high-pressure injectors would be like blaming the coolant for flowing out through the open valve. Perrow points to two system features, which he calls “interactive complexity” and “tight coupling”, that make it hard for operators to form an accurate mental model of the systems they are operating. The bulk of his book consists of case studies examining how these arise in various ways, and how they contribute to accidents.
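To make the “tens of thousands of years” point concrete, here is a minimal back-of-envelope sketch. The claimed accident frequency of 1 in 10,000 reactor-years, the Poisson model, and the 95% confidence level are illustrative assumptions of mine, not figures taken from the episode or from Perrow.

```python
import math

# Back-of-envelope sketch (illustrative assumptions, not figures from the episode):
# how many event-free reactor-years of experience would it take to support a
# claimed accident frequency of 1e-4 per reactor-year at 95% confidence?
#
# With a Poisson model, the probability of observing zero events in T reactor-years
# at true rate r is exp(-r * T). Setting exp(-r * T) = 1 - confidence and solving
# for T gives the experience needed before even zero observed accidents would
# bound the rate at or below the claimed value.

claimed_rate = 1e-4   # assumed claimed frequency, per reactor-year
confidence = 0.95

required_reactor_years = -math.log(1 - confidence) / claimed_rate
print(f"Event-free reactor-years needed: {required_reactor_years:,.0f}")
# ~30,000 reactor-years -- hence "tens of thousands of years" of operating
# evidence before the predicted probabilities could begin to be tested.
```

Even spread across a fleet of several hundred reactors, that would mean on the order of a century of completely accident-free operation before the claimed figures could be checked against experience.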
