President Obama stood before the nation and world last week and, as oil gushed into the Gulf of Mexico, announced that his administration was "moving quickly on steps to ensure that a catastrophe like this never happens again." OK. There is one, and only one, way to ensure that deep water oil drilling never again leads to catastrophe, and that is to discontinue all deep water oil drilling. Period.
The oil catastrophe in the Gulf of Mexico was not caused by greed, corporate crime, or lax government oversight. This is not a pleasant truth to confront. It is far more comforting to blame BP's greed in rushing the operations at the Deepwater Horizon in order to move on to other profitable endeavors. Or the corners that were cut on the back-up systems. Or the gutting of regulatory oversight by eight years of Bush-Cheney. All true. Easier still to focus on the mind-numbingly callous statements of BP executives, or the aloof and seemingly disconnected response of the Obama administration. All true again.
These are more comforting answers because they give us a culprit at which to direct our anger, and they also allow us to optimistically hope that by designing better back-up systems, prosecuting criminals, reforming regulations, and tightening bureaucracies, catastrophes like this can be avoided in the future.
Tragically, this is not true. It is tragic because all of our instincts seem to point us to these conclusions which turn out to be false. Better back-up systems, criminal prosecutions, strict regulation, and more efficient oversight may well make "catastrophes like this" less frequent, but will never ensure that they "never happen again."
The BP catastrophe is what sociologist Charles Perrow termed a "normal accident," and what others have called a "system accident." Perrow was referring to accidents in a very particular kind of technology. He means technologies that are complex, involving systems with many components. By "components" he means parts, but also procedures and operators. A "component failure" can thus be a piece of hardware, an operator error, or a flawed procedure. Component failure is an inevitable aspect of any technology. Give a sufficiently complex system of components enough time, and eventually two or more components will fail at the same time. When they do, the failures will interact, and they will interact in ways that no one anticipated. Perrow noticed that some technologies are not only complex but also tightly coupled, meaning that the technology involves processes which happen very fast and cannot be turned off once they get going. If a technology is both complex and tightly coupled, the inevitable failures of the complex system will escalate into catastrophe.
A common misconception is that back-up systems can solve this dilemma. But back-up systems are often complex, tightly coupled systems themselves. Overlaying one complex system with another actually multiplies the number of unforeseen ways in which multiple failures can interact. The problems of complex systems cannot be solved by making them even more complex.
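The arithmetic behind this point can be sketched in a few lines of Python. This is an illustrative toy, not a model of any real rig: it simply counts the distinct pairs of components that could fail together, which grows quadratically with component count, so "backing up" a system by adding more components multiplies the possible failure interactions rather than reducing them.

```python
from math import comb

def pairwise_interactions(n_components: int) -> int:
    """Number of distinct ways two components could fail at the same time."""
    return comb(n_components, 2)

# A system with 100 components has 4,950 possible failure pairs.
# Overlaying it with a 100-component back-up system quadruples
# that to 19,900 pairs -- the complexity compounds.
print(pairwise_interactions(100))  # 4950
print(pairwise_interactions(200))  # 19900
```

And this counts only pairwise interactions; triple and higher-order failure combinations grow faster still.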
We could have a detailed discussion of any piece of this argument, but basically that's it. Quite simple, really. Understanding it does not require an engineering degree or any other specialized knowledge. And it is irrefutable. But it is counterintuitive, in the sense that the catastrophes that result from complex, tightly coupled technologies always involve hardware failure, operator error, greedy ownership, inept management, flawed procedures, or faulty designs, yet none of these actually causes the catastrophe. It is a "system accident" in that the cause is the nature of the system itself. And it is a "normal accident" in that it is the result of the normal or expected functioning of the system.
By metaphor think of a firing squad. A line of soldiers with identical guns aims at a victim. All guns but one are loaded with identical bullets, but no one knows who has the blank. All fire at once. Who was the murderer? The question cannot be answered. The firing squad is a system (with parts, procedures, and operators) that is deliberately designed to make it impossible to attribute the murder to any one soldier. This is why it has been such a common method of execution since the invention of firearms.
Given enough time, complex, tightly coupled technologies inevitably arrive at a moment in which multiple components fail at the same time, interacting in a way that leads to catastrophe. But here the displacement of responsibility from any single component to the system as a whole is not intentional. Instead, it results from the surprising way in which the failures interact. This confuses us, and we angrily demand to know which failure "caused" the catastrophe.
This kind of "normal accident" might be quite rare. Well-designed complex systems with multiple back-up systems may run for a very long time before multiple simultaneous failures interact in a surprising way, resulting in a tightly coupled reaction that cannot be stopped. And in cases where the consequences of such a failure are limited, this is something society as a whole can live with. But if the consequences are hugely catastrophic, the fact that the failure only happens once every ten, or twenty, or fifty years is no consolation. Our bodies are composed of multiple overlapping, complex, tightly coupled systems. We live many years before we die in an instant. Compared to living, dying is vanishingly rare. But die we must.
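Why "once every fifty years" is no consolation can be made concrete with a little probability arithmetic. The sketch below assumes independent years, which is a simplification, and the numbers are illustrative rather than drawn from any actual rig's safety record:

```python
def prob_at_least_one(p_per_year: float, years: int) -> float:
    """Chance of at least one catastrophe over a span of years,
    assuming each year is an independent trial (a simplification)."""
    return 1 - (1 - p_per_year) ** years

# A "once every fifty years" event (p = 0.02 per year) becomes
# roughly a coin flip over a 35-year operating life.
print(round(prob_at_least_one(0.02, 35), 2))  # 0.51
```

The per-year odds sound reassuring; accumulated over the working life of a fleet of rigs, they do not.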
Fortunately, when one person dies, massive environmental catastrophe does not ensue. Our world is now full of technologies that are systemically both complex and tightly coupled, yet do not result in catastrophe when they experience "normal accidents." The computer you are reading this on is one such system, and as you are no doubt aware, computer systems crash. Billions of dollars have been spent trying to prevent your computer from crashing, yet still it crashes away. Often, the least stable computer systems are the ones that have been around a long time and have been patched over and over with "back-up systems" intended to make the system more stable but having no such effect. As long as all you are doing with your computer is surfing the Web, when the system crashes all that will ensue is frustration.
But there are other technologies where system accidents result in catastrophe. Perrow singled out technologies involving (1) chemical reactions, (2) high temperature or pressure, or (3) air, vapor or water turbulence as being catastrophe prone. The Deepwater Horizon had all three. In the actual event, the pressure one mile below sea level seems to have been critical. The rig involved overlapping complex technologies with lots of parts, procedures, and operators. The simultaneous failure of multiple components interacted in an unexpected way. There were back-up systems that were thought to improve safety but only increased the complexity.
All of which is to say that this catastrophe was, first and foremost, inevitable. Not in the sense that an explosion caused by this particular combination of failures at this particular rig was inevitable, but in the sense that if technology this complex and tightly coupled is going to be used to drill for oil at such extreme ocean depths, sooner or later a catastrophe of this magnitude will occur.
So, should BP be prosecuted? Sure. If it were up to me I would prosecute their CEO for criminal arrogance alone. Dismantle the regulatory agency? Why not? It was hopelessly screwed up. Fire some government officials? Couldn't hurt. But don't think for a minute that any of this will prevent eventual catastrophe if we continue drilling for oil a mile below the surface of the ocean.
As Perrow noted at the end of the last century:
Catastrophes have always been with us. In the distant past, the natural ones easily exceeded the human-made ones. Human-made catastrophes appear to have increased with industrialization as we built devices that could crash, sink, burn, or explode. In the last fifty years, however, and particularly in the last twenty-five, to the usual cause of accidents... was added a new cause: interactive complexity in the presence of tight coupling, producing a system accident. We have produced designs so complicated that we cannot anticipate all the possible interactions of the inevitable failures; we add safety devices that are deceived or avoided or defeated by hidden paths in the systems. The systems have become more complicated because either they are dealing with more deadly substances, or we demand that they function in ever more hostile environments or with greater speed and volume. And still new systems keep appearing... We seem to be unable to learn from chemical plant explosions or nuclear plant accidents. We have reached a plateau where our learning curve is nearly flat.
No one in the Obama administration seems to be aware of this kind of reasoning. It is shocking how long it took the Obama administration to grasp the scale of the catastrophe. Nearly a month after the rig exploded, the President was making absurd gestures like sending Energy Secretary Steven Chu and a team of five scientists "with reputations for creative problem solving" to "deal with the crisis." Chu said the team would develop "plan B, C, D, E and F" and find a way to plug the well. None of Chu's team specialized in geology, oceanography, chemistry, or extractive industries. One, Jonathan I. Katz, is a prominent global warming denialist whose published papers include titles like "In Defense of Homophobia" ("The human body was not designed to engage in homosexual acts... I am a homophobe, and proud") and "Anyone Who Bombs Baghdad Gets My Vote" ("Many lines of evidence argue that Al Qaida carried out the attacks of September 11, 2001 in cooperation with, and perhaps at the behest of, Saddam Hussein and Iraq"). Upon completion of their trip to the Gulf, Chu announced, "Things are looking up, and things are getting much more optimistic."
Two weeks later the administration was scrambling to spin a new tone. President Obama promised that he was "moving quickly on steps to ensure that a catastrophe like this never happens again," and would "continue to do whatever is necessary to protect and restore the Gulf Coast." The President added, "Where I was wrong was in my belief that the oil companies had their act together when it came to worst-case scenarios."
Then, after BP's effort to "top kill" the well failed and it seemed likely that oil would gush throughout the summer, Carol M. Browner, President Obama's climate change and energy policy adviser, announced that the administration was "prepared for the worst. We have been prepared from the beginning."
What, exactly, is the "worst-case scenario" the administration is now prepared for, or rather, has been prepared for from the beginning? What would it take for oil companies, the federal government, or anyone else to "have their act together when it comes to worst-case scenarios"?
No one knows.
And we can actually divide the things we don't know into two parts: things we merely don't know now, and things we will never know.
We do not know how much oil has gushed into the ocean. We have estimates. Independent scientists who actually have relevant expertise are now on the case and agree that the original estimate made by BP and accepted by the Obama administration was wildly low. The administration is now using a much higher estimate. But the number is still an estimate, and an estimate of flow per day at that. The more days the flow continues, the greater the margin of error in our estimate. It now looks like the well won't be sealed until August, and even then may continue to leak at a lower level indefinitely. No one will ever know how much oil will gush into the Gulf of Mexico from the Deepwater Horizon well.
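The way a per-day estimate compounds into enormous uncertainty can be shown with trivial arithmetic. The daily rates below are illustrative placeholders, not the official figures:

```python
def total_spill_range(low_rate: int, high_rate: int, days: int):
    """Range of plausible totals when only a per-day flow range is known.
    Rates in barrels per day; the specific numbers are hypothetical."""
    return low_rate * days, high_rate * days

# A daily estimate of 12,000-25,000 barrels, run out over 100 days,
# leaves an uncertainty of 1.3 million barrels in the total alone.
low, high = total_spill_range(12_000, 25_000, 100)
print(high - low)  # 1300000
```

Every additional day of flow widens the gap between the low and high totals, which is why the final figure can never be pinned down.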
Then there are the chemical dispersants BP has added to the mix. As of today (June 2, 2010), BP has added about a million gallons. BP has used two dispersants, COREXIT EC9527A and COREXIT 9500. NALCO, the company which makes these products, alleges that the chemical composition of the stuff is a "trade secret," so no one outside the company knows exactly what is involved. The company's disclosure statement for COREXIT EC9527A says, "No toxicity studies have been conducted on this product."
But even if extensive toxicity studies had been done, and we could pry the recipe from NALCO's greedy hands, there would still be much we would not know about what a million gallons and counting of the stuff will do in the Gulf of Mexico. There is currently a heated debate within the scientific community concerning how the toxicity of many industrial chemicals is measured. The debate is not over whether the measurements that are used are accurate, but over what the resulting numbers mean. Ultimately, these debates lead to deep philosophical questions about the underpinnings of science which may never be resolved. Much of the toxicity debate comes from research done on animals which have been the focus of decades of study under laboratory conditions. If scientists cannot agree on chemical toxicity in lab rats, how will they agree on what effect chemical dispersants have on sea life a mile below the surface? Some of these life forms we have only recently begun to study. Many more are completely unknown.
Note that in Charles Perrow's anatomy of technological catastrophe, the dispersants constitute a "back-up system" that will now begin to interact with the ongoing catastrophe in completely unknown ways. And as is often the case with back-up systems, there is no actual evidence that the dispersants are improving the situation in any way. The only thing that is certain is that the dispersants are making the oil that has gushed harder to see from the surface and to track underwater. Which is probably why BP is using them in such massive amounts.
But we are not dealing only with the dispersant chemicals, or gushing oil. Rather, we are dealing with a mixture of millions of gallons of the two, concocted in deep sea conditions we only sketchily understand and spread by currents we cannot fully track. Some of the chemicals known to be in the dispersants are bioaccumulative, meaning that they gradually concentrate in living tissue and work their way up food chains. Hurricane season in the Gulf of Mexico officially begins today. Chemicals from this catastrophe could eventually end up in any number of living organisms across a vast geographical expanse.
It took decades to sort out the health consequences from the chemical defoliants the US military used in the Vietnam war, and there is still no scientific consensus as to the nature of Gulf War Syndrome suffered by veterans of the first invasion of Iraq twenty years ago. Both of these catastrophes involved human victims eager to participate in research that might pinpoint the cause of their problems, who could be readily located by their records of military service. The bird and sea life that will come into contact with the stuff spewed from the Deepwater Horizon and dispersed by BP are on no one's address list. They will not bang on the doors of VA hospitals to make themselves available to research. Much of this life is located so far beneath the ocean that research can only be done by robots. What we are looking at is decades of acrimonious debate as to what constitutes a "proven" health consequence of the Deepwater Horizon catastrophe. As usual, the debate will be largely framed by lawyers working in the service of the huge financial interests involved.
And Carol Browner claims that the administration is "prepared for the worst. We have been prepared from the beginning." What a laugh!
Sir Martin Rees, England's Astronomer Royal and one of the top astrophysicists of our day, anticipated catastrophes like this when he wrote a book titled Our Final Hour: How Terror, Error, and Environmental Disaster Threaten Humankind's Future in This Century - On Earth and Beyond. Rees calculated that "the odds are no better than fifty-fifty that our present civilization on Earth will survive to the end of the present century." Rees even placed a bet on longbets.org, a web site where scientists publicly put their reputations on the line with predictions about the future. Rees bet that by 2020, a single catastrophe would lead to one million casualties. It is unclear whether Rees meant for birds and fish to be included in his "million casualties from a single event." If he did, then he may have just won his bet. But then, how would we even know?