NYT econ columnist David Leonhardt thinks BP underestimated the risk of a spill:
The people running BP did a dreadful job of estimating the true chances of events that seemed unlikely — and may even have been unlikely — but that would bring enormous costs.
Perhaps the easiest way to see this is to consider what BP executives must be thinking today. Surely, given the expense of the clean-up and the hit to BP’s reputation, the executives wish they could go back and spend the extra money to make Deepwater Horizon safer.
Michael Kinsley correctly points out that this is absurd:
Well of course. But this alone doesn’t prove that BP was cavalier or even honestly mistaken about the risk of a blowout. (There is other evidence; this is not intended as a defense of BP.) I have insurance against the risk of my car being hit by another car. I don’t have insurance against the risk of my car being hit by an asteroid. If it ever is hit by an asteroid, I certainly will wish that I could go back and buy the insurance. That doesn’t mean my original decision not to was wrong. As long as the risk is above zero, there is some point at which it is worth taking. You can’t insure against everything.
My worry is the opposite of Leonhardt’s. It seems to me that in America, at least, we are much more prone to the second type of error: overestimating small risks of large disasters—and underestimating, by comparison, the smaller, everyday risks of life. Leonhardt himself provides the clearest example: people who, after 9/11, decided to drive rather than fly on long trips. In the next year there was no new terrorist attack on airplanes, but deaths in traffic accidents went up because more people were driving. That was like buying insurance against asteroids. Or, to take a more tendentious example, it’s like nuclear power. For a generation we’ve gone without it to protect ourselves from a catastrophic risk, only to find that perhaps this was a mistake because, without enough reflection, we were increasing the risk of a somewhat smaller catastrophe like the one in the Gulf.
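Kinsley's insurance argument is, at bottom, an expected-value comparison: a policy is worth buying only when the probability of the loss times its cost (plus whatever you'll pay for peace of mind) exceeds the premium. A toy sketch, with all numbers made up purely for illustration:

```python
# Toy expected-value sketch of Kinsley's insurance point.
# Every figure below is hypothetical -- chosen only to show why insuring
# against collisions can be rational while insuring against asteroids is not.

def expected_loss(probability: float, cost: float) -> float:
    """Expected annual loss from an uninsured risk: probability x cost."""
    return probability * cost

# Hypothetical annual risks.
collision = expected_loss(probability=0.02, cost=10_000)   # common, moderate cost
asteroid = expected_loss(probability=1e-9, cost=30_000)    # vanishingly rare

premium = 150  # hypothetical annual premium for either policy

print(f"collision: expected loss ${collision:,.2f} vs premium ${premium}")
print(f"asteroid:  expected loss ${asteroid:.8f} vs premium ${premium}")
# The collision's expected loss exceeds the premium; the asteroid's is a
# fraction of a cent. Declining the asteroid policy is the right call even
# though, if the asteroid ever hits, you will wish you had bought it.
```

The point of the sketch is only that a bad outcome does not prove the ex ante decision was wrong; it proves nothing more than that the probability was above zero.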
He notes that our legal system doesn’t help: it goes looking for someone to blame, after the fact, whenever people fail to prepare for wildly unlikely events.
If a plane crashes or a medical operation goes tragically wrong, someone will be found to blame for not doing something that would have prevented it. (Leonhardt recommends making it easier to sue on occasions like this. Oy vey!) We see it in our health care, for example in the approval process for new pharmaceuticals, where we ricochet between demands that pills be perfectly safe and complaints about how hard they are to get approved.
What to do about this is unclear. How much risk should we accept? And who should bear the burden of that risk? Deciding after the fact — the current system — is expensive and random.
Kinsley, naturally, favors a different solution: “To me, if you’re arguing for us to tolerate greater risks as a society, you need a stronger safety net for individuals, to catch them when they fall.” That almost certainly makes sense for things like natural disasters or major calamities like the BP oil spill. It doesn’t prevent going after companies for legitimate negligence, of course, but it would speed aid to victims and let society recoup the damages later. Then again, that approach encourages more risk-taking, such as building houses where hurricanes, earthquakes, and floods are inevitable.
via Andrew Sullivan