Risk Assessment

NYT econ columnist David Leonhardt thinks BP underestimated the risk of a spill:

The people running BP did a dreadful job of estimating the true chances of events that seemed unlikely — and may even have been unlikely — but that would bring enormous costs.

Perhaps the easiest way to see this is to consider what BP executives must be thinking today. Surely, given the expense of the clean-up and the hit to BP’s reputation, the executives wish they could go back and spend the extra money to make Deepwater Horizon safer.

Michael Kinsley correctly points out that this is absurd:

Well of course. But this alone doesn’t prove that BP was cavalier or even honestly mistaken about the risk of a blowout. (There is other evidence; this is not intended as a defense of BP.) I have insurance against the risk of my car being hit by another car. I don’t have insurance against the risk of my car being hit by an asteroid. If it ever is hit by an asteroid, I certainly will wish that I could go back and buy the insurance. That doesn’t mean my original decision not to was wrong. As long as the risk is above zero, there is some point at which it is worth taking. You can’t insure against everything.

My worry is the opposite of Leonhardt’s. It seems to me that in America, at least, we are much more prone to the second type of error: overestimating small risks of large disasters—and underestimating, by comparison, the smaller, everyday risks of life. Leonhardt himself provides the clearest example: people who, after 9/11, decided to drive rather than fly on long trips. In the next year there was no new terrorist attack on airplanes, but deaths in traffic accidents went up because more people were driving. That was like buying insurance against asteroids. Or, to take a more tendentious example, it’s like nuclear power. For a generation we’ve gone without it to protect ourselves from a catastrophic risk, only to find that perhaps this was a mistake because, without enough reflection, we were increasing the risk of a somewhat smaller catastrophe like the one in the Gulf.

He notes that it doesn’t help that our legal system tries to assign blame after the fact when people fail to prepare for wildly unlikely events.

If a plane crashes or a medical operation goes tragically wrong, someone will be found to blame for not doing something that would have prevented it. (Leonhardt recommends making it easier to sue on occasions like this. Oy vey!) We see it in our health care, for example in the approval process for new pharmaceuticals, where we ricochet between demands that pills be perfectly safe and complaints about how hard they are to get approved.

What to do about this is unclear. How much risk should we accept? And who should bear the burden of that risk? Deciding after the fact — the current system — is expensive and random.
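Kinsley’s insurance logic can be made concrete with a back-of-the-envelope expected-value check. This is only an illustrative sketch; all of the numbers below are hypothetical, not actual rates or premiums, and real decisions also weigh risk aversion and the size of the loss relative to one’s resources:

```python
def worth_insuring(annual_risk, loss, premium):
    """Naive expected-value test: buy insurance when the expected
    annual loss (probability times cost) exceeds the annual premium."""
    return annual_risk * loss > premium

# Car hit by another car: a common risk with a meaningful loss.
print(worth_insuring(annual_risk=0.05, loss=20_000, premium=600))   # True
# Car hit by an asteroid: a nonzero but vanishingly small risk.
print(worth_insuring(annual_risk=1e-12, loss=20_000, premium=5))    # False
```

On this crude test, declining asteroid insurance is rational even though the risk is above zero, which is exactly Kinsley’s point: wishing afterward that you had bought the coverage does not show the original decision was wrong.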

Kinsley, naturally, favors a different solution: “To me, if you’re arguing for us to tolerate greater risks as a society, you need a stronger safety net for individuals, to catch them when they fall.” Which almost certainly makes sense in the case of such things as natural disasters or major calamities such as the BP oil spill. That doesn’t prevent going after the companies for legitimate negligence, of course, but it would speed aid to victims and allow society to recoup the damages later. Then again, that approach encourages more risk-taking, such as building houses where hurricanes, earthquakes, and floods are inevitable.

via Andrew Sullivan

FILED UNDER: General
About James Joyner
James Joyner is Professor and Department Head of Security Studies at Marine Corps University's Command and Staff College and a nonresident senior fellow at the Scowcroft Center for Strategy and Security at the Atlantic Council. He's a former Army officer and Desert Storm vet. Views expressed here are his own. Follow James on Twitter @DrJJoyner.

Comments

  1. Dave Schuler says:

    And who should bear the burden of that risk?

    IMO those who undertake the risks and their customers. BP didn’t carry insurance against oil spills. It self-insured because the likelihood was so low and its liabilities were limited. That’s what made sense.

    As I’ve said before, I’m in favor of removing the subsidies that the limitation on liability constitutes, at least over time. Perhaps a cap of $10 billion next year, $100 billion in five years, and no cap at all in ten.

    That will raise the price of gas at the pump but only because it more adequately represents the true cost of gasoline and that, as classical economics promises, will result in the optimal allocation of resources.

  2. john personna says:

    I’ve been onto this since my brush with the peak oilers.  I think this old article at Time Magazine covers it best:
     
    http://www.time.com/time/magazine/article/0,9171,1562978,00.html
     
    As to how this impacts government … it’s probably too much to assume that in aggregate we’ll get better at risk.  Maybe one person at a time.  And so governments should probably expect to be wrong, and have some humility and flexibility in their risk assessments.

  3. JKB says:

    How much risk should we accept? And who should bear the burden of that risk?

    Don’t know; how much risk have you got? Do we go with every wild possibility, or do we apply some probability? But then how do we really determine the chances? You can wrap some math around it, but it’s still a guess. How many wells have been drilled using the exact same procedures, maybe even with what we’ll now decide were questionable choices, that provided years without failure? Remember the years of “acceptable burn-through” on the Challenger’s O-rings? Once you take a risk, by choice or by accident, and don’t die, it confirms in people’s minds that the risk was overblown, right up until it blows out.

    This blowout, like most failures, was the result of a series of failures, not just one big one. The cement plug failed after they’d displaced the mud with seawater, no one realized early enough that oil and gas were coming up, the BOP’s multiple valves failed to close completely, and so on. Avoiding any one failure in the series would have stopped the disaster, and probably has stopped others. But this time they all lined up.

    My point is that we can’t estimate risk because we can’t see it objectively. Even if we impose severe conservatism, it’s only a matter of 20 or 40 years until some young whippersnapper comes along and gets everyone to agree that the old people don’t know anything. Then we’ll let banks sell securities, make loans based on people’s made-up income, come to believe that if we cut garbage into small enough pieces it becomes filet mignon, try to get bears to pose for a group photo, etc. All because nothing bad happened for the prior 20 years or so.
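JKB’s point that years without failure breed false confidence can actually be quantified. With zero failures in n trials, the data only put an upper bound on the failure probability; a standard back-of-the-envelope version is the “rule of three,” where the 95% upper confidence bound is roughly 3/n. A sketch, with a purely hypothetical well count:

```python
def failure_rate_upper_bound(n_trials):
    """95% upper confidence bound on the per-trial failure probability
    after n_trials consecutive successes (exact form; ~3/n for large n)."""
    return 1 - 0.05 ** (1 / n_trials)

# Hypothetical: 1,000 wells drilled with the same procedure, no blowouts.
n = 1000
exact = failure_rate_upper_bound(n)
approx = 3 / n
print(f"exact 95% bound: {exact:.5f}, rule-of-three approx: {approx:.5f}")
```

Even a long failure-free record is consistent with a failure probability of a few in a thousand, which is far from zero when the downside is a Deepwater Horizon.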

  4. PD Shaw says:

    To make a risk assessment, one needs data, and unfortunately, bad things happening is often exactly the data needed to make a reasonable risk assessment.

    I think our larger problem is underestimating how quickly information about bad events is acted on spontaneously. The passengers of United Airlines Flight 93 took in the new information and acted on it before the 9/11 Commission was even appointed. I imagine that all of the oil companies are examining their practices right now and preparing to respond to shareholder inquiries.

    All of these oil rigs are permitted by the USEPA, which requires the use of available pollution-control technologies based on risk and cost/benefit estimates. These requirements have to be based on evidence, not mere conjecture that an asteroid could destroy a platform. And now that the USEPA has more data, the requirements will become more stringent without Congress lifting a finger.

  5. PD Shaw says:

    Dave, one of the reasons there are caps on natural resource damages is probably the lack of availability of true insurance.

  6. john personna says:

    Also note that we ended up owning AIG.