An Issue of Risk Assessment

Drilling for Certainty
David Brooks, NY Times, May 27, 2010

Over the past decades, we’ve come to depend on an ever-expanding array of intricate high-tech systems. These hardware and software systems are the guts of financial markets, energy exploration, space exploration, air travel, defense programs and modern production plants.

These systems, which allow us to live as well as we do, are too complex for any single person to understand. Yet every day, individuals are asked to monitor the health of these networks, weigh the risks of a system failure and take appropriate measures to reduce those risks.

If there is one thing we’ve learned, it is that humans are not great at measuring and responding to risk when placed in situations too complicated to understand. People have trouble imagining how small failings can combine to lead to catastrophic disasters. At the Three Mile Island nuclear facility, for example, a series of small systems happened to fail at the same time. People also have a tendency to get acclimated to risk. As the physicist Richard Feynman wrote in a report on the Challenger disaster, as years went by, NASA officials got used to living with small failures.
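
To make that compounding intuition concrete, here is a minimal sketch in Python. The numbers are purely illustrative assumptions, not figures from the column: with ten independent safeguards that each fail only one time in a hundred, the odds that at least one fails are already close to one in ten.

    # Illustrative sketch: how small, independent failure probabilities compound.
    # The component count and per-component failure rate are assumed for
    # illustration only; they are not drawn from the column.
    p_fail = 0.01                    # assumed chance any one component fails
    n = 10                           # assumed number of independent components
    p_any = 1 - (1 - p_fail) ** n    # chance that at least one component fails
    print(f"P(at least one failure) = {p_any:.3f}")  # prints ~0.096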

So it seems important, in the months ahead, to focus not only on mechanical ways to make drilling safer but also, more broadly, on helping people deal with potentially catastrophic complexity. There must be ways to improve the choice architecture: to help people guard against risk creep, false security, groupthink, the good-news bias and all the rest.

Read the complete column at The New York Times.
