# Is Risk Useful?

There has been a recent push by the Risk Management community for a greater involvement of rigorous risk science in emergency management planning. While I certainly am all for more rigor in planning, I had to ask myself, “Is risk – as used by risk management professionals – really useful?”

Risk professionals generally define the “risk” to some entity A (a facility, a community, a system) from some threat T as the mathematical product of three quantities – the probability of the threat actually occurring, the vulnerability of A to T, and the consequences to A if T actually occurs. In other words, Risk(A, T) = p(T) x V(A, T) x C(A, T), where p(T) is the probability of T occurring.
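As a minimal sketch of that product – with purely hypothetical numbers, since no real assessment is quoted here – the equation is a one-liner:

```python
def risk(p_threat: float, vulnerability: float, consequences: float) -> float:
    """Risk(A, T) = p(T) * V(A, T) * C(A, T).

    p_threat      -- annual probability that threat T occurs
    vulnerability -- fraction of entity A damaged if T occurs (0..1)
    consequences  -- cost to A, in dollars, if T occurs at full force
    """
    return p_threat * vulnerability * consequences

# Hypothetical example: a 1-in-100-year flood, 40% of the facility
# vulnerable, $10M at stake -> about $40,000 of annualized risk.
print(round(risk(0.01, 0.4, 10_000_000), 2))
```

The math is trivial; as the rest of this post argues, the hard part is that two of the three inputs are rarely knowable with any precision.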

Let me say right now that I agree there is value in thinking about threats in these terms. How likely is the threat? How vulnerable am I (or my home, or my community) to T? If T occurs, what are its impacts? The devil is in the details.

If we think about p(T), there are three major problems – one political and two with the math. First, suppose we say that the probability of a major earthquake hitting my home in Aiken, SC, this year is 0.001. (Historically, we know of only one that affected this area, the Charleston earthquake of 1886.) While we know that a repeat of 1886 would likely cause several tens of millions of dollars of damage in our area, any normal politician is going to turn the math around and figure that an earthquake happens only once in a thousand years, so it’s highly unlikely to happen on his/her watch, so let’s ignore it.

This is a trivial example, but even when the probability is higher, say 0.1, politicians may be unlikely to take even minimal action in the face of the 100% probability of facing other problems – unemployment and the accompanying social stresses, falling revenue, finding funding for infrastructure and so on. In this case, the rigor of risk management actually can be counterproductive.
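The political inversion above is really two readings of the same arithmetic. A sketch, using the illustrative Aiken figures – the $30M loss is my assumption, since the text says only “several tens of millions”:

```python
p_quake = 0.001          # assumed annual probability of a repeat of 1886
damage = 30_000_000      # assumed loss in dollars ("several tens of millions")

# The risk-science reading: annualized expected loss.
expected_annual_loss = p_quake * damage
print(round(expected_annual_loss))          # dollars per year

# The politician's reading: chance it does NOT happen during a 4-year term.
p_quiet_term = (1 - p_quake) ** 4
print(round(p_quiet_term, 4))               # "not on my watch"
```

Both numbers are correct; one argues for steady annual mitigation spending, the other for ignoring the threat entirely.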

If we delve a little deeper into the math, we see two other problems. First, the underlying probability distribution is generally assumed to be normal. But is it? For many events, we simply have too few instances to justify that assumption, or to calculate a probability reliably even if it holds. Further, it is not at all clear that threats are distributed normally; they may well be distributed in a more skewed manner (e.g., a Poisson distribution). If so, the low-probability tail of the distribution may carry much more probability than it would if the distribution were normal. Taken together, these mean that for many types of threats (e.g., terrorism), we simply don’t know enough to be able to make a good estimate of the probability.
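To see why the distributional assumption matters, here is a small standard-library sketch – the rate and threshold are arbitrary illustrations, not real hazard data – comparing the upper tail of a skewed Poisson distribution against a normal distribution matched to the same mean and variance:

```python
import math

lam = 2.0   # assumed mean (and variance) of event counts per period
k = 6       # an "extreme" count well above the mean

# Poisson upper tail: P(X >= k) = 1 - sum_{i<k} e^-lam * lam^i / i!
poisson_tail = 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i)
                         for i in range(k))

# Normal tail with the same mean and variance, via the survival function.
mu, sigma = lam, math.sqrt(lam)
normal_tail = 0.5 * math.erfc((k - mu) / (sigma * math.sqrt(2)))

print(f"Poisson tail: {poisson_tail:.4f}  Normal tail: {normal_tail:.4f}")
```

The point is not the particular numbers but that two models agreeing on mean and variance can disagree several-fold about the tail – and the tail is where disasters live.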

The situation is similar for vulnerability: there are rarely enough instances to be able to specify V(A, T) with any confidence. When we begin talking about consequences, though, things become more complex. We can fairly easily determine the direct physical consequences of a destructive event; software such as HAZUS-MH can give us reasonable estimates of what will stand or fall after a hurricane or an earthquake. However, it is much more difficult to predict the indirect consequences – the cascading impacts of disaster. These will be economic and social as well as physical, and in our interdependent world they may extend far beyond Ground Zero – consider the impact of 9/11 on our national economy.

In short, while risk-based thinking can be an important guide to how we think about – and prepare for – disasters, I can’t see a case for granting a greater role to risk management. In fact, I see two other reasons against doing so.

• Though we don’t like to think about it, there are opportunities in disaster. We can rebuild better, circumventing the Tyranny of What Is. Certainly New Orleans paid a high price for its lack of preparedness for Katrina. But if one looks at neighborhoods such as Broadmoor – or even the city as a whole – residents are better off than they were prior to Katrina. This is certainly so physically, and a good case can be made that it is true in economic, social and political terms as well.

• The risk equation above ignores the importance of community resilience – the ability to adapt to adversity. Resilient communities innovate in the face of disaster – no statistical approach can adequately capture this “adaptability,” so preparedness based on the risk equation will consistently over-value prevention.

So I answer my question this way: risk is a useful way to think about threats, but not as useful – and likely counterproductive – for planning for disaster. I believe community resilience offers a better path – preventing the consequences you can prevent, adapting and innovating for those you can’t, and seizing the opportunities inherent in disaster.