A review of the book by Dylan Evans (Simon and Schuster, 2015).
Risk is broadly about the possibility of harm. Is it possible to live without risk? If we avoid dangerous situations, can we avoid risk? The answer is a clear no. Contemporary society is a risk society (as Ulrich Beck expounded in Risk Society). And risk is about uncertainty. The International Organization for Standardization (ISO) standard for risk management defines risk as the ‘effect of uncertainty on objectives’ (1). Uncertainty is pretty tough to avoid, in life and in disaster preparedness.
Uncertainty can be uncomfortable, something we seek to mitigate, clarify, eliminate. Erich Fromm neatly described our common predicament:
There is certainty only about the past — and about the future only as far as it is death.
This book is not necessarily a “disaster medicine” book. But understanding risk is key to disaster preparedness. The principles apply readily to daily life (“what is the chance my new TV will break within the period the extended warranty would cover?”) as well as to catastrophic risks (“how much should a government spend on preventing an unlikely terrorist attack?”). Even in weather forecasting and medical diagnoses, risk intelligence is vital (and much lacking).
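The warranty question is really just an expected-value comparison. Here is a minimal sketch of that reasoning; every number in it (the failure probability, the repair cost, the warranty price) is a hypothetical placeholder of my own, not a figure from the book:

```python
# Expected-value sketch for the extended-warranty question.
# All numbers are hypothetical placeholders, not figures from the book.

def expected_repair_cost(p_failure: float, repair_cost: float) -> float:
    """Expected out-of-pocket cost if we skip the warranty."""
    return p_failure * repair_cost

warranty_price = 120.0   # price of the extended warranty (hypothetical)
p_failure = 0.05         # chance the TV fails within the warranty window (hypothetical)
repair_cost = 600.0      # cost to repair or replace it if it does (hypothetical)

expected_cost = expected_repair_cost(p_failure, repair_cost)
print(f"Expected repair cost without the warranty: ${expected_cost:.2f}")
print(f"Warranty price:                            ${warranty_price:.2f}")
print("Warranty is worth it" if warranty_price < expected_cost else "Warranty is overpriced")
```

On these made-up numbers the warranty costs far more than the expected repair bill, which is the sort of quick check the book treats as everyday risk intelligence.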
Some key points are:
Risk intelligence is uncommon; we are not naturally good at it
Worse than not knowing is overconfidence in what little we do know: not knowing that we don’t know.
“Risk intelligence is vital in our personal and professional lives”
There is a fairly constant current of forgetfulness working against disaster preparedness. The longer it has been since the latest (and publicized) disaster, the less serious it seems to have been. And it works prospectively as well: “It is a recurrent theme that the further away one is from the actual delivery of disaster care, the better prepared one perceives the system to be” (2). But on the other side is the pull of catastrophism, the ‘fallacy of worst-case thinking’, as Evans calls it: “...worst-case thinking leads to terrible decision making. For one thing, it’s only half of the cost-benefit equation.” This might seem like an argument against preparing for disaster, but I see it rather as an argument for preparing for any and all risks, to an intelligent extent.
Here’s a political example. Dick Cheney, describing the need to prepare for the possibility that al-Qaeda could be developing nuclear weapons: “If there’s a 1 percent chance...we have to treat it like a certainty...It’s not about our analysis... It’s about our response.” That’s all good for response capability, but it ignores, uh, well, risk analysis. An important part of disaster preparedness, some would argue.
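To see why treating a 1 percent chance like a certainty is only half of the cost-benefit equation, here is a toy comparison. Every figure is an arbitrary illustration of my own (the probability, the harm, and the countermeasure cost are placeholder assumptions, not numbers from the book or from any real analysis):

```python
# Toy comparison: worst-case thinking vs. probability-weighted (expected value)
# thinking. Every figure here is an arbitrary illustration.

p_event = 0.01                   # assessed chance of the feared event
harm = 1_000_000.0               # damage if it actually happens (arbitrary units)
countermeasure_cost = 50_000.0   # cost of the proposed response

# Worst-case thinking: act as if the event were certain, so the full harm
# counts as the benefit of acting, no matter how unlikely the event is.
worst_case_benefit = harm

# Risk-weighted thinking: discount the harm by its probability first.
expected_benefit = p_event * harm

print(f"Benefit of acting, worst-case view:           {worst_case_benefit:>12,.0f}")
print(f"Benefit of acting, probability-weighted view: {expected_benefit:>12,.0f}")
print(f"Cost of the countermeasure:                   {countermeasure_cost:>12,.0f}")
print("Worth it" if expected_benefit > countermeasure_cost
      else "Not worth it on expected value alone")
```

On these invented numbers, worst-case thinking says spend whatever it takes, while the probability-weighted view says the money would do more good elsewhere; that gap is exactly the missing half of the equation Evans points to.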
A lot of disaster preparedness is about reminding people that, though disasters are uncertain and infrequent, they are not unexpected. Preparation is less costly than reparation, repatriation, reconstruction, and everything else that comes from ignoring the warning signs. We have to thrive on the slopes of uncertainty, since that is the terrain we travel much of the time, globally for sure, and personally, too.
And honing our risk intelligence does not make us immune to adverse events. Bad weather, bad luck, and bad choices will come to all of us. This book is about how we can improve our chances of positive outcomes when we face risk. There is no guarantee.
If you want a guarantee, buy a toaster
-Clint Eastwood, playing Nick Pulovski in The Rookie
References
1. Rubin, Olivier, and Rasmus Dahlberg. "Risk." In A Dictionary of Disaster Management. Oxford University Press. https://www.oxfordreference.com/view/10.1093/acref/9780191829895.001.0001/acref-9780191829895-e-172.
2. Kollek, D. Foreword. In Kollek, Daniel, ed. Disaster Preparedness for Health Care Facilities. PMPH-USA, 2013.
Photos
Ivo Duarte Nogueira on pexels.com
Anas Hinde on pexels.com