Is it stupid to worry about unlikely things?
Why risk assessment brings us cold comfort
Can you believe the stupidity of the Centers for Disease Control calling a halt to the distribution of the Johnson & Johnson vaccine after just a handful of people out of several million developed blood clots? How about those fools continuing to wear masks outdoors when the risk of transmission outside is minuscule? And what about the idiocy of thinking schools should remain closed, despite evidence showing minimal risk of contagion among kids?
To assess risk rigorously is to demonstrate intelligence. To reject such assessments and follow ill-informed fear and anxiety is a mark of madness.
These are the messages we receive every day from a range of credentialed and freelance experts online and in various news outlets.
But reality is not so simple.
In truth, many of us struggle with how to render wise judgments and make reasonable decisions in light of the risk assessments that rain down on our heads from a multitude of authorities. We struggle because, however challenging it is to reckon with various probabilities to determine and assess risk, living with this knowledge of uncertainty, in psychological terms, is vastly harder.
Ordinary (non-quantified) human experience teaches that events in the world usually unfold in one way and very occasionally in another. Most days you don't win the lottery or get hit by a car on the way to work. But very rarely, those things do take place. When they do, we say we were lucky or unlucky.
Everyone at some level understands that abnormally good and bad days will happen at some point in a life. That can be hard to live with, which is why human beings come up with a range of ways to think about such events. Some call it chance or fate. Others put their faith in God's providence or similar extra-human forces at work behind the scenes — karma, perhaps, or just a vague notion that "things happen for a reason."
Because even such ways of making sense of life's uncertainty don't eliminate fear and anxiety entirely, some people go a few steps further and turn to those who claim to possess the means to "read the tea leaves," predicting, seeing, or changing the future unfolding of events. Fortune tellers, mediums, priests leading petitionary prayers — these and many others like them offer guidance and solace in the face of uncertainty.
The turn to probabilistic risk assessment is a relatively small facet of the post-Enlightenment application of science to humanity for the sake of easing the struggles and suffering of life. But it is an important one, and becoming more so all the time. I'm not just talking about literal risk assessors toiling away at insurance companies. I mean everyone who deals in probabilities involving events that impact human life, including pollsters, climate scientists, volcanologists, oncologists, seismologists, epidemiologists, analysts at investment banks, and government officials in a long list of departments and agencies, very much including those tasked with matters of public health. All of these occupations and more devote a large portion of their time to thinking and doing calculations about the likelihood of certain future events taking place, and journalists report on this work when the results seem newsworthy.
But what do we do with this information, this effort to read the tea leaves with scientific precision, to tell us the chance that one event will happen instead of another? Does it help us to live well or better than we otherwise would? I think the answer is less obvious than many of us assume.
If I receive a cancer diagnosis, have the tumor surgically removed, and endure radiation and chemotherapy, I'll then be told the likelihood of a recurrence, based on a number of risk factors: my sex, age, family history, the size and type of tumor, maybe the results of a genetic test. Say I'm told the chance of the cancer returning is 10 percent. That sounds encouraging — enough, perhaps, to move forward with my life, relegating my fear and anxiety to the deep background, something that looms in my mind only occasionally, maybe on the days just before I go back to my oncologist for a semi-annual screening.
That makes it sound like the risk assessment is an unalloyed good. But what if I were told that the chance of recurrence is 40 percent? That's a little less than even odds. Now I'll go forward with a much darker shadow looming over me, and banishing dread from my thoughts will be much harder, and the reprieve more fleeting, than it would have been at a 10 percent risk.
And now imagine I'm told that I have a 60 percent chance of recurrence. In this case, I'll face a greater than even likelihood of confronting another cancer diagnosis, another round of treatment, and another significant chance of dying from the disease.
Should I be grateful for that dire risk assessment? On one level, yes, absolutely: it will motivate me to stay on top of future medical appointments and to follow the doctor's advice on minimizing my risk, marginally increasing the likelihood that any recurrence is detected as early as possible, when the chance of defeating the disease once again is greatest. But on another level — the level of day-to-day quality of life — the answer isn't entirely obvious. I might live out the rest of my days feeling like I'm confronting a death sentence, with the primary uncertainty being the precise time the punishment will be meted out.
But perhaps focusing on this wide range of responses — from gratitude at 10 percent to something approaching an anxiety-induced breakdown at 60 percent — misses an even deeper level of psychological difficulty and distress. Some people who receive the 10 percent prediction are going to have a recurrence, after all. Those who do are the unlucky ones, much like all those who receive a cancer diagnosis in the first place. In this case, though, those who learn of a recurrence will know in a much firmer way, thanks to the risk assessment, just how unlucky they are. They will know that they had only a 10 percent chance of this terrible thing happening, and yet it did.
I suspect that, consciously or not, awareness of this dynamic haunts our thinking about the risk assessments that now permeate our culture and politics. Was the CDC correct to halt the use of the Johnson & Johnson vaccine because a few people developed blood clots? If you imagine that you're one of the vast number of people who could receive the vaccine without ever developing a clot, then the move seems like an example of almost comical over-cautiousness. But if you imagine yourself as one of the unlucky few who would develop a clot, then the move seems like prudence exemplified.
If the vaccine produces a brain aneurysm in one in a million people and we don't know why, some will be bound to imagine themselves in the shoes of that one, incredibly unlucky person. Of course, it's exceedingly unlikely that any particular person will be the one. But someone will be.
Is it rational to fear a fate that we know will befall only a tiny fraction of those who take a vaccine? Maybe not. But human beings aren't rational creatures — at least not always, and least of all when it concerns the greatest of possible misfortunes.
For all of the benefits that follow from applying risk assessment to so many spheres of life, doing so hasn't eliminated thoughts of luck, fate, and fortune, and the anxieties wrapped up with them. In some ways our quantification of uncertainty has intensified those thoughts by making them more concrete.
Thanks to our obsession with quantifying risk, we know more than ever about the uncertainty that surrounds us.