Friday, October 31, 2014

Risk and the Ape

It's no secret that a sizable number of people are very concerned about the risk of Ebola, and that, whether as cause or as effect, the media are obsessive in their coverage, grasping for any aspect of the disease, its history and its treatment, that can be talked about by an ever-changing cast of experts as well as the same familiar faces. They may pause to cover a plane crash or a shooting, but the business of the day is Ebola: those who have it, those who may get it and those you might get it from, whether you're in Bayou Sorrel, Louisiana or Braggadocio, Missouri.

How do we choose what we worry most about? What scares us the most? Psychologists like Slovic, Lichtenstein and Fischhoff have studied the public perception of risk. The public, they argue, judges death by disease and death by accident as roughly equally likely, though disease is about 18 times as likely to kill you as a gun or a car or certainly a policeman. Death by lightning seemed less likely to those in their studies than death from botulism, although lightning is 52 times more likely to get you.

"The Lesson is clear:"  Says psychologist Daniel Kahnemann. "estimates of causes of death are warped by media coverage. The coverage is itself biased toward novelty and poignancy.  The media do not just shape what the public is interested in, but are also shaped by it"  

Rare and unusual occasions make good press in the competitive news and entertainment game, and when the supply runs low and the demand runs high, the more commonplace or quotidian may be dressed up for the prom. Have you turned on CNN recently?

"The world in our heads is not a precise replica of reality"

says Kahneman, understating the obvious. People make judgments and assessments of risk by consulting their emotions, not by examining the numbers. A scary, unusual or gruesome thing looms larger than the flu, which is vastly more likely to kill you than Ebola. That Tylenol overdose accounts for some 33,000 hospitalizations every year and hundreds of deaths simply doesn't enter the equation when we hyperventilate about the "risk" of Ebola or international terrorism or disease-carrying Mexican immigrants. We feel no fear when taking it; we don't even read the label.

Enter the affect heuristic, the snap-judgment mode under which we assess risk by a quicker, emotionally biased and less accurate calculation. As psychologist Jonathan Haidt said:
 "The emotional tail wags the rational dog."
If this doesn't seem pertinent to you, consider Antonio Damasio's studies of people who do not, usually because of brain damage or abnormality, display "appropriate" emotional responses. They tend not to make decisions as well or as beneficially as others. Indeed, one's feelings do seem to enter into decisions we think of as truly rational. Asked to assess risk vs. reward for specific technologies, people let their feelings toward technology determine the outcome. If you don't see genetic engineering as having any benefit at all, if you see danger in using ammonium nitrate from the factory over nitrates from manure, it's probably because of your bias against, or lack of knowledge about, science. If you tend to overlook real dangers from nuclear power, you probably already enjoy and understand technology and science.

Is this a terrible thing? Does it spell some disaster, in that humans cannot expect to make the right decisions based on objective reality? The public, says Slovic, actually makes finer distinctions than the experts who assure us that you won't get Ebola from a certain person or by breathing the same air: finer distinctions between random, unpredictable fatalities and fatalities that, like automobile accidents, follow from voluntary decisions. From this he concludes that we should resist the "rule" of experts.

Others look at examples where relying on experts might have kept popular excess and popular emotion out of public policy, as with the expensive 1989 fiasco over Alar and apples, when people were so afraid of apple juice that they were taking it to toxic waste dumps and making wildly unreasonable claims of conspiracy based on nothing. Popular sentiment quickly snowballed, cascading out of hand and beyond the universe of fact and reason.

Scholars like Timur Kuran and Cass R. Sunstein speak of an "availability cascade," a mechanism through which biases flow into public policy, a self-reinforcing cycle that explains the development of certain kinds of collective beliefs, when explaining things like the Love Canal incident, which somehow didn't kill us all, or even some of us, yet had a colossal effect on public policy and public spending. Does it explain demonstrations insisting that "we can't go to the movies any more" because there was an isolated shooting? In truth, choking on Milk Duds poses a greater risk, but our minds see some qualitative difference between those deaths.

Can it be part of human nature that we either ignore small risks because they are small, or invest them with incredible imminence and attach tremendous fear to them, to the point where we abuse the innocent and the non-dangerous as though we were fleeing a burning theater, every man for himself? We ignore or we panic, and there are no other choices.

So perhaps we're overreacting in a predictable and intrinsically human way when we see immense danger in someone who might have been exposed to Ebola but who, we are assured, isn't contagious. Are we asking of ourselves something we are not really capable of: a rational nature? We evolved in a world where overreacting, or reacting without much thought, can save our lives and doesn't do much harm if the danger turns out to be less than expected. So while this is not exactly a critique of pure reason, I'm still not arguing that we should or even can throw out our inbred nature; I'm suggesting that we accept the ape even while we keep him under close supervision.
