Risk Misconceptions

by Kevin Boehnke

Last week, a man peed into a reservoir that housed part of the municipal water supply for Portland, Oregon. In response, the city’s public health officials drained and flushed the 38-million-gallon reservoir. The City Commissioner, Nick Fish, claimed he “didn’t have a choice”, since he doesn’t “have the luxury of slicing it too thin when there’s a potential risk, however small, to public health”. If you click the link above, you’ll see that the reservoir is uncovered. Birds and other animals surely poop and pee in it, and it is exposed to environmental pollutants in the air and rain. So does this urination event actually make the water unsafe? Is the water truly much dirtier with the pee in it, or is the perception of risk and safety misguided?

Risk

During a normal day, we make hundreds of risk judgments. Whether it is drinking expired milk (it smells fine, right?), deciding to bike or drive to work, or getting up our nerve to go bungee jumping, we consciously make decisions about whether something is safe enough to try. While risk is formally calculated by multiplying the chance of something happening by the cost or benefit of that event (a toy worked example follows the list below), we don’t always take the time to do so. Instead, we use two systems of thought to examine risk: the ‘Experiential system’ and the ‘Analytic system’.

  1. Experiential system: This acts unconsciously and quickly, and is deeply affected by emotional attachments. It ties ideas together, and uses stereotypes to make snap ‘gut’ decisions. It also performs simple thinking, like addition.
  2. Analytic system: This works slowly, examining evidence and consciously appraising risk using logical justifications. Analytic reasoning is employed when focusing on specific details, like writing or doing taxes.
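To make that formal definition concrete, here is a minimal sketch of the expected-cost arithmetic applied to the expired-milk decision. Every probability and cost in it is an invented assumption for illustration, not data.

    # Formal risk calculation: risk = probability of an event x its cost.
    # All numbers below are invented assumptions, not measurements.

    def expected_cost(probability, cost):
        """Expected cost: the chance something happens times how bad it is."""
        return probability * cost

    # Hypothetical scenario: drink the slightly expired milk or toss it?
    p_sick = 0.01          # assumed chance the milk makes you sick
    cost_sick = 200.0      # assumed cost (in dollars) of a day lost to illness
    cost_new_milk = 4.0    # certain cost of replacing the milk

    risk_of_drinking = expected_cost(p_sick, cost_sick)  # 0.01 * 200 = 2.0
    cost_of_tossing = cost_new_milk                      # 4.0, paid for sure

    print(f"Expected cost of drinking: ${risk_of_drinking:.2f}")
    print(f"Certain cost of tossing:   ${cost_of_tossing:.2f}")
    # Under these made-up numbers, the analytic answer is: drink the milk.

The point is not these particular numbers but the procedure: the analytic system multiplies and compares, while the experiential system just sniffs the carton and decides.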

No one actually does this! Instead, we go by our gut. (Comic by Garry Trudeau.)

The experiential system of thought probably evolved first because of its eminently useful function in our past. There were many dangerous animals lurking in the savannas of Africa where humans originated. Imagine that you are living in the wild, and you hear rustling grass and a low growl. What would you do? You wouldn’t thoughtfully weigh the probabilities of whether or not it was a lion, that’s for sure. Instead, you would assume it was a lion and base your response on that (the classic “fight or flight” response). Of course, most times it wouldn’t be a lion, but the cost of assuming it might be is very low, and FAR less than the cost of being gobbled by a lion. Taking those extra 5 seconds to think it over could have been the difference between life and death for our ancestors.
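That asymmetry is easy to put in numbers. Here is a quick sketch in the same expected-cost terms as above; the probability and costs are invented purely to show the shape of the argument.

    # Payoff asymmetry behind the 'assume it's a lion' heuristic.
    # All numbers are invented to illustrate the argument, not measurements.
    p_lion = 0.05            # assume rustling grass is only rarely a lion
    cost_false_alarm = 1     # a wasted sprint: cheap
    cost_eaten = 1_000_000   # being eaten: effectively unbounded

    # Flee every time: you pay the false-alarm cost whenever there is no lion.
    expected_cost_fleeing = (1 - p_lion) * cost_false_alarm   # ~0.95

    # Stop and ponder: you risk the catastrophic cost whenever there IS a lion.
    expected_cost_pondering = p_lion * cost_eaten             # 50,000

    print(f"Expected cost of always fleeing:    {expected_cost_fleeing:,.2f}")
    print(f"Expected cost of stopping to think: {expected_cost_pondering:,.2f}")
    # Fleeing is 'wrong' 95% of the time, yet its expected cost is tiny
    # compared with pausing to weigh the evidence.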

Clearly, the cameraman in this video should’ve been less analytical.

Thus, having a mental shortcut (a heuristic or a bias) that prompts an unconscious first response is logical from an evolutionary standpoint. However, because our brains work this way, we often assess risk subjectively rather than objectively, based more on our experiences and unconscious shortcuts than on our rational decision-making capacities (here’s a list of biases we use on a daily basis). For instance, our innate response to urine and other excretions is disgust, which may be why Commissioner Nick Fish decided to dump 38 million gallons of water.

While useful for avoiding lions, our experiential risk assessment is less useful when judging modern risks like exposures to chemicals. The disconnect between experiential and analytic risk assessment has been termed the “Perception Gap”, and has been used to explain how risk is misjudged in policy.

For example, Bisphenol A (BPA) is a ubiquitous plasticizing agent. BPA was recently banned in baby bottles because it has been shown to have endocrine-disrupting activity, and people were worried about the effects it might have on children. Everyone is exposed to it, and BPA exposure is linked to cancer, diabetes, heart disease, birth defects, and adverse sexual development in children. Sounds scary, right? I don’t want my endocrine system to be disrupted, nor do I want that to happen to babies and children!

Unfortunately, BPA exposure is not limited to baby bottles. BPA is found in receipt paper, canned foods, PVC pipes, and many other products, so banning it in baby bottles lowers BPA ingestion only negligibly. Further, BPA substitutes (e.g. bisphenol S, aka BPS) also have endocrine-disrupting activity and are less well studied than BPA. At this point, the ban merely replaces one toxin with another.*

So, there was a public outcry against BPA because it may harm babies, and an expensive and unhelpful policy was implemented.** This is concerning, since funding for regulation and research is a zero-sum game: if money is used for one thing, it’s unavailable for others.

For public health scientists and policy makers, this is where the Perception Gap is crucial, since research funding is scarce. Although BPA is correlated with many disease outcomes, a causal link between them is still lacking. However, it’s a ‘hot’ topic, and funding agencies have earmarked funds for researchers to study it. But spending money on BPA research means that deadly risks like diarrheal disease (which kills around 2 million people per year, many of them children) may get less funding. That said, BPA is a real risk and useful to study; it may affect development (especially in utero), and it informs general research on endocrine disruption. My point is merely that its visibility may be inflated by misperceptions of its importance.

While the media can contribute to this problem by sensationalizing things that aren’t actually that dangerous, research suggests that:

Alarmist news coverage of risk… is hardly the creator of this underlying instinctive/emotional/affective risk perception system, but is actually a product of it, since the innate human system for gauging what is dangerous and what is not influences members of the media the same way it does members of the scientific and academic communities and the general public.

Thus, an understanding of how humans perceive risk needs to be incorporated into the policy process. One way to do this is to make it harder for people’s biases to drive public health decisions. An example comes from California, where a recent law requires parents who opt their child out of vaccination to explain their reasons to their healthcare provider, and requires the provider to sign an exemption form. This reduces the number of parents who forgo vaccination solely on the basis of their biases by slowing them down and prompting them to think more carefully. Clearly, this won’t stop everyone from opting out, but it gives the analytic reasoning system a chance to kick in.

We assess risk daily, since living is inherently risky. While we make many personal decisions unconsciously, we should be careful about applying that same thinking to decisions affecting other people and society at large. People may be disgusted by pee in their water supply, but they should recognize that it is neither dangerous nor abnormal. Animals would poop in it anyway, the urine is diluted to mere parts per billion (see the quick arithmetic below), and the water is filtered before anyone drinks it! Improving how risk is communicated is necessary to help people make rational decisions. As the saying goes, think before you act.
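For the dilution claim, here is a back-of-the-envelope sketch. It assumes a generous full liter of urine; the actual volume was surely smaller, which only strengthens the point.

    # Back-of-the-envelope dilution estimate for the Portland reservoir.
    # Assumption: a full liter of urine, deliberately overestimated.
    LITERS_PER_GALLON = 3.785

    reservoir_gallons = 38_000_000
    reservoir_liters = reservoir_gallons * LITERS_PER_GALLON  # ~1.44e8 L

    urine_liters = 1.0

    concentration = urine_liters / reservoir_liters
    print(f"Concentration: {concentration:.1e}")                 # ~7.0e-09
    print(f"About {concentration * 1e9:.0f} parts per billion")  # ~7 ppb

Seven parts per billion, before the water is even filtered.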

On another note, what choices do public health officials actually have when evaluating whether to put the public at risk? What are other consequences of our poor risk assessment, and what psychological tools do we use to justify them? Check in next time to see!

*This practice of replacing one toxin with another is not uncommon. For example, you can buy both acetone and non-acetone nail polish remover. Acetone is a solvent and its fumes are dangerous to inhale. However, non-acetone nail polish remover contains ethyl acetate, which is also toxic (although less so).

**Clearly, while our risk assessment capacities influence this process, there are other things going on here besides risk (mis)assessment: funding priorities and policy changes are multi-faceted issues that cannot be summed up simply by talking about risk.
References

Ropeik, David. “The perception gap: recognizing and managing the risks that arise when we get risk wrong.” Food and Chemical Toxicology 50.5 (2012): 1222-1225. DOI

Slovic, Paul. “Perception of risk.” Science 236.4799 (1987): 280-285. link

Viñas, René, and Cheryl S. Watson. “Bisphenol S disrupts estradiol-induced nongenomic signaling in a rat pituitary cell line: effects on cell functions.” Environmental Health Perspectives 121.3 (2013): 352-358. DOI

Kahneman, Daniel. Thinking, fast and slow. Macmillan, 2011. link to summary

Slovic, Paul, et al. “Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk, and rationality.” Risk Analysis 24.2 (2004): 311-322. link

Bernal, Autumn J., and Randy L. Jirtle. “Epigenomic disruption: the effects of early developmental exposures.” Birth Defects Research Part A: Clinical and Molecular Teratology 88.10 (2010): 938-944. link

Curtis, Val, Robert Aunger, and Tamer Rabie. “Evidence that disgust evolved to protect from risk of disease.” Proceedings of the Royal Society of London. Series B: Biological Sciences 271.Suppl 4 (2004): S131-S133. DOI