How Risky Is It, Really?

Why Our Fears Don't Always Match the Facts

McGraw-Hill


Recommendation

This lively, honest book is a pleasure to read and easy to digest. Journalist David Ropeik demystifies the common mental and social mechanisms humans use to evaluate danger. He explains how people often misrepresent and misunderstand possibly perilous circumstances, and he tells you how to weigh potential risk more accurately. Some explanations run long and some “risk perception factors” overlap, but Ropeik’s many insights are so instantly applicable that any reader will find them useful. BooksInShort recommends his book to anyone involved in risk management and social policy, and to all consumers of the news.

Take-Aways

  • People respond to risks emotionally and rationally. The emotional response is quicker.
  • How you react to a specific hazard depends on how it is framed conceptually.
  • Your cultural background and orientation influence how you perceive risk.
  • Most people and societies estimate risks badly. That, in itself, creates risk.
  • A threat is more frightening if it is personalized, alien, artificial, new, seemingly unfair, aimed at children or dealt with badly by the authorities.
  • Fears abate when people have apparent choices, when the risk is familiar, when the activity also carries a balancing benefit or when the threat is chronic, not “catastrophic.”
  • Your perception of risk will change based on how you frame certain factors, such as “risk versus benefit,” “control,” “choice,” “uncertainty,” “pain,” “personification” and “fairness.”
  • The media heighten your sense of danger by stressing drama and oversimplifying events.
  • To evaluate risk, keep an open mind, think things through and seek varied, unbiased data.
  • Also, reframe information, calm down, rest, take your time and be aware of the individual and group risk-perception factors that make potential harm seem more or less likely.
 

Summary

Perception and Risk

Everyone feels fear sometimes. If someone told you that a contaminant in your workplace could give you cancer, you might leave your office, even if you know that the risk is incredibly small – and even if you already do far more dangerous things, like smoking. Most people who try to evaluate risk suffer a dangerous “perception gap,” though they are not aware of it. This gap can lead you to evaluate possible harm inaccurately. To grasp such a situation correctly, you must know how human beings respond to risk. If you find yourself in a risky situation, before you can even think it through, you’ll feel a surge of fear. Your body’s “risk response” system will combine your “feelings and intuition and gut reactions” with your “deliberate, conscious thinking.” Evolution shaped this system, so you react to contemporary threats the way your ancestors responded to physical threats. This can lead to “risky personal behavior” and to public policies that address perceptions of risk rather than society’s real hazards.

The Biology of Fear

When you’re frightened, your body reacts: Your heart beats faster, you breathe rapidly, your digestion shuts down, and your hearing and vision become narrowly focused. Your body also releases glucose to give you energy to act. These reactions are part of the automatic “fight or flight or freeze response,” your body’s attempt to do whatever it takes to keep you alive. This happens faster than you can think about it. In fact, you’re “hard-wired to fear first, and think second.” The portion of your brain called the amygdala handles danger input before the slower processes of rational thought kick in; what’s more, fear can trump rational thought.

“Any risk feels bigger if you think it could happen to you.”

Charles Darwin reported jumping back in fear when a snake behind a thick pane of glass at a zoo leapt at him. Imagining a danger may also trigger this kind of fear response. You can condition yourself to be less afraid through repeated exposures to a specific risk (like seeing a snake several days in a row). Frightening experiences burn themselves into your memory, coloring your future reactions. Some dread is “built-in”: Fear of spiders is common, even among people who’ve never had a bad experience with one. People are generally adept at identifying scary faces, which makes sense, given humanity’s intensely social nature. However, more complicated perils, such as climate change, are more difficult to recognize. And, if you have an emotional association with some words, such as “hospital” or “gun,” you won’t be able to think clearly about them.

The Limits of Reason

Some people believe that rationality is the best way to deal with risk. They say you should just collect all the information and think things through logically. That approach has three problems. First, before you can think, your emotions will already have responded to the predicament. Second, you can’t know everything about any situation, so you must make decisions with partial information – unless you just never decide. Third, even when you have all the crucial facts, your mind shifts to deal with them according to how they’re presented or framed. Rather than operating with sweet reason, humans must function within “bounded rationality,” and must make “decisions without perfect knowledge” by using an array of “mental shortcuts” or heuristics.

“If there is a benefit, we play down any associated risk in order to get it. If there is a risk, we play down potential benefits in order to protect ourselves.”

Some perceptions are based on “biases,” and some are reactions to the way information is given, verbally, emotionally or mathematically. People make choices and assign risk based on how they receive data. When researchers asked subjects to try two identical beef samples, one tagged “75% lean” and one marked “25% fat,” most participants found that the “lean” beef was “less greasy and tastier, though the information about the fat content of both samples was identical.” Another human tendency is to deal with situations by “categorization,” using stereotyping, or “representativeness,” to apply experience with one item in a category to the entire group. Vivid test results or published studies can be very persuasive, even if the study sample was too small to be valid. People also reason badly when dealing with probabilities, in that they detect patterns where none exist, and they expect random activities to balance out somehow. “The endowment effect,” another pervasive mental shortcut, causes you to value something you own (even if you just acquired it) more than something you don’t own.

“We have evolved to be afraid of the dark. For most of human history, the dark was a time when we were in the food chain, not at the top of it.”

“Anchoring and adjustment” often come into play when numbers are involved. Once you are made aware of a number, it tends to anchor your future processing of other numbers, and you adjust your estimates to accommodate it. This is part of the “innumeracy” that plagues humanity. Most people aren’t good with numbers, so they estimate badly. Awareness of a threat, and how readily it comes to mind or memory, also shapes your perceptions. If you’ve experienced something painful before, or if the media make you extra alert to some hazard, you’re more likely to perceive it as a personal danger, regardless of the facts. Optimism also affects your judgment and sense of well-being, often leading to conclusions that are too rosy.

Factors Influencing Your Perception of Risk

People use mental shortcuts or “risk perception factors” to weigh danger. Your perspective makes taking a chance seem more or less scary, so knowing these factors will help build your judgment:

  • “Trust” – Lack of trust makes a situation more threatening. When leaders breach trust, as Japan’s officials did when they declared the nation’s food supply free of mad cow disease and then had to recant, those affected grow more scared. Withholding meaningful data, lying, and denying mistakes and misdeeds all undercut trust.
  • “Risk versus benefits” – While the likelihood of something bad happening might be the same for you as for anyone else, individuals evaluate danger differently, in part by weighing the hazards against benefits. They value diverse elements in different situations, so everyone has a separate, evolving opinion about the risk-versus-benefit trade-off.
  • “Control” – Regardless of any real jeopardy, you will feel less fear if you believe you control your situation, whether you do or not. Many Americans chose to drive long distances, instead of flying, after September 11, 2001. Flying remained safer than driving, but driving seemed safer in light of the terrorist attacks. In the three months following 9/11, the U.S. experienced 1,018 more auto accident deaths than the statistical norm.
  • “Choice” – Situations feel more dangerous when they are imposed upon you and less dangerous when you select them. Thrill seekers choose to bungee jump, an activity that would seem very risky if someone forced them to do it. When the U.S. government designated Yucca Mountain, Nevada, as a nuclear waste storage site, residents fought it. In contrast, Finland let communities help select a waste site. The town of Eurajoki, which already had two reactors, sought and won the site’s jobs and tax benefits.
  • “Natural or human-made?” – Some people fear anything artificial or synthetic. The idea that a product is natural reassures them, even if it’s really more harmful than a human-made equivalent. Most governments regulate medication, but many don’t regulate “natural” products as stringently. A 1994 study of “Ayurvedic herbal medicines” in Boston “found that one sample in five...contained enough lead, mercury or arsenic” to be hazardous.
  • “Pain and suffering” – Individuals fear dangers they think carry a lot of agony, such as “death by shark attack,” more than hazards that seem to bring less anguish, like heart disease – even though heart disease kills far more people than sharks.
  • “Uncertainty” – When you don’t know what’s going on or how to make yourself safer, you’re more likely to be scared than when you face a danger you understand. Neighbors of the Three Mile Island nuclear plant near Harrisburg, Pennsylvania, were extra fearful during its 1979 partial meltdown because they didn’t know their real risk.
  • “Catastrophic or chronic” – History’s worst plane crash killed 583 people in the Canary Islands in 1977. Heart disease claims that many victims, just in the U.S., “every eight hours.” But events that kill many individuals suddenly seem more frightening than slow risks like illnesses.
  • “Can it happen to me?” – Regardless of probability, you’ll feel more fear if you think something dangerous could affect you personally. Take the U.S. public’s sudden fear of terrorism after 9/11. Victims, including Americans, had been dying in terrorist attacks elsewhere for decades, but when terrorists killed Americans in the U.S., that danger became personal – and more real.
  • “New or familiar?” – A novel risk seems more threatening than a familiar one. West Nile virus is much like other deadly mosquito-borne illnesses, but it seemed mysterious and, thus, more frightening.
  • “Risks to children” – A menace that might scare you a little gets much more alarming if it threatens children. This is built into humanity on a “biological level.” Evolutionarily, you want your genes, and mankind overall, to survive; that requires protecting children.
  • “Personification” – An abstract risk expressed as a number, such as battlefield casualty statistics, affects you intellectually. Yet, if you personalize a dangerous circumstance with a face, a name or a story, it hits you harder. Photographs of soldiers’ flag-draped coffins bring a war home in ways that statistics do not.
  • “Fairness” – If a car hits a child, that’s tragic. But if a car hits a blind child, that’s especially unfair and, thus, seems more frightening.

Collective Influences on Risk Perception

In addition to these individual risk-evaluation parameters, some shared, group-based factors also affect how you see dicey situations. These influences stem from your society, community, or cultural orientation and allegiances. The most basic persuasive experience is a community-specific event. For example, a number of fires in a single town can create a sense of shared peril and awareness. Groupthink is a related factor. When a danger threatens a group, “people tend to make quick and effective decisions about risk” based on solidarity. Individuals who disagree with each other will pull together to defend their group or family. This makes evolutionary sense – a tribe may need to act quickly together to fight an enemy – but it doesn’t work well with complex, conceptual threats.

“When we get risk wrong, when we are more afraid or less afraid than the facts suggest we need to be, the Perception Gap becomes a risk in and of itself.”

“Cultural cognition,” a more subtle collective influence, deals with the way a culture shapes its members’ beliefs and guides their reactions. Scholar Dan Kahan and his fellow researchers crystallized these tendencies into two pairs of “worldviews.” People are either “individualists” or “communitarians,” and they also are either “hierarchists” or “egalitarians.” These intersecting viewpoints are more accurate indicators than gender or political loyalties when it comes to predicting how people will react to perceived risk. Individualists will accent personal responsibility and limited government, while communitarians will emphasize society’s needs and collective action. Hierarchists will say that society should use social rules to distribute rights and duties “differently.” Egalitarians will favor equal distribution of rights and goods.

“A thousand fearful images and dire suggestions glance along the mind when it is moody and discontented with itself. Command them to stand and show themselves, and you presently assert the power of reason over imagination.” (Sir Walter Scott)

The media exert a pervasive shared influence. In a complex, global world, most people view society through the lens of the media. Yet, while the professional news media posit an ideal of objectivity, media outlets – as businesses – function in ways that can distort your evaluations. Journalists shape even accurate reports to be exciting and newsworthy. They simplify situations, take things out of context and focus on threats.

Closing the Gap in Your Perceptions

Fortunately, you can reduce your “perception gap.” Start with the media. The news often reports danger in the most stirring way. If one person in a million died from a disease last year, and this year, two people died from it, the news will report this as a 100% increase. Seek both the “relative risk” (the percentage change) and the “absolute risk” (the actual number of people affected). When the news gives a threat a human face, step back and ask if that personalization is accurate. When it reports a danger, determine if you’re actually in harm’s way, how great the likelihood of harm is and how large an exposure must be to hurt you. Seek unbiased news sources that represent different ideological positions. Work to open your mind.
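To make the relative-versus-absolute distinction concrete, here is a minimal sketch in Python (not from the book; the numbers simply restate the one-in-a-million example above):

    # Illustrative figures only: a disease that killed 1 person per million
    # last year and 2 per million this year.
    population = 1_000_000
    deaths_last_year = 1
    deaths_this_year = 2

    # Relative risk change: the percentage increase the news tends to report.
    relative_increase = (deaths_this_year - deaths_last_year) / deaths_last_year * 100

    # Absolute risk: the actual share of people affected this year.
    absolute_risk = deaths_this_year / population

    print(f"Relative increase: {relative_increase:.0f}%")   # 100%
    print(f"Absolute risk: {absolute_risk:.6f}")            # 0.000002, i.e., 2 in a million

The same change looks alarming as a relative jump but remains tiny as an absolute risk, which is why the book advises seeking both figures.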

“Fear grows in darkness; if you think there’s a bogeyman around, turn on the light.” (Dorothy Thompson)

More generally, deal with your perception gap by acknowledging it, and by taking the factors that influence it into account. When you’re stressed or tired, you can’t think about risk as clearly as when you are rested and calm. Take time to evaluate your situation and learn more. Ask active questions about any hazard. Since most people are bad with numbers, reframe numerical statements to make sure you’re interpreting them fairly and accurately. Practice trade-offs: Do any benefits balance the potential dangers? “Risk perception factors” are at work; be aware and try to evaluate them objectively.

About the Author

David Ropeik is the co-author of Risk and has written for major metropolitan newspapers.