The Psychology of Judgment and Decision Making

Book: The Psychology of Judgment and Decision Making, by Scott Plous. McGraw-Hill.
Recommendation

Making the right decisions is seldom easy. Situations change and choices confound. Faulty perceptions and biases can block clear thinking and undermine the ability to weigh alternatives rationally. As U.S. Supreme Court Justice Benjamin N. Cardozo explained 90 years ago, “We may try to see things as objectively as we please. Nonetheless, we can never see them with any eyes except our own.” This is the vexing paradox involved in making decisions: People who are in the process of deciding cannot always trust their own perceptions and thought processes. Psychologist Scott Plous, winner of numerous awards and honors, examines decision making in this rigorously scientific yet mostly accessible book, itself an award winner. BooksInShort believes it will interest decision analysts, researchers, psychologists and strategists, as well as readers who want to know why they may make poor decisions and how to make better ones.

Take-Aways

  • Context and perception have an outsized influence on people’s decisions.
  • People often make judgments that reduce their “cognitive dissonance,” or apparent internal contradictions.
  • Memories are often flawed and can interfere with logical decision making.
  • The order of information in a question and the way it is framed affect the response.
  • Mathematician John von Neumann and economist Oskar Morgenstern described decision-making principles that people would apply if they were rational.
  • Daniel Kahneman and Amos Tversky’s “prospect theory” is a better guide to how people really make decisions.
  • People use mental shortcuts, called heuristics, to form judgments.
  • Heuristics are helpful, but they can prejudice clear, logical thinking.
  • People often base their individual judgments on how others will react to them.
  • To make better decisions, keep accurate records, carefully estimate risks and choose your actions based on probable outcomes.
 

Summary

It’s All About Context

Tens of thousands of choices, from hundreds of supermarket products to dozens of TV shows, challenge modern consumers. How do they handle decisions on these and life’s other options? Based on a wide variety of experiments and studies, psychology researchers have reached some scientific conclusions that can help you make better decisions. First, understand that every decision completely depends on context, which influences how you see the factors you must weigh. A famous experiment indicates that “selective perception” often colors people’s judgments. In it, a machine quickly flashes pictures of five playing cards: the five, three and ace of hearts and the five and seven of spades. Researchers ask viewers if any card seems odd. Although the normally red heart on the three of hearts is black, most people see nothing strange, due entirely to context and selective perception.

“Because judgments are so easily influenced by question wording and framing, the safest course of action is to elicit them in a variety of ways and compare the results.”

Related judgment problems stem from the general human inclination to avoid “cognitive dissonance,” or “psychological inconsistencies.” This parable explains: A gang of young hooligans gathered in front of a Jewish tailor’s shop yelling, “Jew! Jew!” The first day this happened, the tailor told the gang members he would pay them each a dime to yell “Jew!” at him. He then gave each thug a dime, implying that he was happy with the boys’ actions and, thus, confusing them about their motives. The next day, the gang came back and screamed again. This time, the tailor told the bullies he could pay them only a nickel to shout at him. He handed out nickels. The day after, they were back, yelling “Jew!” This time, the tailor said he could afford to pay each taunting boy only a penny. They got angry. “Too bad,” said the tailor. “That is all you get. Take it or leave it.” “You are crazy if you think we will yell for only a penny,” said the head of the gang, leading his hoods away in a huff. What happened? The tailor created psychological dissonance in each hooligan’s mind by changing the “gang’s motivation from anti-Semitism to monetary reward.” When the reward was no longer sufficient, they left. The tailor changed their objective from spewing bile to earning dimes, making it inconsistent – in terms of their altered motives – for the gang members to yell at him for free.

Don’t Trust Your Memory

Flawed memory can interfere with logical, emotion-free decision making. Many individuals mistakenly assume that memories are mental copies of previous experiences. They are not. In fact, memories are mental reconstructions of past events. And humans often reconstruct their personal histories in ways that have little to do with what actually happened. Indeed, research indicates that memories can be totally inconsistent with actual past events. (To get around this problem, keep good records.) Context, which affects all human perceptions, also plays a major role in memory formation. Real estate agents know that they will have a better opportunity to sell a home if they can contrast it positively (that is, place it in context) against another home that needs a lot of work or is too costly. Context plays such a large role in perception that the very idea of “context-free judgment” is, for all practical purposes, meaningless.

“Judgment and decision making depend heavily on situation-specific factors, such as how much time a decision maker has available and what mood the decision maker is in.”

Even when people answer mundane questions, context plays a large role in determining their responses. With some queries, people gladly supply answers (“pseudo-opinions”) on subjects even though they know nothing about them. In one experiment, researchers asked hundreds of college students to rate the degree of unity among the citizens of various nations. These citizens included “Danireans, Pireneans and Wallonians.” Four out of five students ventured an opinion on how closely such people felt toward their fellow citizens. Of course, Danireans, Pireneans and Wallonians do not exist. Survey experts know that the way they structure and frame questions can dramatically affect the responses. The order in which you ask questions and even the order of the words in a question affect how people respond. Remember that when you consider survey results.

Decision-Making Theories

Throughout history, decision theory has occupied great minds, from mathematicians Nikolaus and Daniel Bernoulli in the 18th century to mathematician John von Neumann and economist Oskar Morgenstern in the 20th century. In 1947, von Neumann and Morgenstern refined Bernoulli’s “expected utility theory” by describing how individuals would decide if they always acted rationally (which they do not). They found that six basic principles apply to logical decision making:

  1. “Ordering of alternatives” – Rational decision makers can compare any two alternatives, either preferring one or being indifferent between them.
  2. “Dominance” – Rational decision makers never choose a strategy that is “dominated” by another strategy. The option they choose is better than the alternatives in one or more aspects and not worse in any aspect.
  3. “Cancellation” – Identical features shared by different options cancel each other out. Rational decision makers ignore them and focus on the aspects where the choices differ.
  4. “Transitivity” – The rational person who likes Result A more than Result B, and prefers Result B to Result C, will also like Result A better than Result C.
  5. “Continuity” – Given sufficiently favorable odds, a rational person will prefer a gamble between a wonderful outcome and a wretched one over a guaranteed “intermediate outcome.”
  6. “Invariance” – The way alternatives are presented should not affect rational decision makers’ preferences.

“If you are like most people, your perceptions are heavily influenced by what you expect to see.”

In 1954, Leonard Savage proposed a notable variation, “subjective expected utility theory,” which factored “people’s subjective probabilities” into expected utility theory to account for individual human variables. Numerous other decision researchers have tried over the years to supplant expected utility theory, though it remains the primary theory of decision making. In 1959, Duncan Luce added the idea of “stochastic” choice, which takes “randomness” into account and helps explain preferences. It treats choices as probabilities, not fixed options, making it possible to answer such questions as, “Why do people prefer salad one day and soup the next?” In 1979, Daniel Kahneman and Amos Tversky developed “prospect theory,” which does a better job of describing how people actually decide because it incorporates the notion of loss aversion (people dislike losing money or other goods more than they enjoy equivalent gains), a major factor in decision making.
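Loss aversion, the core of prospect theory, is often illustrated with the value function Tversky and Kahneman later estimated empirically. The parameters below (an exponent of 0.88 and a loss-aversion coefficient of 2.25, their 1992 median estimates) are not from this summary; the sketch only illustrates why losses loom larger than gains:

```python
# Prospect theory value function, using Tversky & Kahneman's 1992
# median parameter estimates (alpha = 0.88, loss-aversion lambda = 2.25).
ALPHA = 0.88
LAMBDA = 2.25

def value(x: float) -> float:
    """Subjective value of a gain or loss of size x (x < 0 means a loss)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

gain = value(100)    # subjective value of winning $100 (~57.5)
loss = value(-100)   # subjective value of losing $100 (~-129.5)

# A $100 loss hurts more than twice as much as a $100 gain pleases.
assert abs(loss) > gain
print(f"value(+100) = {gain:.1f}, value(-100) = {loss:.1f}")
```

With these parameters, a loss of a given size feels about 2.25 times as bad as a gain of the same size feels good, which is why most people refuse even-odds bets with equal stakes.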

Common Heuristics and Associated Biases

Heuristics are “general rules of thumb” that people use in decision making. These rules save time when you weigh options and usually work fairly well, but they can lead to biased thinking. For example, take the “representativeness heuristic,” which judges how closely one thing resembles or stands in for another. To understand a common bias problem with this heuristic, consider Linda, who is committed to social justice and takes part in demonstrations against nuclear power. Which is more likely: 1) “Linda is a bank teller” or 2) “Linda is a bank teller and an active feminist”? A majority of research study respondents chose the second option, though logic dictates that having two coinciding traits (“bank teller” and “feminist”) is less probable than having either trait alone.
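The logic behind the Linda problem is the conjunction rule: the probability of two traits together can never exceed the probability of either trait alone. A minimal sketch, using probabilities invented purely for illustration:

```python
# Conjunction rule: P(A and B) <= P(A), whatever the numbers are.
# These probabilities are made up solely for illustration.
p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.60   # P(feminist | bank teller), assumed high

# Probability of the conjunction "bank teller AND active feminist".
p_both = p_teller * p_feminist_given_teller

# Even with a high conditional probability, the conjunction is
# necessarily no more likely than the single trait alone.
assert p_both <= p_teller
print(f"P(teller) = {p_teller:.2f}, P(teller and feminist) = {p_both:.2f}")
# -> P(teller) = 0.05, P(teller and feminist) = 0.03
```

However representative the feminist detail feels, multiplying by a conditional probability (at most 1) can only shrink the result.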

“We do not first see, then define; we define first and then see.” (Walter Lippmann)

Kahneman and Tversky note, “As the amount of detail in a scenario increases, its probability can only decrease steadily, but its representativeness and...apparent likelihood may increase.” That is, do not let very detailed scenarios mislead you. The more specific the scenario, the less likely it is to occur, even if all the details seem plausible.

“Virtually all current theories of decision making are based on the results of research concerning biases in judgment.”

The “availability heuristic” leads people to judge an event’s probability by how easily examples of it come to mind. It, too, is subject to bias. Some events that do not actually happen more often than others are nonetheless recent, easy to recall or highly emotional, so they seem more “available” – and thus provide an apparently sound basis for making judgments. For example, when researchers ask people which is a “more likely cause of death in the U.S.” – a shark attack or falling pieces of airplane engines – most people guess shark attacks. Such attacks receive lots of press and popular attention (remember Jaws?). Nevertheless, statistics show that you are 30 times more likely to die because of falling airplane parts than because of a shark attack.

“Probability and risk are everywhere: in the brakes of our cars, in the clouds above our heads, in the food we eat and in the trust we place in others.”

This wrinkle in the way that availability biases how people think about frequency has important ramifications. For example, if people are not mindful of actual mortality rates regarding such dangers as stomach cancer, they will be less inclined to live healthier lifestyles. To counter that lack of attention, a health advocacy group could post billboards reading: “This year, more people will die from stomach cancer than from car accidents.” Armed with this data, members of the public presumably would be more likely to take steps to lessen their risks of stomach cancer.

“When groups are cohesive and relatively insulated from the influence of outsiders, group loyalty and pressures to conform can lead to...‘groupthink’.”

You can reduce the biases that filter your thinking when you estimate risk or judge probabilities. Keeping accurate records helps. If you know precisely how often something occurs, you can minimize your “availability biases.” Also, avoid “wishful thinking”: People assume more often that desirable events will occur than that undesirable ones will. To avoid this common bias, ask respected, knowledgeable third parties for their assessments of your choices on major decisions.

What Other People Think

The way the people around them think and react greatly affects how individuals make decisions and judgments. In 1985, decision analyst Philip Tetlock advocated viewing decision makers as “‘politicians’ who are accountable to their ‘constituents’,” that is, their friends, relatives and co-workers. Tetlock says that before these politicians make decisions, they routinely ask, “How will others react if I do this?”

“Judgment and decision research is conducted by human beings who are prone to many of the same biases and errors as their experimental subjects.”

In 1982, psychologist Irving Janis used the term “groupthink” to explain how “group loyalty and pressures to conform” can shape people’s decisions. He described groupthink as “a deterioration of mental efficiency, reality testing and moral judgment, as a result of in-group pressures.” When President John F. Kennedy decided to invade Cuba based on the ill-conceived recommendations of his advisers, he was influenced by groupthink.

Mental Traps

Overconfidence often leads to judgment errors. The explosion of the U.S. space shuttle Challenger provides sad proof. Prior to the launch, NASA estimated that the shuttle’s risk ratio was “one catastrophic failure in 100,000 launches.” To mitigate overconfidence when making decisions, “Stop to consider why your judgment might be wrong.” Such a mental recalibration can protect you from making potentially dangerous decisions.

“When time pressure is high, decision makers use simplifying strategies, attend to a small number...”
In 1948, sociologist Robert Merton defined a “self-fulfilling prophecy” as “a false definition of the situation evoking a new behavior which makes the originally false conception come true.” A false prediction becomes accurate precisely because it was made. People who make self-fulfilling prophecies may claim that their misconceptions were right because of the outcome – even though the misconception provoked that outcome. Such reasoning can be insidious, particularly in the area of racial stereotypes. As a preventive measure, do not seek “confirming evidence” to validate initial negative judgments. People have a strong tendency to work hard to confirm their faulty preconceptions – a tendency called “confirmation bias.”

Is Everyone Irrational?

When you consider the many common mental errors and biases that send judgment off track, it may seem that you should make decisions by flipping coins or throwing darts. But rationality, at least in decision making, does not mean just thinking correctly; it also means choosing the option that enables the best outcome. Because what counts as the best outcome varies so much from person to person and situation to situation, no onlooker can determine who is making the most rational decisions.

“Judgment and decision research is subject to a paradox: If its findings are valid (i.e., biases and errors exist), then its findings are, to some degree, subject to biases and errors.”

An outside observer does not always know what “transient and often equivocal situational factors” influence someone else’s decision. Who knows what perceptual or contextual issues may have biased someone’s judgment? Trying to understand why people decide as they do can be baffling. Researchers have tried to create “debiasing techniques” to “reduce biases and errors in judgment.” One such tactic calls for always considering alternative perspectives. To make better decisions, apply this rule to your judgments. That leaves one paradox remaining: If everything researchers found about faulty judgment is true, then their findings are inherently contestable.

About the Author

Psychologist Scott Plous, Ph.D., a business and political consultant, teaches psychology at Wesleyan University in Connecticut. Winner of a MacArthur Foundation “genius” grant, he has studied the psychology of the nuclear arms race, and ethical issues about animals and ecology.