It's All About Context
Tens of thousands of choices, from hundreds of supermarket products to dozens of TV shows, challenge modern consumers. How do they handle decisions on these and life's other options? Based on a wide variety of experiments and studies, psychology researchers have reached some scientific conclusions that can help you make better decisions. First, understand that every decision depends entirely on context, which influences how you see the factors you must weigh. A famous experiment indicates that "selective perception" often colors people's judgments. In it, a machine quickly flashes pictures of five playing cards: the five, three and ace of hearts, and the five and seven of spades. Researchers ask viewers if any card seems odd. Although the normally red heart on the three of hearts is black, most people notice nothing strange, due entirely to context and selective perception.
"Because judgments are so easily influenced by question wording and framing, the safest course of action is to elicit them in a variety of ways and compare the results."
Related judgment problems stem from the general human inclination to avoid "cognitive dissonance," or "psychological inconsistencies." This parable explains: A gang of young hooligans gathered in front of a Jewish tailor's shop yelling, "Jew! Jew!" The first day this happened, the tailor told the gang members he would pay them each a dime to yell "Jew!" at him. He then gave each thug a dime, implying that he was happy with the boys' actions and, thus, confusing them about their motives. The next day, the gang came back and screamed again. This time, the tailor told the bullies he could pay them only a nickel to shout at him. He handed out nickels. The day after, they were back, yelling "Jew!" This time, the tailor said he could afford to pay each taunting boy only a penny. They got angry. "Too bad," said the tailor. "That is all you get. Take it or leave it." "You are crazy if you think we will yell for only a penny," said the head of the gang, leading his hoods away in a huff. What happened? The tailor created psychological dissonance in each hooligan's mind by changing the gang's motivation "from anti-Semitism to monetary reward." When the reward was no longer sufficient, they left. The tailor changed their objective from spewing bile to earning dimes, making it inconsistent, in terms of their altered motives, for the gang members to yell at him for free.
Don't Trust Your Memory
Flawed memory can interfere with logical, emotion-free decision making. Many individuals mistakenly assume that memories are mental copies of previous experiences. They are not. In fact, memories are mental reconstructions of past events. And humans often reconstruct their personal histories in ways that have little to do with what actually happened. Indeed, research indicates that memories can be totally inconsistent with actual past events. (To get around this problem, keep good records.) Context, which affects all human perceptions, also plays a major role in memory formation. Real estate agents know that they will have a better opportunity to sell a home if they can contrast it positively (that is, place it in context) against another home that needs a lot of work or is too costly. Context plays such a large role in perception that the very idea of "context-free judgment" is, for all practical purposes, meaningless.
"Judgment and decision making depend heavily on situation-specific factors, such as how much time a decision maker has available and what mood the decision maker is in."
Even when people answer mundane questions, context plays a large role in determining their responses. With some queries, people gladly supply answers ("pseudo-opinions") on subjects even though they know nothing about them. In one experiment, researchers asked hundreds of college students to rate the degree of unity among the citizens of various nations. These citizens included "Danireans, Pireneans and Wallonians." Four out of five students ventured an opinion on how closely such people felt toward their fellow citizens. Of course, Danireans, Pireneans and Wallonians do not exist. Survey experts know that the way they structure and frame questions can dramatically affect the responses. The order in which you ask questions, and even the order of the words in a question, affects how people respond. Remember that when you consider survey results.
Decision-Making Theories
Throughout history, decision theory has occupied great minds, from mathematicians Nikolaus and Daniel Bernoulli in the 18th century to mathematician John von Neumann and economist Oskar Morgenstern in the 20th century. In 1947, these two geniuses refined Bernoulli's "expected utility theory" by describing how individuals would perform if they always acted rationally (which they do not) when making decisions. They found that six basic principles apply to logical decision making:
- "Ordering of alternatives": This presumes people can weigh and rank their choices logically.
- "Dominance": Rational decision makers never choose a strategy that is "dominated" by another strategy. The option they pick must be better than the alternatives in one or more respects and worse in none.
- "Cancellation": Traits that different options share cancel each other out. Rational decision making requires people to ignore them and focus on the respects in which the choices differ.
- "Transitivity": The rational person who likes Result A more than Result B, and prefers Result B to Result C, will also like Result A better than Result C.
- "Continuity": Given sufficiently favorable odds, a rational person would accept a gamble between a wonderful outcome and a wretched one, instead of picking a set "intermediate outcome."
- "Invariance": A rational decision maker's preferences should not change with the way the options are presented or described.
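The theory these principles support reduces to a simple calculation: weight each outcome's utility by its probability. A minimal sketch in Python, in which the dollar amounts and the square-root utility function are invented purely for illustration:

```python
# Sketch of an expected-utility calculation in the spirit of von Neumann
# and Morgenstern. The gambles and the utility function are hypothetical.

def expected_utility(gamble, utility):
    """gamble: list of (probability, outcome) pairs whose probabilities sum to 1."""
    return sum(p * utility(x) for p, x in gamble)

def u(x):
    # A concave utility function: each extra dollar is worth a bit less.
    return x ** 0.5

sure_thing = [(1.0, 400)]              # a guaranteed $400
risky_bet = [(0.5, 900), (0.5, 100)]   # coin flip between $900 and $100

# The bet's expected *money* is higher ($500 vs. $400), but with concave
# utility both options yield the same expected utility (20.0), so this
# agent is indifferent -- the standard picture of risk aversion.
print(expected_utility(sure_thing, u))  # 20.0
print(expected_utility(risky_bet, u))   # 20.0
```

The concave utility curve, not the raw dollar amounts, is what makes the guaranteed option competitive with the higher-paying gamble.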
"If you are like most people, your perceptions are heavily influenced by what you expect to see."
In 1954, Leonard Savage proposed a notable variation, "subjective expected utility theory," which factored "people's subjective probabilities" into expected utility theory to account for individual human variables. Numerous other decision researchers have tried over the years to supplant expected utility theory, though it remains the primary concept in decision making. In 1959, Duncan Luce added the idea of "stochastic" choice, which takes "randomness" into account and helps explain preferences. It treats choices as probabilities, not fixed options, making it possible to answer such questions as: Why do people "prefer salad one day and soup the next"? In 1979, Daniel Kahneman and Amos Tversky developed "prospect theory," which does a better job of describing how people actually decide because it incorporates the notion of loss aversion (people don't like to lose money or other goods), a major factor in decision making.
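Loss aversion can be made concrete with prospect theory's value function, which weighs losses more heavily than equal-sized gains. A hedged sketch; the curvature (0.88) and loss-aversion (2.25) parameters are commonly cited estimates from Tversky and Kahneman's later work, not figures from this summary:

```python
# Sketch of a prospect-theory-style value function. Gains and losses are
# valued relative to a reference point, and losses loom larger than gains.
# Parameter values are illustrative assumptions.

def prospect_value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain (x >= 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha   # lam > 1 encodes loss aversion

gain = prospect_value(100)    # pleasure of gaining $100
loss = prospect_value(-100)   # pain of losing $100

# The loss hurts more than the equal gain pleases:
print(abs(loss) > gain)  # True
```

This asymmetry is why prospect theory predicts behavior, such as refusing fair coin-flip bets, that expected utility theory struggles to explain.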
Common Heuristics and Associated Biases
Heuristics are "general rules of thumb" that people use in decision making. These rules save time when you weigh options and usually work fairly well, but they can lead to biased thinking. For example, take the "representativeness heuristic," which deals with whether one choice approximates or stands in for another. To understand a common bias associated with this heuristic, consider Linda, who is committed to social justice and takes part in demonstrations against nuclear power. Which is more likely: 1) "Linda is a bank teller" or 2) "Linda is a bank teller and an active feminist"? A majority of research respondents chose the second option, though logic dictates that having two coinciding traits ("bank teller" and "feminist") is less probable than having either trait independently.
"We do not first see, then define; we define first and then see." (Walter Lippmann)
Kahneman and Tversky note, "As the amount of detail in a scenario increases, its probability can only decrease steadily, but its representativeness and...apparent likelihood may increase." That is, do not let very detailed scenarios mislead you. The more specific the situation, the less likely it is to occur, even if all the details seem right.
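The logic behind the Linda problem is the conjunction rule of probability: a conjunction can never be more probable than either of its parts. A tiny sketch with invented numbers:

```python
# The conjunction rule: P(A and B) = P(A) * P(B | A) can never exceed P(A),
# because P(B | A) is at most 1. The probabilities below are hypothetical.

p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # P(feminist, given she is a teller)

p_teller_and_feminist = p_teller * p_feminist_given_teller  # ~0.015

# Adding the "feminist" detail can only lower the probability,
# however much more representative the richer description feels.
print(p_teller_and_feminist <= p_teller)  # True
```

Each added detail multiplies the probability by a factor of at most 1, which is why longer, more vivid scenarios are always at most as likely as their sparser versions.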
"Virtually all current theories of decision making are based on the results of research concerning biases in judgment."
The "availability heuristic" leads people to judge an event's probability by how readily examples of it come to mind. It, too, is subject to bias. Some events that do not actually happen more often than others are nonetheless recent, easy to bring to mind or highly emotional, so they seem more "available" and thus appear to provide a sound basis for making judgments. For example, when researchers ask people which is a "more likely cause of death in the U.S.," a shark attack or falling pieces of airplane engines, most people guess shark attacks. Such attacks receive lots of press and popular attention (remember Jaws?). Nevertheless, statistics show that you are 30 times more likely to die because of falling airplane parts than because of a shark attack.
"Probability and risk are everywhere: in the brakes of our cars, in the clouds above our heads, in the food we eat and in the trust we place in others."
The way that availability biases people's sense of frequency has important ramifications. For example, if people are not mindful of actual mortality rates for dangers such as stomach cancer, they will be less inclined to live healthier lifestyles. To counter that lack of attention, a health advocacy group could post billboards reading: "This year, more people will die from stomach cancer than from car accidents." Armed with this information, members of the public presumably would be more likely to take steps to lessen their risk of stomach cancer.
"When groups are cohesive and relatively insulated from the influence of outsiders, group loyalty and pressures to conform can lead to...'groupthink'."
You can reduce the biases that filter your thinking when you estimate risk or judge probabilities. Keeping accurate records helps: If you know precisely how often something occurs, you can minimize your "availability biases." Also, avoid "wishful thinking," the common tendency to rate desirable events as more likely to occur than undesirable ones. To counter this bias, ask respected, knowledgeable third parties to assess your choices on major decisions.
What Other People Think
The way the people around them think and decide greatly affects how individuals form judgments and make decisions. In 1985, decision analyst Philip Tetlock advocated viewing decision makers as "'politicians' who are accountable to their 'constituents'," that is, their friends, relatives and co-workers. Tetlock says that before these politicians make decisions, they routinely ask, "How will others react if I do this?"
"Judgment and decision research is conducted by human beings who are prone to many of the same biases and errors as their experimental subjects."
In 1982, decision analyst Irving Janis coined the term "groupthink" to explain how "group loyalty and pressures to conform" can shape people's decisions. He described groupthink as "a deterioration of mental efficiency, reality testing and moral judgment, as a result of in-group pressures." When President John F. Kennedy decided to invade Cuba based on the ill-conceived recommendations of his cabinet, he was influenced by groupthink.
Mental Traps
Overconfidence often leads to judgment errors. The explosion of the U.S. space shuttle Challenger provides sad proof. Prior to the launch, NASA estimated that the shuttle's risk ratio was "one catastrophic failure in 100,000 launches." To mitigate overconfidence when making decisions, "Stop to consider why your judgment might be wrong." Such a mental recalibration can protect you from making potentially dangerous decisions.
"When time pressure is high, decision makers use simplifying strategies, attend to a small number..."
In 1948, sociologist Robert Merton defined a "self-fulfilling prophecy" as "a false definition of the situation evoking a new behavior which makes the originally false conception come true." In other words, a false prediction becomes accurate precisely because the prediction was made. People who make self-fulfilling prophecies might claim that their misconceptions were right because of the outcome, even though their misconception provoked it. Such reasoning can be insidious, particularly in the area of racial stereotypes. As a preventive measure, don't seek "confirming evidence" to validate initial negative judgments; people have a strong tendency to work hard to confirm their faulty preconceptions, a pattern called "confirmation bias."
Is Everyone Irrational?
When you consider the many common mental errors and biases that send judgment off track, it may seem that you should make decisions by flipping coins or throwing darts. But rationality, at least in decision making, does not mean just thinking correctly; it also means choosing the option that enables the best outcome. Because that varies so much from person to person and from situation to situation, no onlooker can determine who is making the most rational decisions.
"Judgment and decision research is subject to a paradox: If its findings are valid (i.e., biases and errors exist), then its findings are, to some degree, subject to biases and errors."
An outside observer does not always know what "transient and often equivocal situational factors" influence someone else's decision. Who knows what perceptual or contextual issues may have biased someone's judgment? Trying to understand why people decide as they do can be baffling. Researchers have tried to create "debiasing techniques" to "reduce biases and errors in judgment." One such tactic calls for always considering alternative perspectives. To make better decisions, apply this rule to your judgments. That leaves one remaining paradox: If everything researchers found about faulty judgment is true, then their findings are inherently contestable.