Streetlights and Shadows

Searching for the Keys to Adaptive Decision Making

MIT Press

Recommendation

In 1998, Gary Klein gave readers Sources of Power, a thoughtful, innovative consideration of how to make decisions in complex situations. Here, he returns to the same subject in even greater depth. Klein has spent decades studying and interviewing people, such as firefighters, soldiers and pilots, who make decisions in complicated, shifting, high-stakes circumstances. He examines what most people believe about making decisions and shows how those beliefs fail, at least some of the time. In ambiguous, unknown settings or under complex conditions, people tend to simplify until their beliefs become dangerous. This entertaining book grapples with many of life’s more challenging situations. As a result, BooksInShort recommends Klein’s insights to leaders, trainers and anyone who must make more effective decisions in crises.

Take-Aways

  • Most common beliefs about decision making are wrong or incomplete.
  • Decision making functions differently in complex, chaotic circumstances than in clearly defined, well-known situations.
  • Formal guidelines can get in the way of good decisions and learning.
  • You can’t eliminate risk, establish true common ground or set definitive goals. You must adapt as you go.
  • More information helps decision making only when lack of data is a problem.
  • Communication is always difficult, and feedback is easily misunderstood.
  • Make sense of a situation as best you can, adapt to it and make a decision. Blend intuition with logical analysis, and always engage your expertise.
  • People overemphasize “explicit knowledge,” such as procedures and guidebooks.
  • “Tacit knowledge,” which includes intuition, is the source of pattern recognition and “mental models.”
  • Any systems or tools designed to help people make better decisions must take into account individual expertise and how the mind functions.
 

Summary

Common Beliefs About Decisions

The human eye sees in light and shadow, but uses different systems for daylight and darkness. In daytime, “cone cells” take in a lot of light, and you see the world in detail. When night falls, “rod cells,” which are more sensitive to faint light but poor at picking up details, take over. You need both systems to see well. Vision functions as an analogy for decision making. Approach decision making one way when the situation is well-known and clearly defined, and another way in unknown or ambiguous circumstances.

“Logic and statistics are useful, but they aren’t sufficient for making good decisions, and they sometimes result in worse decisions.”

Most common beliefs about decision making work in “bright and clear conditions.” People don’t know how to respond to more opaque situations, and most guidelines interfere with complex problem solving. Familiar rules of thumb don’t apply; an emphasis on directives and formal procedures can be limiting or even dangerous. A majority of the leaders and professionals surveyed supported the following incorrect beliefs about making decisions:

  • If you teach people a specific procedure, they perform assigned tasks better – Routine tasks evolve over time, but published procedures usually stay frozen. Procedures are abstract and do not take local conditions into account. They don’t allow for learning, growing expertise or focus. When people follow a procedure, they often don’t pay attention to what they’re doing; the procedure replaces their judgment. Procedures are useful during training, to jog your memory, to guide you when your attention wavers or for routine tasks. In complex situations, procedures are dangerous.
  • Bias distorts your thinking – When someone in a decision-making situation introduces a number, people often “anchor” their estimates to that number. Mentioning a number when asking a question produces answers closer to that number. Other common biases are the “framing heuristic,” in which people view information differently based on how the presentation frames it, and the “representativeness heuristic,” in which people judge a choice by how well an example seems to represent the situation. Cognitive biases exist, but they don’t pose the risk to decision making that common belief claims, because people don’t use these cognitive factors irrationally. They apply frames or examples when they need a way to think about their choices. To help people use these focusing factors more effectively, give them useful frames, help them develop expertise and design their choices to support the right actions.
  • Use logic, rather than intuition, when making decisions – Many common decision-making models tell you to collect hard data, such as statistics, and weigh it rationally. Yet studies show that deciding quickly, without too much information, often produces better decisions. People may overthink and back away from strong initial judgments. People who have brain damage that eliminates emotion from their decision-making process tend to make much worse decisions, indicating that sound decision making requires both intuition and logic.
  • When making decisions, lay out several choices and compare them – Many schools teach that people consider more choices as they gain more experience, that novices hurry while experts weigh options, that analyzing your choices works best in decision making, and that people decide by applying the same set of criteria to each option. Analytical methods are attractive because organizations can standardize and teach them, and they offer evidence to justify decisions. But such a comparison matrix takes too long to be useful for most decisions. In emergencies, no one has time to apply it. When options are numerous or differ considerably from one another, methodical comparison breaks down. In time-sensitive situations, experts read situational cues and react. If one choice doesn’t work, they shift tactics.
  • In ambiguous situations, you can “reduce uncertainty” if you get more information – This belief is true when the situation is uncertain because of a lack of information. However, you might have information that you distrust or reliable data that contradicts your values. Perhaps you possess the relevant facts, but you do not understand what to do with them or how to act on them. This belief treats all problems like puzzles: You could solve them if only you possessed the right clues. However, some problems aren’t puzzles; they are mysteries. Their solutions require analysis and action, not information. Beyond a certain amount of data acquisition, for instance, meteorologists get worse at predicting storms. However much data you have, you need to understand it.
  • Don’t jump to conclusions; wait until you have all the evidence to act – You’ve probably seen an early news report that says one thing and a later story that contradicts it. Such turnabouts can teach you to freeze in fluid situations, and freezing leads to poor decisions. Instead, draw tentative conclusions and test them. Actively speculate about what might resolve a situation rather than passively waiting for new data.
  • Giving people “feedback on the consequences of their actions” helps them learn better – Employees can misinterpret feedback, and your input may not address their actions. Giving feedback is easier in well-ordered, well-understood situations and more difficult in complex, shifting circumstances. Complexity can create crises just when you most need to improve your performance. You might not have time to give feedback, and employees might not be in a position to receive or even detect it if they are, for example, overwhelmed. Make your feedback understandable, and shape it to the moment.
  • You make sense of events by drawing “inferences from data” – The metaphor for this rule is the “assembly line”: You get data and develop it into information; you assemble that information into knowledge and you then use reasoning to understand it. Problems arise when you can’t connect the dots because you don’t know what a dot is. Data and relevance are not self-evident; making an educated judgment requires accumulated expertise. Create narratives early in a situation and revise them as new dots of data arrive.
  • To start a project, clearly describe your goal – More people agree with this rule on decision making than any other. People like to define goals before they start, so try to provide those objectives when you can. But the world is both complex and shifting, so insisting on satisfying this rule can stall progress. What do you do with “goal trade-offs” when two values clash? Or in situations with “emergent goals,” such as when you’re developing a new technology? For example, when Xerox first marketed copiers in the 1950s, it tried several methods to promote its business before developing one that worked. This rule also pushes you to set goals you can measure “objectively.” Thus it’s of little use if your goals are qualitative. Try to establish objectives, but in complex situations, redefine your goals as you move forward. This is “managing by discovery”: Set goals, pursue them, gather data and redefine your objectives as suggested by the data and the situation.
  • Your plans will work better if you “identify the biggest risks” and get rid of them – This rule works in a “mature industry” or with a well-known activity. But if you’ve never done something before, accurate evaluations can prove elusive. The most devastating risks are “black swans,” risks you don’t know how to look for or identify, let alone eliminate. Many “risk-mitigation plans” become part of the problem because they are rigid and allow organizations to develop blinders. Practice “resilience engineering,” and work on an “adaptive mind-set” to respond quickly and effectively.
  • As a leader you should assign roles and write “ground rules” – Setting up roles in advance lets you define “common ground.” The problem with this rule is simple: Communication is always difficult, and people’s perceptions spring from their individual experiences. Recognize how “fragile” common ground can be, and how much effort, shared experience and training it takes to produce and maintain it.

How to Make Better Decisions in Complex Situations

To make better decisions in the complicated, changing, ambiguous circumstances that you encounter in times of great crisis, start by shifting your mind-set. Rather than viewing your mind as a “mental storehouse” stocked with information and guidelines – a view that assumes you can organize your mind in a stable, consistent fashion – you must practice “unlearning.” Think of this process in terms of the “snakeskin metaphor.”

“We need both intuition and analysis. Either one alone can get us in trouble.”

As you move through life, you develop, change and shed your skin, casting off parts of yourself that are no longer functional. Accepting this premise allows you to make other broad, related shifts in your approach, such as giving up a “fixation” on any specific view, goal or procedure. Adopt a mind-set that allows for continual change and growth. Consciously pause in the midst of decision making and ask what it would take to get you to change your mind about a primary issue. Pay particular attention to evidence that contradicts your position; consider how hard you would have to work to explain it away. Studying other similar cases or bringing in outsiders who are free from your specific history might help you to modify your perspective.

Make a Circle

Another broad shift in approach is to move from linear decision making to a circular, complex process. Imagine a circle with expertise in the center, and three points along the outer edge: “adapting,” “sense-making” and “decision-making.” You can start a circle at any point, and you can move in any direction along it. Making sense of a situation will lead you to making a decision that you adapt as you act on it. Or you can start by adapting to external changes as you make sense of them, progressively learning what’s required to make a functional decision. Set up your actions and your organization’s activities to draw on the accumulated knowledge of those involved.

“Explicit” and “Tacit” Knowledge

This change involves a switch in processing from explicit knowledge, such as procedures and guidebooks, to tacit knowledge. Explicit knowledge has worth, but organizations often overvalue it, though it’s only the tip of the iceberg. To understand a situation, especially one that involves making sense of complexities, look beneath the surface and call on the broad array of factors that make up tacit knowledge: your ability to recognize patterns, the “mental models” you bring to bear on a situation, your perceptions, your skills and the judgment you’ve developed to address an issue’s specific challenges. You can develop each factor that builds your tacit knowledge. For example, you can learn to build better mental models or to use your senses more skillfully.

“Too much information can make things worse.”

Explicit knowledge can exist independently of expertise. It resides in a book or a computer. It may or may not be accurate, and it remains fixed until someone changes it. Tacit knowledge, on the other hand, depends on experience; how well it functions depends on your level of know-how and your willingness to embrace your experiences. These two knowledge categories are not fixed. You can convert tacit knowledge into explicit knowledge, and internalizing explicit knowledge integrates it with the deep well of your tacit knowledge.

Mental Models

You become expert in your field by drawing lessons from your experience and by developing better, more sophisticated mental models of your arena of capability. Expertise doesn’t keep you from making mistakes, but it can help you recognize your mistakes more quickly. As you pick up cues that something is wrong, you can match your actions and outcomes against accumulated patterns and mental models with greater skill and fluidity. Expertise allows you to notice details at all levels of performance. It enables you to judge how well someone executes a task, to identify the pivotal elements of a skill, and to spot signs of failure or struggle.

“Support Systems”

Many people push for having automated systems make decisions. They try to shape “decision-support systems” to help people make choices. Many of these systems “are rejected or fail.” Valid reasons for rejecting such systems exist, particularly if they are based on faulty beliefs or lack sensitivity to tacit knowledge. Some systems substitute algorithms for individual expertise. Algorithms can help if you apply them well, but too often, these systems undervalue the processing power of human insight. People don’t think like computers, and that’s a good thing. Computers process information quickly. The human mind, on the other hand, works with far less information, but it excels at identifying and applying patterns. To make better decisions, foster this unique human cognitive “magic.”

About the Author

Research psychologist Gary Klein, author of Sources of Power, is a senior scientist at Applied Research Associates.