Dealing with Uncertainty
Predicting any future event is hard enough, but predicting a "Black Swan" is harder. Nassim Nicholas Taleb brought the term "Black Swan" into circulation as a metaphor for being blinded by past experience when you face surprising phenomena. He wrote of Europeans who were stunned to see black swans in Australia, since they thought Europe's white swans were the only type in existence.
"The practice of fundamental research can help decision makers adapt to a world of 'Black Swans,' the seemingly improbable but highly consequential surprises that turn our familiar ways of thinking upside down."
Black Swans symbolize anomalies that appear abruptly and don't fit your worldview. No one can foretell all Black Swans, but "fundamental research" can help you predict some of them, recognize them faster and respond to them more quickly than other people do.
While Black Swans occur in all fields, financial markets are especially vulnerable, since they combine the physical world, social interactions and complex organizational patterns. Sudden natural events, such as earthquakes and avalanches, are a reminder of forecasting's volatile challenges. Weather also is subject to the "butterfly effect," where small actions in one place may produce disproportionate reactions elsewhere. Because of this effect, weather predictions are often good in the short run, but diverge sharply over time. When models produce such results, you can build new models, or, like weather forecasters, get better data and process it faster. Avalanches manifest "self-organized criticality," a trait of abrupt disruptive action. You may know that adding sand to a pile can cause an avalanche, but you can't tell which grain will be the trigger.
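The sandpile dynamic behind "self-organized criticality" is easy to simulate. The sketch below is a minimal Python version of the classic Bak-Tang-Wiesenfeld sandpile model (the grid size and grain count are arbitrary assumptions, not figures from the text): it drops grains one at a time and records how many cells topple after each drop. Many grains cause no avalanche at all, while a few set off large cascades, and nothing about an individual grain tells you which kind it will be.

```python
import random

def sandpile_avalanche_sizes(size=20, grains=5000, seed=42):
    """Bak-Tang-Wiesenfeld sandpile: drop grains one at a time onto a grid.
    Any cell holding 4+ grains topples, sending one grain to each neighbor
    (grains falling off the edge are lost). Returns the avalanche size
    (number of topplings) triggered by each dropped grain."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(grains):
        r, c = random.randrange(size), random.randrange(size)
        grid[r][c] += 1
        topples = 0
        unstable = [(r, c)] if grid[r][c] >= 4 else []
        while unstable:
            i, j = unstable.pop()
            if grid[i][j] < 4:          # may have been queued twice
                continue
            grid[i][j] -= 4
            topples += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size:
                    grid[ni][nj] += 1
                    if grid[ni][nj] >= 4:
                        unstable.append((ni, nj))
        sizes.append(topples)
    return sizes

sizes = sandpile_avalanche_sizes()
print("largest avalanche:", max(sizes),
      "grains causing none:", sum(s == 0 for s in sizes))
```

The distribution of avalanche sizes is heavily skewed: identical-looking grains produce wildly different outcomes, which is the point of the metaphor.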
"The 'Black Swan' refers to a highly improbable event that seemingly could not have been anticipated by extrapolating from past data."
"Collective behavior," in which people follow one another's actions, rules markets by creating trends, surges and bubbles. Collective behavior is often rational, so do not depart from the general view unless you have a reason. But, sometimes, emotions drive collective behavior, making it less likely that people will check their reasoning. Those who enter markets emotionally lack professional investors' knowledge and are more likely to move in herds. To add complexity, each person's actions affect other people's actions. That pattern unfolded in the 1987 market crash, when investors using "portfolio insurance programs" didn't account for how one person's decision to sell might combine with other people's decisions, leading to huge price drops.
"Real-world decision makers sometimes overlook two critical steps: calibrating models with the market and ensuring that hypotheses will be tested by upcoming information, what traders call 'catalysts.'"
To forecast the future, you must articulate a hypothesis about what might happen and build a model (conscious or "intuitive") to test that projection. With financial forecasting, since you're working in the real world instead of a lab, "it is easy to get sloppy" and hard to obtain reliable information or run controlled tests. Forecasters often fail to calibrate their models with the market or to identify a so-called "catalyst" that will test their hypothesis. People "think in probabilities," but Black Swans are only possibilities, not probabilities or certainties. They may not occur.
"Extreme volatility may arise from the inherent complexity of the fundamental world."
The conceptual tools that market analysts use for thinking about probabilities include the probability tree and the influence diagram, which links factors that might affect a situation. As you add data, these diagrams become more detailed and require revisiting. Probability trees display likely futures and correlate various issues. Each branch shows an estimate of a different outcome's likelihood. New branches can sprout, showing other options in specific scenarios. Mapping all likely futures or investing results will train you to visualize multiple outcomes and attend to neglected possibilities without being hypnotized by the present. Creating trees can help you identify "asymmetric outcomes," where positive or negative futures dominate. Fleshing out your investing tree may call for listing a range of risk factors and rates of growth. These tools, modified with more information, helped analysts at Morgan Stanley see in 2001 that Providian, "a high-flying credit card company," was a riskier investment than the market indicated.
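A probability tree of this kind is straightforward to encode. In the sketch below, the branch probabilities and returns are invented for illustration (they are not from the Providian example or the text); the code multiplies probabilities along each path to a leaf, then computes the expected return and the total chance of a loss, the kind of "asymmetric outcome" check the paragraph describes.

```python
# Hypothetical two-stage probability tree for a stock. Every number here
# is an assumption made up for illustration: each branch holds its
# probability and either a sub-tree or a leaf payoff (fractional return).
tree = {
    "economy strong": (0.6, {
        "company executes": (0.7, 0.30),    # +30% return
        "company stumbles": (0.3, 0.05),
    }),
    "economy weak": (0.4, {
        "company executes": (0.5, -0.10),
        "company stumbles": (0.5, -0.40),   # the dominant downside branch
    }),
}

def leaves(node, p=1.0):
    """Walk the tree, multiplying branch probabilities down to each leaf."""
    for prob, child in node.values():
        if isinstance(child, dict):
            yield from leaves(child, p * prob)
        else:
            yield p * prob, child

outcomes = list(leaves(tree))
expected = sum(p * r for p, r in outcomes)
downside = sum(p for p, r in outcomes if r < 0)
print(f"expected return {expected:+.1%}, chance of a loss {downside:.0%}")
```

With these made-up numbers the expected return is mildly positive, yet 40% of the probability sits in losing branches, one of them severe: exactly the asymmetry a tree makes visible and a single point estimate hides.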
"People naturally think in probabilities. For high-stakes decisions, getting the odds right is a critical skill, especially when Black Swans are involved."
When you're forecasting, overconfidence and underconfidence work against you. Overconfidence tempts forecasters because of how the mind handles information. Lab tests show that people often rate their own judgment and knowledge as better than they actually are. A model can easily lead you into excess certainty by overweighting a single variable. Experience teaches most professional investors to shun too much self-assurance, since arrogant traders lose money and leave the market. Sometimes when you feel sure, you are actually right; you really do know what's happening. But sometimes you can feel just as positive and be completely wrong. People tend to trust their memories too much and to give extra weight to data they've gathered through firsthand experience.
"Finding the right balance between overconfidence and underconfidence is a form of calibrating one's intuitive thought processes with real-world probabilities."
Intuitively recognizing a pattern in your data can produce a "feeling of confidence," justified or not. First responders, such as firefighters, have no time for research, so they must be able to trust their intuitive flashes. But those with time for reflection, like investors, can go beyond intuition and gather data that justifies confidence.
"The elements of an information strategy include zeroing in on critical issues, reacting intuitively to new developments and making thoughtful decisions about allocating limited resources."
First, examine your hypotheses: Do they focus on the right issues? Identify a catalyst that drives change and takes the larger market into account. Gather data about your competitors' investing. Estimate the probabilities for each potential scenario, basing your estimates on "expert knowledge." Test your model's validity. Be sure it accounts for the way varied factors will influence one another. Listen to your intuition, but not to your ego.
Information and Its Challenges
Information overload, which happens because more information exists than any person or group can know, often catches people by surprise, particularly in changing conditions. Computers provide some help, but computing takes money, time and energy, and it produces even more data, adding to the overload. Some problems are unsolvable, and when computers try to solve them anyway, they run endlessly, causing a system to hang. Even the market can't respond to data with complete efficiency, because it can't possibly know everything, either.

Thus, as an investor, you must make a fundamental shift in how you deal with information. Rather than waiting for it to come to you, develop an "active strategy" that focuses on "critical issues" while, by definition, paying less attention to tangential matters. Critical issues are variables that have "significant impact" on the matter you're researching and "highly uncertain" outcomes, forcing you to screen many possible futures. Usually people are good at identifying these issues, but they lose track when data overwhelms them. In emerging fields, where nothing is well-defined, identifying pivotal issues is an intuitive art. As you research your critical investment concerns, avoid the common pitfalls in many people's information strategies: trying to sort all the available data, failing to communicate with key experts in your organization, losing steam due to "bureaucratic inertia," and trusting "data mining" and potentially biased information (such as media reports) because of their low cost.
Cognitive Dissonance and Other Quirks
Academic discussions of decision making assume that you have endless time to dissect decisions, but in reality, situations unfold quickly and under pressure. What's more, "certain human quirks" influence the way people take in the knowledge that shapes decisions. One quirk is cognitive dissonance, the result of receiving data that clashes with your existing conclusions. To resolve this internal conflict, people tend to overlook facts that disagree with their set opinions. This error inevitably leads to poor decisions. Cognitive dissonance is more likely when your "self-image is threatened" or you're pressed to choose among limited options. This distortion can make you cling stubbornly to a single point of view, or waffle endlessly, afraid of making the wrong choice. Cognitive dissonance also can cause "change blindness," in which you don't realize circumstances are shifting because the changes happen so slowly. To avoid change blindness, establish a corporate culture where people accept gradations of meaning (rather than insisting on black-or-white answers), and where they can explore new roles. Identify the critical issues in play so you can explore them by learning new information, which has tremendous "diagnostic power" as a tool for condensing "the range of possible scenarios."
"In a world of information and complexity, high-powered computer models have become essential tools for many decision makers."
Another data-related challenge is "information asymmetry," which means that one side in a scenario knows more than the other. This happens in negotiations, where each side often manages the flow of information to gain a better position. Companies create asymmetry deliberately when they practice "selective disclosure." Executives often control their firm's flow of information so people see the company in the best light. Capital One did not disclose the real size of its subprime portfolio or its loss rate. By the time regulators finally forced the firm to commit to limiting its expansion until after it addressed this issue, Capital One's stock had lost half its value.
"One reason we are surprised by episodes of extreme volatility is that the world contains more information than any person, team or organization can process."
Since you can't depend on getting symmetrical data, you may have to "level the playing field" by doing your own research. Analysts studying Sallie Mae's diminishing relationship with J.P. Morgan had to do their own field surveys to assess the impact that this change would have on student lending. If you are researching an investment, gather information from companies similar to the one whose stock you're considering. Compare the information related firms reveal. Be sure the company's leaders see their industry clearly. If they seem to be in the grip of cognitive dissonance, they may not be in touch with the realities of their market. When you research a firm, focus on specific information and critical issues, so the company can't flood you with unwanted, unrelated data. Determine how its leaders made choices, and think through their reasoning.
Improving Your Analyses and Models to Develop Better Judgment
In today's complicated markets, investors need decision-making models more than ever. However, models can add complexity, since you must be sure your model is accurate and reliable. Use mapping to connect your critical issues with your model's forecasts. Your goal is to balance increased accuracy with reduced complexity. To build accuracy, break thorny problems into small pieces. In simpler models, you can map causes and effects. Sometimes you will face "one-to-many" models, in which changing a single cause alters multiple effects, and "many-to-one" relationships, where multiple causes intersect to produce one outcome. Make sure your model maps the appropriate relationships. Include any "positive feedback loops," where changes in one factor alter another factor, which in turn changes the first. Each time you introduce an assumption about some component of your model, you add complexity, creating variances where an analyst's bias can enter and have an effect.
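A "positive feedback loop" can be sketched numerically. The toy model below rests entirely on a made-up rule (not one from the text): an initial price shock triggers forced selling, and each round of selling feeds back a fixed fraction of the previous round's decline, so the loop amplifies the original shock.

```python
def feedback_decline(shock, feedback=0.5, rounds=20):
    """Total decline after an initial shock, where each round of forced
    selling adds `feedback` times the previous round's drop. All
    parameters are illustrative assumptions."""
    total, step = 0.0, shock
    for _ in range(rounds):
        total += step
        step *= feedback        # the next round of selling is smaller
    return total

# With feedback = 0.5, a 2% shock roughly doubles through the loop,
# since the rounds sum as a geometric series approaching shock/(1-feedback).
print(f"total decline: {feedback_decline(0.02):.4f}")
```

The design point is that the modeler must decide whether such a loop belongs in the model at all; omitting it (feedback = 0) leaves only the original shock, which is how the 1987 portfolio-insurance dynamic described earlier was underestimated.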
"Sometimes we reach accurate decisions in an intuitive flash; sometimes our instincts are fatally flawed. Some answers emerge from intensive research and sophisticated analysis; sometimes this approach fails too."
"Monte Carlo models" offer a specific set of alluring possibilities and dangerous drawbacks. When you set up a Monte Carlo model, rather than making a specific assumption about an issue, such as what a company's loss rate will be on a specific loan category, you work out "thousands or millions of branches" of a probability tree by mapping your "assumptions as random variables." For instance, you might figure potential outcomes not just for a single assumed loss rate, but also for many possible loss rates in a specified range. Monte Carlo modeling can help you manage complex forecasts. Experts have used it for many purposes, from finding sunken ships to making animated movies, solving chemistry problems, and estimating the likely impact of legal cases on certain stocks or industries. Searchers found the SS Central America, which sank in 1857, by mapping its most likely locations with a Monte Carlo scheme.
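The loss-rate example can be sketched as a tiny Monte Carlo model. Instead of fixing one loss rate, the code below draws a loss rate from a range and an interest yield from a distribution on each trial, then summarizes the resulting spread of net margins. All the numbers are illustrative assumptions, not figures from the text.

```python
import random

def monte_carlo_margins(trials=100_000, seed=1):
    """Treat the assumptions as random variables: on each trial, draw a
    loan-loss rate and an interest yield, and record the net margin.
    The ranges and distributions are made-up illustrative assumptions."""
    random.seed(seed)
    results = []
    for _ in range(trials):
        loss_rate = random.uniform(0.02, 0.12)   # 2%-12% of loans default
        yield_rate = random.gauss(0.09, 0.01)    # interest earned on loans
        results.append(yield_rate - loss_rate)   # net margin in this scenario
    return results

margins = monte_carlo_margins()
avg = sum(margins) / len(margins)
worst = sorted(margins)[len(margins) // 100]     # roughly the 1st percentile
print(f"mean margin {avg:.2%}, 1st-percentile margin {worst:.2%}")
```

Rather than one answer, the model yields a distribution, so you can ask not only "what is the likely margin?" but "how bad is the bad tail?", which is where Black Swan exposure hides.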
"The textbook approach presumes decision makers have all the time in the world to assemble and weigh information, but in the real world, things happen on the fly."
Monte Carlo modeling's drawback is information overload: It generates so many scenarios that people might neglect analyzing the interaction of the "causative forces at play." This is risky, since people want "causative variables" so much that they see nonexistent cause-and-effect patterns. To avoid this, and to strengthen your use of Monte Carlo modeling in, say, an investment decision, research all the industries involved. Examine the issues in context, tracking external influences, like politics. To fill in unknowns, ask what-if questions about specific possibilities. Watch how patterns shift over time, and heed your intuition.
"Intuition works by matching patterns in the current environment to patterns in our memory."
No matter what tools you use, you can't answer every problem. In some cases, you must exercise judgment. Traditionally, that means evaluating all the evidence before deciding. Often people try to reduce a judgment to an algorithm, as in value investing, which tries to set up investment decisions as "a set of simple rules." This can be valuable, but algorithms always omit something. They don't take reflection into account, so value investors can fail to adjust their rules to match changing realities. Making good judgments involves a complex cycle: You gather information, analyze it and make a decision. Then you evaluate the outcome, which offers new data. At the same time, your competitors are analyzing their own information and making their own decisions. This changes the situation you're evaluating, and the outcome of your decisions, so you have to begin the process again. Judgment operates recursively: You must apply your analytical tools to the process you use to make decisions. If your process doesn't yield workable answers, change it.