Sunday, June 28, 2020

Biases, heuristics & statistics - a guide for foreign policy makers (2020)

The so-called availability heuristic may help explain Anthony Eden’s fateful decision to invade the Suez Canal zone in 1956. The heuristic itself was derived from an historical analogy. The nationalisation of the Suez Canal by Nasser was likened to Hitler’s decision to remilitarise the Rhineland twenty years earlier. (Eden, it is worth remembering, had resigned from the cabinet in 1938 over Chamberlain’s appeasement policy.) The rise of Nazi Germany and Britain’s failure to oppose it was the defining experience of policy makers of Eden’s generation. The “ghost of Munich” continued to loom large in their minds even after Nazi Germany had been defeated. “For Nasser read Hitler, and it’s all very familiar”, Harold Macmillan, Eden’s successor as prime minister, scribbled on a Foreign Office telegram more than seven years after the Suez Crisis. The formative experience of the failure of Britain’s appeasement policy provided the basis for an historical analogy and a related availability heuristic: immediately confront nationalist strongmen set on revising the territorial status quo, or else. This heuristic may have informed Eden’s ill-fated decision to confront Nasser.


Images, symbols, heuristics and historical analogies are powerful but somewhat underappreciated factors in international politics. Neville Chamberlain waving a piece of paper and proclaiming “peace for our time” upon his return from Munich became an iconic image in its own right as well as a symbol of the folly of pursuing a policy of accommodation towards a rising, ruthless, revisionist state. The piece of paper came to symbolise the harsh reality that international agreements are not worth the paper they are written on. The lesson (and heuristic) derived from the failure of Chamberlain’s appeasement policy was that revisionist states must be confronted promptly.

Neither the meaning of images and symbols nor the heuristics derived from historical precedent are set in stone. Following WWII, Churchill emerged as the hero and Chamberlain as, at best, a tragic figure. This interpretation has since become more nuanced, as have the lessons learnt (Kennedy 1992). And sometimes the images and symbols themselves change. When the USSR was a vital ally of the United States in the fight against Nazi Germany, Joseph Stalin was referred to as Uncle Joe: a nice, diminutive, benevolent elderly gentleman with a big moustache. Once the Cold War got underway, the image of Uncle Joe turned from that of a kindly uncle into that of a cruel, atheist, mass-murdering lunatic.

Even if the interpretation of an historical event remains unchanged, the question remains which historical precedent is most applicable to a contemporary foreign policy problem. Eden, likening Nasser to Hitler, concluded that Nasser needed to be confronted right away. But Eden also occasionally reached for the Mussolini analogy, which may have suggested a different lesson and heuristic (Freedman & Michaels 2013). After all, Italy was not about to invade Western Europe, threaten to launch an invasion of the British Isles, invade the USSR or seek hegemony over (continental) Europe. Neither was Nasser.


Historical analogies often serve as a source of heuristics that guide and inform foreign policy decisions (Khong 1992). An analogy is a correspondence of partial similarity. Historical analogies reduce complexity by focussing on crucial structural similarities between a historical event and a contemporary situation. But analogies typically disregard potentially important differences. They are often vague and suggestive, and they typically say little about unexamined background conditions. Historical analogies allow policy makers to derive heuristics. It is often not so much the heuristic itself that is debatable as whether the past and present cases are similar enough to warrant its use. Historical analogies therefore need to be used very judiciously, not least because ceteris paribus conditions rarely apply (Neustadt & May 1986).

Historical analogies can also be used (or abused) to justify and advocate foreign policy decisions (Khong 1992). The vagueness and implicitness of analogies allow decision-makers to use and abuse them for political marketing purposes. By emphasising one aspect of an historical analogy and de-emphasising another, policy makers can draw on a wide range of analogies to justify their foreign policies, whether or not the analogy formed the basis for the actual decision. Historical analogies can also shape decision-makers’ cognitive landscape and, wittingly or unwittingly, influence their decisions. See Suez. In practice, it can be difficult to establish whether an analogy (and related heuristic) was used as a device to persuade or as an instrument to inform a foreign policy decision.

What exactly are images, biases, heuristics and historical analogies? Images and symbols are often suggestive, and they serve as focal points of shared meaning. (The meaning attached to images and symbols may vary between groups.) Biases are systematic deviations from the standard of rationality in judgment or decision-making. They can affect individuals as well as groups. Heuristics are mental shortcuts that allow for quick judgments and decisions in the face of complex problems. Though they frequently violate the laws of logic and probability, they often produce “good enough” outcomes (Gigerenzer 2015). Last but not least, historical analogies can be defined as “a historical reference to a previous situation that is relevant to the current situation due to its similarity or contrast” (Axelrod & Forster 2016). Put more succinctly, “heuristics are the ‘shortcuts’ that humans use to reduce task complexity in judgment and choice, and biases are the resulting gaps between normative behaviour and the heuristically determined behaviour” (Kahneman & Tversky 1982). Historical analogies are, or can be, sources of heuristics and related biases.

Ignoring the existence of heuristics and biases can lead to avoidable mistakes. Governments engage in a fair amount of conscious and intentional lying (or non-truth-telling, if you prefer) in foreign affairs, mainly toward their own population rather than their adversary (Mearsheimer 2011), often taking advantage of cognitive biases to persuade. But cognitive biases can also contribute to self-deception and lead to bad foreign policy decisions (e.g. Operation Barbarossa, Bay of Pigs, Suez, Vietnam). At a minimum, policy makers should be aware of how various biases and heuristics, including heuristics related to historical analogies, contribute to mistaken decisions. 

Admittedly, tying foreign policy decisions to biases and heuristics is not a straightforward exercise. For a start, an analyst can never know what was really going on in the minds of key decision-makers. What statesmen say cannot be taken at face value. Historical sources and documents are not necessarily reliable (Trachtenberg 2006). Memoirs often suffer from self-serving and hindsight biases (sic!). In practice, it can be difficult to tell whether a bias or a heuristic was actually operative in a specific situation. In this sense, examples that link foreign policy failure to heuristics and biases must remain suggestive. 

Eliminating cognitive biases and the flawed use of historical analogies in foreign policy decision-making does not mean that mistakes can be avoided altogether. Unbiased decisions may still translate into adverse outcomes due to information constraints, misperception, high risk tolerance, unpredictable strategic interaction effects and so on. Moreover, rational decision-making is not always possible or even optimal given the various constraints decision-makers face (Klein 1998). This is precisely why heuristics are so widely applied. Nonetheless, understanding the role biases, heuristics and analogies play in judgment and decision-making helps improve the quality of decisions. Here is an example.

The need for cognitive closure can be a significant source of bias. The 2003 Iraq War is widely regarded as a major mistake, at the very least in the sense that the decision was based on optimistic assumptions about post-war political stability. Robert Jervis points out that people seek to avoid cognitive dissonance and painful value trade-offs. As it turns out, those who believed that regime change in Iraq was the right thing to do also tended to believe that post-war reconstruction would prove easy. By contrast, those who opposed the war believed that it would be difficult to create a stable post-war government. Few people believed both that regime change was desirable and that post-war stability would be difficult to establish; and few believed both that regime change was undesirable and that post-war stability would be easy to establish. This strongly suggests (but does not prove) that the need for cognitive coherence and the avoidance of dissonance were in play. A desire for cognitive closure and an avoidance of value trade-offs biased people’s assessment of how easy or difficult post-war stabilisation was going to be. Biases matter in real-world policy decisions, as the sketch below and the subsequent (non-exhaustive) table of biases, heuristics and their potential influence on important past foreign policy decisions illustrate.
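To see why this belief pattern is suspicious, here is a toy sketch in Python with invented survey counts (they are not Jervis’s data): if assessments of desirability and feasibility were formed independently, the two beliefs should be roughly uncorrelated; a strong association is the signature of dissonance avoidance.

```python
# Toy illustration of the belief pattern Jervis describes.
# The counts below are invented for illustration; they are not survey data.

# Rows: was regime change thought desirable? Columns: was post-war
# stabilisation expected to be easy?
counts = {
    ("pro_war", "easy"): 45,
    ("pro_war", "hard"): 5,
    ("anti_war", "easy"): 5,
    ("anti_war", "hard"): 45,
}

# Odds ratio as a simple measure of association between the two beliefs.
a = counts[("pro_war", "easy")]
b = counts[("pro_war", "hard")]
c = counts[("anti_war", "easy")]
d = counts[("anti_war", "hard")]
odds_ratio = (a * d) / (b * c)
print(f"Odds ratio: {odds_ratio:.0f}")  # 81 with these invented counts

# Independently formed beliefs would imply an odds ratio near 1.
# A very large odds ratio is the kind of pattern that the avoidance of
# cognitive dissonance and painful value trade-offs would produce.
```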


| Bias/heuristic | What is it? | Why is it irrational? | Historical example |
| --- | --- | --- | --- |
| Sunk cost effect (related to loss aversion) | Tendency to persist in a course of action in order to recover the costs already incurred | Past losses should not matter; escalation of commitment is irrational | US decision to prolong the war in Vietnam |
| Loss aversion (related to the endowment and IKEA effects) | Tendency to prefer avoiding losses over making gains | The impact of losses and gains should be symmetrical | Militarily, defence (of one’s own territory) stronger than attack (on others’ territory)? |
| Framing effect | Tendency to be influenced by how a choice is framed; a negative frame induces risk seeking (and vice versa) | Risk tolerance should not be a function of how a situation is presented | Japan’s decision to attack Pearl Harbor (1941) when faced with the possible “loss” of China due to US pressure |
| Status quo bias (related to loss aversion) | Preference for things to stay the same | Makes one miss upside opportunities and become irrationally attached to the status quo | United States defending the status quo in East Asia in the face of a rising China |
| Anchoring | Over-reliance on an initial, arbitrary piece of information | Arbitrary information is likely of limited relevance and distorts the available information | US Congress passing the Gulf of Tonkin resolution (1964) after initial (but mistaken) news of a Vietnamese attack on a US destroyer |
| Recency bias (related to the availability heuristic) | Tendency to attribute greater weight to more recent than to less recent events | An event’s relevance does not typically depend on when it occurred | US policy towards Vietnam in light of the recent “loss” of China to communism |
| Availability heuristic | Reliance on immediately recallable examples to make judgments | Just because something is easily recalled does not make it a relevant piece of information | Eden likening Nasser to Hitler and launching the occupation of the Suez Canal zone (1956) |
| Stereotyping (also: ecological fallacy; related to outgroup bias) | Tendency to infer an individual’s attributes on the basis of group membership | Ignores the variability of attributes within the other group | WWII US propaganda vis-à-vis Japan/Tojo (Dower 1986) |
| Fundamental attribution error (also: correspondence bias or attribution effect) | Tendency to under-emphasise situational explanations of others’ behaviour and over-emphasise dispositional ones (and vice versa for one’s own behaviour) | Logically problematic to assume that others are driven by their “nature” while one’s own actions are driven by circumstances | Soviet communism as inherently expansionary, aka “The Sources of Soviet Conduct” (Kennan/X 1947) |
| Confirmation bias (related to the availability heuristic) | Tendency to search for, interpret and favour information that confirms one’s prior beliefs | Both confirming and disconfirming information are relevant from an evidentiary point of view | US government seeking information supporting an Iraqi WMD programme (2001) |
| Belief bias (related to confirmation bias) | Tendency to judge the validity of a hypothesis on the basis of prior belief rather than the logic and evidence that support it | Evidence should determine the justified degree of belief in a hypothesis/explanation | Stalin dismissing intelligence that Germany was about to invade the USSR (1941) |
| Avoidance of cognitive dissonance (not a bias strictly speaking, but it leads to biases; related to confirmation bias) | Tendency to avoid holding inconsistent thoughts and beliefs | Competing explanations may be equally well supported by the evidence, and neither can/should be discarded | Supporters of the Iraq war (2003) were unconcerned about post-war stability; opponents were very concerned (Jervis 2010) |
| Overconfidence effect (related to the gambler’s fallacy and the Dunning–Kruger effect) | Subjective confidence in one’s own judgment greater than objectively warranted | Confidence should reflect the available evidence and be assessed independently of past performance | German confidence prior to the invasion of the USSR (1941) |
| Bystander effect (not a bias strictly speaking) | Tendency of individuals not to help a victim when other people are present | May in fact be rational from an economic point of view (=> the logic of collective action) | Stresa Front and German remilitarisation of the Rhineland in 1936 (Mearsheimer 2001) |
| Groupthink | Tendency of groups to make irrational decisions in order to minimise intra-group conflict | Often leads to relevant alternatives not being considered and to decisions driven by socio-psychological dynamics rather than evidence | JFK’s ill-fated decision to proceed with the Eisenhower administration’s plan to invade at the Bay of Pigs (1961) (Janis 1972) |

Probability theory and statistics feature less prominently in foreign policy decisions and analysis than heuristics and historical analogies do. High-stakes foreign policy decisions in particular are unlikely to be based on historically observed frequencies – and not just because trying to do so would immediately raise the reference class problem. But subjective probabilities are bound to feature prominently in such decisions given the interactive context of foreign policy. Statistics and probability theory also help identify and correct for biases and help narrow the gap between normatively rational and heuristically driven behaviour. One can gain a valuable perspective on a qualitative problem by applying statistics and probability theory (King, Keohane & Verba 1994). This helps put the problem in a broader context and guard against bias.
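To make the role of subjective probabilities concrete, here is a minimal sketch in Python of Bayesian updating. The hypothesis, the indicator and all the numbers are invented for illustration; nothing here is drawn from the historical record.

```python
# Minimal sketch of updating a subjective probability with Bayes' rule.
# All quantities below are invented for illustration.

def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Hypothetical hypothesis: "the adversary intends to escalate".
prior = 0.20                          # analyst's subjective prior belief
p_mobilisation_if_escalating = 0.80   # P(mobilisation observed | escalation)
p_mobilisation_if_not = 0.30          # P(mobilisation observed | no escalation)

posterior = bayes_update(prior, p_mobilisation_if_escalating,
                         p_mobilisation_if_not)
print(f"Posterior probability of escalation: {posterior:.2f}")  # 0.40
```

The point is not the particular numbers but the discipline: making the prior and the likelihoods explicit forces the analyst to confront how much (or how little) a given piece of evidence should actually move their beliefs.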

Biases and heuristics also affect statistical judgment (Kahneman 2013). For instance, the clustering illusion leads individuals to see patterns where in fact there is randomness. The conjunction fallacy demonstrates the tendency to judge a conjunction of events as more probable than one of its constituent events. The base rate fallacy leads individuals to misjudge the probability of an event by failing to take into account all relevant data. Survivorship bias (famously illustrated by Abraham Wald’s analysis of aircraft damage in WWII) generates a biased reference sample, as do various types of selection bias (e.g. the drunkard’s search). Analysts may also over-interpret data (absence of evidence is not evidence of absence), ignore sample size, overlook possible mean reversion, or fall prey to the gambler’s fallacy and other well-known numerical pitfalls (e.g. Simpson’s paradox). Nonetheless, statistics and probability theory, properly deployed, can help correct for biases.
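As an illustration of the base rate fallacy just mentioned, here is a short Python sketch with invented numbers: even a seemingly reliable warning indicator yields mostly false alarms when the underlying event is rare.

```python
# Sketch of the base rate fallacy (invented numbers, for illustration only).
# Suppose a warning indicator flags imminent attacks with 90% sensitivity
# and a 10% false positive rate, but attacks occur in only 1% of crises.

base_rate = 0.01        # P(attack)
sensitivity = 0.90      # P(flag | attack)
false_positive = 0.10   # P(flag | no attack)

p_flag = sensitivity * base_rate + false_positive * (1.0 - base_rate)
p_attack_given_flag = (sensitivity * base_rate) / p_flag

print(f"P(flag raised)          = {p_flag:.3f}")               # 0.108
print(f"P(attack | flag raised) = {p_attack_given_flag:.3f}")  # 0.083
# Despite the "90% accurate" indicator, a raised flag implies an attack
# probability of only about 8%, because attacks are rare to begin with.
```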

Acknowledging the importance of cognitive biases and heuristics as well as the epistemic challenges of using historical analogies is an important element of high-quality foreign policy decision-making. Some of the 20th century’s more disastrous foreign policy decisions might well have been avoided had decision-makers been more aware of the power of biases, the problems with heuristics and the dangers of historical analogies. Maybe, just maybe, Eden could have avoided the Suez disaster, the US might have cut its losses in Vietnam early on, and Imperial Japan would not have provoked a war against the United States that it could not win.