Friday, January 24, 2020

Cognitive biases and foreign policy decisions (2020)

In his classic Essence of Decision (1971), Graham Allison proposed three models to explain foreign policy decisions: the rational actor model, the governmental politics model and the organizational process model. Allison used the last two models to explain foreign policy behaviour that deviated from the predictions of the rational actor model during the Cuban missile crisis. The rational actor model is, well, a model, and as such its assumptions do not necessarily have to be realistic. Its heuristic usefulness, one may argue, lies in the accuracy of its predictions (Friedman 1953). That said, microeconomics has incorporated more realistic assumptions into its models in the past few decades, including notions such as bounded rationality, satisficing, information gathering costs, cognitive biases and so on. Not only has this made the models look more realistic, but it has arguably also helped improve their predictive accuracy. So has modelling micro behaviour in systemic and interactive terms (Schelling 1960; Schelling 1978; Waltz 1979).

Daniel Kahneman’s Thinking, Fast and Slow (2011) has popularised the importance of cognitive biases. The focus on cognitive biases has fuelled the behavioural revolution in economics and finance. Cognitive biases have had less of an impact on foreign policy analysis (Mintz & DeRouen 2010). This is a little surprising given Allison’s insight that a state’s foreign policy actions often deviate substantially from the predictions of the rational actor model. As mentioned, Allison attributed these deviations to political and bureaucratic factors rather than systematic cognitive error. This does not mean that foreign policy decisions can be explained solely in terms of individual or social psychology. In fact, psychological explanations are often highly idiosyncratic (e.g. the Oedipal Bush Jr went to war against Iraq because Saddam Hussein had tried to kill Bush Sr), and Karl Popper would certainly have dismissed such explanations as unscientific. Governmental politics and organizational process models tell us a great deal about how the choices of ultimate decision-makers are constrained or influenced, and they suggest that what looks like a foreign policy decision is sometimes no decision at all but merely the output of lower-level bureaucracies. Foreign policy decisions are rarely simply cognitive exercises. Nonetheless, the question of to what extent empirically well-established cognitive biases influence foreign policy decisions is not asked often enough.

Granted, it can be difficult to establish an unambiguous empirical link between a cognitive bias and an actual decision, including foreign policy decisions. Whether cognitive biases were actually operative in a particular situation can only be established with the help of a detailed historical analysis. And even detailed historical analyses may fail to establish unambiguously that it was a cognitive bias that “did it”. After all, the reasoning and thought processes of decision-makers are often not known, and sometimes decision-makers themselves are not fully aware of the reasons (or biases) that made them take certain decisions. Here is JFK: “The essence of ultimate decision remains impenetrable to the observer – often, indeed, to the decider himself” (quoted in Allison 1971). It is nonetheless worthwhile to explore to what extent cognitive biases can in principle account for foreign policy decisions, even if a watertight empirical link is difficult to establish in practice.

The sunk cost fallacy refers to a situation where an actor continues a course of action because of the costs already incurred, even though it is not meeting his or her expectations. This may explain why armed conflicts last so long even if their military and political outcome can be predicted quite accurately. Why does the losing side in an armed conflict often fail to cut its losses early on rather than continue fighting to the end? Why did the US keep on fighting in Vietnam, even though it knew it was not going to win? Why did Germany fight on for more than two years after Stalingrad? A staggering two-thirds of all German military personnel losses occurred in 1944-45. The sunk cost fallacy can account for what on the face of it looks like irrational behaviour. Decision-makers attempt to recover the costs incurred by an action, even though these retrospective costs should be, economically speaking, irrelevant to any decision today. The focus should be on prospective costs (and benefits). However, cutting one’s losses is a difficult decision, not just psychologically but also politically. (This is why investment banks’ trading desks put in place stop-loss triggers to prevent traders from desperately seeking to recover their losses.) Having expended human and material resources in a war while failing to gain any tangible advantage makes it difficult to “cut one’s losses”. Agreeing to re-establish the status quo ante became next to impossible as WWI dragged on, even though the military stalemate of 1915 or 1916 would have strongly counselled it. Cognitive biases offer a possible psychological explanation. Of course, other explanations may also help account for why armed conflicts last longer than they rationally should, including the impact of a peace settlement on political leaders’ domestic standing, concern about a state’s international reputation in terms of its willingness to incur costs to defend allies (Kissinger 2003), or excessively optimistic assessments of the military situation and the prospects of winning the conflict by bureaucratic interests such as the military (McMaster 1998). The sunk cost effect may also explain why Japan, and especially the Japanese army, rejected US demands to withdraw from mainland China in 1940-41. The reluctance to give in to the US demands contained in the Hull note led Japan to attack Pearl Harbor in pursuit of a high-risk strategy that ultimately ended in total defeat (Utley 2005).
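To make the economic logic concrete, here is a minimal sketch in Python of a decision rule that ignores sunk costs. All numbers are invented purely for illustration, not historical estimates:

```python
# A rational decision rule compares prospective expected values and
# ignores costs already incurred (sunk costs).
# All numbers below are invented for illustration.

def expected_value(p_success: float, payoff: float, cost: float) -> float:
    """Expected prospective value of an option: P(success) * payoff - cost."""
    return p_success * payoff - cost

sunk_cost = 500  # resources already expended; irrelevant to the choice below
options = {
    "continue fighting": expected_value(p_success=0.2, payoff=1000, cost=400),  # -200.0
    "negotiate peace": expected_value(p_success=0.9, payoff=100, cost=50),      #   40.0
}

# sunk_cost appears in neither calculation: the option with the higher
# prospective value wins, regardless of what has already been spent.
best = max(options, key=options.get)
print(best, options[best])  # negotiate peace 40.0
```

The sunk cost fallacy amounts to smuggling `sunk_cost` back into the comparison in an attempt to “recover” it, which can flip the choice towards continuing.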

The way issues are framed is important in terms of the willingness of actors to take a risk-seeking or risk-avoiding course of action. The framing effect may have influenced Japan’s decision to attack Pearl Harbor. Prospect theory suggests that an actor facing a choice framed as a loss will tend to be more risk-seeking, and vice versa. Losses loom larger than equivalent gains. Framing US demands as a prospective loss and putting less value on the potential gains of improved relations with the US, Japan opted for a high-risk strategy. The Japanese leadership, including Yamamoto and Tojo, was acutely aware of just how high the risks were, but it seemed the only course of action that might allow Japan to avoid losses. Again, other explanatory approaches may help explain Japan’s decision. For instance, the Japanese military was wedded to a doctrine of ‘decisive battle’ à la Clausewitz and Mahan. It may have made strategic sense not to be seen as giving in to US pressure for fear that showing weakness might lead the US to make further demands that would diminish Japan’s position further. But the framing effect may well have informed the decision to opt for an extremely high-risk strategy that ultimately led to disaster.
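The underlying asymmetry can be stated formally. In Tversky and Kahneman’s (1992) formulation, with their commonly cited parameter estimates, the value function is

\[
v(x) = \begin{cases} x^{\alpha} & x \geq 0 \\ -\lambda(-x)^{\beta} & x < 0 \end{cases},
\qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25.
\]

Consider a stylised example (the stakes are invented for illustration): compare a sure loss of 100 with a gamble offering a 50% chance of losing 200 and a 50% chance of losing nothing. The expected values are identical, yet

\[
v(-100) = -2.25 \cdot 100^{0.88} \approx -129.5
\qquad \text{vs.} \qquad
0.5 \cdot v(-200) = -0.5 \cdot 2.25 \cdot 200^{0.88} \approx -119.1,
\]

so the gamble is preferred: diminishing sensitivity in the loss domain makes actors risk-seeking when choices are framed as losses.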

The recency effect and the availability heuristic lead decision-makers to overestimate the relevance and salience of recent experience. People often take mental shortcuts and fail to evaluate in what relevant ways the problem they are dealing with differs from their recent experience or from a recent event. This may help explain Britain’s catastrophic decision in 1956 to seize the Suez Canal. Anthony Eden had resigned from the cabinet in protest over Chamberlain’s appeasement policy before WWII. Following a horrific war that might have been avoided had the Western powers not pursued appeasement, it is perhaps not surprising that Eden came to see Nasser as another Hitler who needed to be confronted right away. While research suggests that heuristics can be quite useful and surprisingly efficient in terms of making decisions (Gigerenzer 1999), the complex and interactive nature of foreign policy decision-making suggests that analogies are decidedly less useful in international affairs. At a minimum, a systematic evaluation of the relevant similarities and differences is crucial to assess to what extent analogies or heuristics can offer useful guidance (Neustadt & May 1986). Often analogies are used to ‘sell’ foreign policy to a domestic or international audience. Truman, for instance, used the Nazi Germany analogy to justify US intervention in Korea in 1950. The analogy may or may not have been used primarily to rally domestic support. As a guide to action, the Nazi Germany analogy was flawed in important respects: in the case of Korea, it did not take into consideration the possibility of Soviet or Chinese counter-intervention, and it overestimated the degree to which North Korea represented a threat to the balance of power in East Asia. In the case of Suez, it failed to take into account the Eisenhower administration’s opposition to the UK-French-Israeli intervention. But it is easy to see how Eden, and perhaps Truman, came to see Hitlers everywhere given the recency bias.

The overconfidence effect occurs when a decision-maker’s subjective confidence in his or her ability is greater than his or her actual ability or performance. This bias may have led Germany astray in its decision to attack the USSR in 1941. Leaving aside strategic calculus and ideological motivation, the German military’s confidence that it could defeat the USSR in a matter of a few months was misplaced. The conflict lasted four years and ended with Germany’s defeat, a defeat largely attributable to the Red Army rather than the Western allies. Napoleon’s defeat at the hands of Russia should have counselled caution. Perhaps it did. And perhaps Germany had good reason to believe that the USSR would be defeated quickly. The Soviet army had lost a large number of senior officers during the Stalinist purges, and it had performed poorly in the 1939-40 Winter War against Finland. However, it is not difficult to see why the German military suffered from overconfidence. Superior Blitzkrieg tactics had helped it defeat Poland, Norway, Denmark, Belgium, Holland, France and Greece, not to mention the British in Crete. The standard account of Germany’s defeat often invokes strategic miscalculation. At the tactical-operational level, however, it was faulty intelligence, particularly about the number of Soviet divisions and the USSR’s ability to replenish them, that made a German victory next to impossible. Germany may also have overestimated its own capabilities due to the overconfidence effect. Napoleon’s decision to invade Russia in 1812 may similarly have been affected by the overconfidence bias. Both the grande armée and the Wehrmacht had won victory after victory – often against the odds. The overconfidence effect made it more difficult to properly assess the risks of invading Russia. To the extent that underestimating an opponent’s capabilities is the flipside of overestimating one’s own, both France and Germany fell afoul of the overconfidence effect.

The superiority illusion, elevated to a social-psychological level, may explain why Germany underestimated the USSR and the US underestimated Japan in military terms. The Nazi discourse of racial superiority and Slavic inferiority led German military intelligence to underestimate Soviet strength. Similarly, widespread US racist views of the Japanese, whose alleged short-sightedness supposedly made it impossible for Japan to have first-rate fighter pilots, led the US military to underestimate Japan’s capabilities (Dower 1986). Not many analysts believed that Japan would be able to pull off a carrier attack on Pearl Harbor, and in part this can be attributed to such racial stereotypes. It is difficult to establish to what extent key decision-makers themselves subscribed to such views and to what extent they regarded them as useful tools to rally political support for their policies. The superiority illusion may lead decision-makers astray not just when it comes to dealing with adversaries. The bias may also help explain why the US was confident that it could win the Vietnam war where France had failed. (Admittedly, the US was substantially more powerful than France was in the 1950s.) Similarly, the US in 2001 must have felt immensely superior to the USSR of 1980, and this led US policymakers to believe that they could succeed in Afghanistan where the USSR had failed a little more than a decade earlier.

The importance of groupthink in foreign policy decision-making was established by the classic work of Irving Janis (1972) on the failure of the Bay of Pigs invasion and need not be recounted in detail. The catastrophic decision to go ahead with the CIA-sponsored Bay of Pigs invasion can be attributed to groupthink. Not only were the key decision-makers a homogeneous group; young and inexperienced administration officials also deferred to experienced Eisenhower administration holdovers and failed to question key details of the operational plan. Groups can be highly effective in terms of producing high-quality decisions provided certain conditions are met, including diversity, independence, decentralization, aggregation and trust (Surowiecki 2004). Luckily, JFK learnt his lesson and avoided groupthink during the Cuban missile crisis.
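Surowiecki’s aggregation point can be illustrated with a toy simulation in Python (all parameters invented for illustration). When advisers’ errors are independent, averaging tends to cancel them out; when a shared bias dominates, as under groupthink, polling more opinions adds little:

```python
# Toy simulation: independent errors tend to cancel out when averaged;
# correlated (groupthink-like) errors do not. All parameters are invented.
import random

random.seed(42)
truth = 100.0
n_advisers = 25

# Independent advisers: each makes an unbiased but noisy estimate.
independent = [truth + random.gauss(0, 20) for _ in range(n_advisers)]

# Groupthink: one shared bias dominates; individual deviations are small.
shared_bias = random.gauss(0, 20)
groupthink = [truth + shared_bias + random.gauss(0, 2) for _ in range(n_advisers)]

for label, estimates in (("independent", independent), ("groupthink", groupthink)):
    mean = sum(estimates) / len(estimates)
    print(f"{label:12s} group estimate = {mean:6.1f} (truth = {truth})")
# The independent group's average typically lands much closer to the truth,
# because uncorrelated errors cancel; the shared bias does not average out.
```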

In the case of confirmation bias and cognitive dissonance, a decision-maker disregards evidence that tends to disconfirm the original hypothesis. Many historical accounts depict Stalin as refusing to believe that a German attack would take place in 1941, all evidence to the contrary notwithstanding. Had Germany defeated the USSR in late 1941, this might have gone down as the most egregious and far-reaching example of cognitive dissonance in history. Or it may not, for others have argued that the USSR, including Stalin, was fully aware that an attack was imminent. After all, three million soldiers - even along a 2,900 km long border that runs from the Black Sea to the Barents Sea - are difficult to hide. On this account, the USSR did not want to give Germany any pretext for an invasion in order to maintain the moral high ground (Weber 2019). During armed conflicts, military leaders often find it difficult to overcome cognitive dissonance if the incoming information suggests that the conflict is unwinnable. This may apply to Hitler in 1945 as much as to Westmoreland in 1968.
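One way to make confirmation bias precise is in Bayesian terms: an unbiased observer updates beliefs on disconfirming evidence, while a biased one explains it away. A stylised calculation, with probabilities invented purely for illustration: let H be the hypothesis “Germany will not attack this year” with prior \(P(H) = 0.7\), and let E be credible reports of German troop concentrations, with \(P(E \mid H) = 0.1\) and \(P(E \mid \neg H) = 0.8\). Bayes’ rule then gives

\[
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
= \frac{0.1 \cdot 0.7}{0.1 \cdot 0.7 + 0.8 \cdot 0.3} \approx 0.23.
\]

The evidence should cut confidence in H from 0.7 to roughly 0.23; confirmation bias keeps it near 0.7 by reinterpreting the reports as, say, disinformation, which in effect inflates \(P(E \mid H)\).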

Attributing another agent’s behaviour to their ‘nature’ (whatever that may be), but one’s own behaviour to situational factors, is called the fundamental attribution error. While others launch wars of choice because this is simply their nature (the Iraqi invasion of Kuwait in 1990), one is forced to wage war oneself due to situational factors (the US invasion of Iraq in 2003). (From the Iraqi point of view, it would no doubt be the other way around.) Given the existence of the Schlieffen plan and the prospect of a two-front war, the German general staff would have been reluctant to call WWI a war of choice following Russia’s partial mobilisation, while others of course believed that this is precisely what it was. More generally, Prussia’s and later Germany’s aggressive foreign policy backed up by military might was attributed to culture or nature rather than geographic and geopolitical position. A small, vulnerable, resource-constrained state (Prussia) or a larger but still vulnerable and resource-constrained one (Imperial Germany), faced with actual or perceived ‘encirclement’, will tend to rely on a strong military and pre-emptive military strategies (Jaeger 2019). If it does not, it may risk ending up like Poland at the end of the 18th century. Similarly, George Kennan’s famous article in Foreign Affairs (1947) attributed Soviet behaviour to Russia’s aggressive nature rather than the USSR’s sense of vulnerability following two massive invasions (Napoleon, Hitler) and a Western intervention in the Russian civil war in support of the Whites following WWI. The question of what determines behaviour, agency or structure, is an old one, and interesting answers have been proposed (Giddens 1984). Psychologically, overemphasizing dispositional and underemphasizing situational factors may lead to adverse outcomes that could have been avoided. Maybe Japan would have pursued a less aggressive foreign policy in the 1930s had it been possible to address its security concerns. If one views Japan as hopelessly militaristic and aggressive, such a potential solution is precluded right from the start.

Cognitive biases may not offer a complete explanation of catastrophic foreign policy decisions. Nonetheless, foreign policy makers would be well-advised to be aware of the pernicious effects cognitive biases (and analogies) may have on their decisions. Decision-makers should use analogies and heuristics cautiously and avoid applying them one-for-one. They should put in place a protocol, process and structure that help mitigate cognitive biases and avoid groupthink (as JFK did during the Cuban missile crisis). They should view their decisions in an interactive and systemic context and be aware of possible unintended consequences. And they should try to control the lower-level bureaucracy and monitor its actions in terms of the signals those actions might send to the other party. An awareness of cognitive biases, historical decision-making mistakes and potential traps may help improve the quality of decisions. It is to be hoped that the increased popularity and awareness of cognitive biases will help inform the decisions taken by foreign policymakers.