Monday, March 22, 2021

Biases & Evidence .... Fragment (2021)

“In many ways, thinking about how to do science is thinking about how not to fall afoul of cognitive and affective biases as well as social prejudices” (Richards Heuer)

"Analysts should be self-conscious about their reasoning processes. They should think about how they make judgments and reach conclusions, not just about the judgments and conclusions themselves" (Richards Heuer)

Here is a (disjointed and fragmented and unedited) list of things the seasoned analyst as well as the more casual observer of the social and political world should avoid and/ or be aware of.

Abduction -- process of getting to the best available conclusion in light of a given set of observations. COMMENT: Difficult to say what constitutes “best”. SEE ALSO: induction, deduction

Absence of evidence ≠ evidence of absence -- Just because there is no evidence for a hypothesis does not mean that the hypothesis is false. EXAMPLE: Rumsfeld’s response to the question of why no WMD had been found in post-invasion Iraq. SEE ALSO: streetlight effect or drunkard's search, dog that didn't bark, proving a negative

Affinity bias -- favorably biased towards people like us, including their views. SEE ALSO: in-group bias, groupthink, outgroup homogeneity bias

Ambiguity effect -- dislike of ambiguity. CHALLENGE: Analysts settle too quickly on an answer instead of retaining two or more competing hypotheses. “The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time and still retain the ability to function” (F. Scott Fitzgerald). SEE ALSO: cognitive closure, coherence, consistency, belief bias, confirmation bias, availability bias, recency bias

Analogies -- explaining by way of actual, but not necessarily relevant, similarities. EXAMPLE: (Infamous) Munich analogy. SEE ALSO: models, theories, cognitive economy

Anchoring -- heavy reliance on the first piece of available information. CHALLENGE: Resist going with the first plausible hypothesis. SEE ALSO: framing, explore and exploit

Availability bias -- belief that what can be easily recalled is in fact the case. EXAMPLE: Anthony Eden in 1956, recalling Germany’s remilitarization of the Rhineland. SEE ALSO: recency bias, streetlight effect and drunkard's search

Authority bias -- lend credence to what a person in authority says. Tends to increase in potency during times of crisis when cognitive confusion and uncertainty are particularly high. SEE ALSO: Milgram experiment, in-group bias, groupthink

Backfire effect -- disproven evidence has the paradoxical effect of confirming one's beliefs. Hardly rational, sadly very common. SEE ALSO: cognitive dissonance, Dunning-Kruger effect

Bandwagon effect -- adopt the beliefs of the majority. SEE ALSO: in-group bias, groupthink, affinity bias

Bayes’ theorem -- the holy grail! The holy grail? SEE ALSO: evidence, hypothesis
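
A minimal sketch of what Bayesian updating looks like in practice. The scenario and the numbers below are illustrative assumptions, not drawn from any of the examples in this list:

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E),
# where P(E) = P(E|H) * P(H) + P(E|not H) * (1 - P(H)).

def update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior probability of hypothesis H after observing evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Toy scenario: an analyst holds a 20% prior that a rival state intends to attack.
# A troop mobilisation is judged 70% likely if an attack is intended, 30% likely otherwise.
posterior = update(prior=0.20, p_e_given_h=0.70, p_e_given_not_h=0.30)
print(f"Posterior after observing the mobilisation: {posterior:.2f}")  # ~0.37
```

The point is less the arithmetic than the discipline: evidence is weighed by how much more likely it is under one hypothesis than under the alternatives, which is exactly what confirmation bias, belief bias and base rate neglect short-circuit.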

Belief bias -- subjective sense of plausibility overrides objective evidence. EXAMPLE: Stalin in 1941 just before the German invasion; maybe US military before Pearl Harbor. SEE ALSO: cognitive dissonance, confirmation bias, coherence


Bikeshedding -- mistaken focus on trivial issues (like rearranging the deck chairs on the Titanic). Very characteristic of bureaucracies. The news media and politicians knowingly/ unknowingly exploit it.

Blind spot bias -- recognizing biases and mistakes in others, but not oneself. SEE ALSO: fundamental attribution error

Bootstrapping -- resampling (with replacement) from sample data to estimate the variability of a statistic; see the sketch below. SEE ALSO: selection bias
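
A minimal sketch of the bootstrap idea, using only the standard library; the sample data are invented for illustration:

```python
import random
import statistics

# Observed sample (made-up numbers). The bootstrap resamples these values
# with replacement many times and uses the spread of the resampled statistic
# (here, the mean) as an estimate of its sampling variability.
sample = [2.1, 3.4, 2.9, 4.0, 3.1, 2.5, 3.8, 3.3, 2.7, 3.6]

boot_means = []
for _ in range(10_000):
    resample = random.choices(sample, k=len(sample))  # draw with replacement
    boot_means.append(statistics.mean(resample))

boot_means.sort()
lower, upper = boot_means[250], boot_means[9_749]  # rough 95% percentile interval
print(f"sample mean: {statistics.mean(sample):.2f}, "
      f"bootstrap 95% interval: ({lower:.2f}, {upper:.2f})")
```

The caveat implied by the SEE ALSO: resampling cannot repair a sample that was badly selected in the first place.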

Bounded rationality – funny that it should have taken Herbert Simon to suggest something that is obvious to most people. SEE ALSO: cognitive economy

Clustering illusion (or oversensitivity to consistency) -- seeing patterns in random data. May explain paranoia? SEE ALSO: coherence, cognitive economy, ambiguity effect

Cognitive closure -- preference for cognitive closure and certainty. Explains preference for stories, coherence, simplicity, and seeing patterns. CHALLENGE: when an explanation feels psychologically satisfying, it should be high time to challenge it. SEE ALSO: cognitive economy

Cognitive dissonance -- state of having inconsistent beliefs or thoughts or attitudes. CHALLENGE: Decision-makers seek to avoid it. Good analysts must not avoid it. SEE ALSO: coherence, confirmation bias, belief bias

Coherence -- holding beliefs that cohere rather than contradict each other. Cognitively soothing; uses up less human computing power. CHALLENGE: to be handled with caution. SEE ALSO: cognitive economy, consistency

Commitment bias -- stick to an idea or a position even though it is wrong. Probably was not very rational for the belligerents in WWI to keep fighting after the autumn of 1914. SEE ALSO: sunk cost fallacy, escalation of commitment, loss aversion, status quo bias, endowment effect

Confirmation bias -- tendency to interpret incoming evidence as supporting the initial hypothesis; assimilation of facts. CHALLENGE: important to question confirming evidence as intensively as disconfirming evidence. Or as Keynes quipped: “When the facts change, I change my mind. What do you do, sir?” SEE ALSO: Bayes’ theorem, Festinger’s aliens

Curse of knowledge -- we know something, so we assume everybody else knows it too. SEE ALSO: mirror imaging, empathy

Declinism -- romanticize the past and view the future negatively. May in part help explain populism and nativism in light of multiculturalism. SEE ALSO: impact bias

Deduction – derive hypotheses or conclusion from premises. CHALLENGE: How to decide when to reject the deduced hypotheses? SEE ALSO: induction, abduction

Disposition effect -- tendency to sell assets that have appreciated and hold on to assets that have depreciated. SEE ALSO: loss aversion, sunk cost fallacy

Distinction bias -- things appear more different when viewed together. A lot depends on the baseline. SEE ALSO: outgroup homogeneity bias, framing

Dog that didn’t bark -- available evidence is consistent with more than one hypothesis. CHALLENGE: Don’t settle on the first explanation that accounts for the available data. There was after all a person in the stable ... but the dog knew the person. SEE ALSO: absence of evidence, streetlight effect, drunkard's search, Rumsfeldian known unknowns

Drunkard’s search -- only looking for evidence where you can find evidence (easily). CHALLENGE: what evidence would you expect, and could it have gone missing? Big issue in evolutionary biology and paleontology, and in intelligence analysis, where evidence may be deliberately concealed. SEE ALSO: Rumsfeldian unknown unknowns, streetlight effect

Dunning-Kruger effect -- the less you know, the more confident you are. Experts are in fact more knowledgeable than laymen. But Tetlock’s foxes are better forecasters than hedgehogs. Being educated does in fact not make one more likely to be correct about climate change. Being educated is different from being an expert. SEE ALSO: knowledge illusion

Endowment effect -- value things that we own more than other things. SEE ALSO: Ikea effect, loss aversion, status quo bias

Evidence -- argument in support of a conclusion; something that bears on a hypothesis. SEE ALSO: Bayes, inference

Explanation -- What is an explanation? An answer to a why question? To a what question? Something that feels like an explanation. A far from trivial question! SEE ALSO: coherence, consistency

Explanatory bias -- tendency to attribute events to causes. SEE ALSO: cognitive economy

Fallacy of identity -- assumption that causes resemble their effects, e.g. that big events must have big causes.

False consensus -- more people agree with us than is actually the case. SEE ALSO: spotlight effect

Frame of reference -- changing the frame can help fit the evidence … and generate amazing explanations. Look at Einstein or heliocentrism. Bayesians are skeptical. SEE ALSO: structure of scientific revolutions, explanatory bias

Framing effect -- derive different conclusions from the same information depending on how it is presented. Probably the reason why bankers and lawyers dress well. CHALLENGE: Negative frames tend to stick (e.g. “Sleepy Joe”, “Crooked Hillary”). SEE ALSO: negative partisanship

Fundamental attribution error -- attribute others’ behaviour to dispositional and one’s own to situational factors. PARADOXICALLY: people also overestimate their impact on others’ behaviour. This appears logically somewhat inconsistent with the belief that the other’s behaviour is driven by dispositional factors. EXAMPLE: Japanese and German expansionism pre-WWII is often attributed to dispositional rather than situational factors. SEE ALSO: empathy, mirror imaging, self-serving bias


Groupthink -- desire for conformity and harmony in a group. Explains the Bay of Pigs disaster. SEE ALSO: in-group bias, bandwagon effect

Halo effect -- tendency to be influenced by previous judgment/ performance when evaluating somebody or something. SEE ALSO: coherence, consistency

Hard/easy effect (or discriminability effect) -- tendency to be overconfident on hard tasks and underconfident on easy ones. EXAMPLE: the Bush administration underestimated how difficult the post-invasion occupation of Afghanistan and Iraq would be. SEE ALSO: knowledge illusion

Hindsight bias -- memory bias: past events appear to have been predictable all along. EXAMPLE: students misremembered their own earlier forecasts about Nixon’s trip to China. SEE ALSO: self-serving bias

Hot-hand fallacy -- a streak in what is in fact a random process is expected to continue in the future. SEE ALSO: clustering, coherence, consistency

Hyperbolic discounting -- value the present more than the future, more than we should. SEE ALSO: endowment effect

Hypothesis – often based on assumptions. In 1941 both Stalin and Admiral Kimmel based their hypotheses on the wrong assumptions. SEE ALSO: Bayes, selection bias

Illusion of control -- again, people don’t like randomness. SEE ALSO: explanatory bias

Illusion of explanatory depth -- mistaken belief that one understands the world at a deeper level than is the case. Check out the so-called bicycle problem. SEE ALSO: Dunning-Kruger effect

Illusory correlation -- perceiving a relationship between variables where none exists. SEE ALSO: cluster illusion, hot hand fallacy

Illusory truth effect -- believe something that is repeated sufficiently often. SEE ALSO: cognitive economy

Illusory validity -- overconfident in our own forecasts. SEE ALSO: overconfidence bias, hard/ easy effect

Impact bias (or durability bias) – overestimate length and / or intensity of future emotional state. SEE ALSO: declinism

Inattentional blindness -- failing to notice a fully visible but unexpected object when attention is focused elsewhere. EXAMPLE: the invisible gorilla experiment. SEE ALSO: cognitive economy, Von Restorff effect

Induction -- inference from particular cases to the general case (if only in probabilistic terms). SEE ALSO: deduction, abduction and inference-to-the-best explanation

Inference -- conclusion reached on the basis of evidence or reasoning. SEE ALSO: induction, deduction, abduction

In-group favoritism -- tendency to favor members of one's own group. SEE ALSO: authority bias, fundamental attribution error

Knowledge illusion -- believing one knows more than one does, because knowledge held by other people and sources is mistaken for one’s own. SEE ALSO: Dunning-Kruger effect

Known knowns, known unknowns & unknown unknowns -- A Rumsfeld classic, but one that raises serious epistemological issues. Interestingly, the memo the then Secretary of Defense wrote talked about “unknown knowns” … which is also worth contemplating as an epistemological concept. SEE ALSO: wild card, streetlight effect

Law of large numbers -- as the sample size grows, the sample mean approaches the mean of the entire population; see the simulation sketch below. SEE ALSO: central limit theorem
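
A quick simulation of the idea; the coin-flip setup and the sample sizes are arbitrary illustrations:

```python
import random

# Law of large numbers: the running mean of fair coin flips (1 = heads)
# drifts towards the population mean of 0.5 as the sample grows.
random.seed(42)
flips = [random.randint(0, 1) for _ in range(100_000)]

for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n = {n:>6}: running mean = {sum(flips[:n]) / n:.3f}")
```

Contrast with the law of small numbers below: small samples routinely stray far from the population mean.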

Law of small numbers -- belief that small samples resemble population from which they are drawn. SEE ALSO: cognitive economy

Loss aversion -- value avoiding losses more than making gains. EXAMPLE: Japan feeling US geoeconomic pressure and framing it as a loss led it to double down and attack Pearl Harbor. SEE ALSO: framing effect, endowment effect, sunk costs

Mirror imaging -- assume others think the same way we do. One reason for the intelligence failure that was Pearl Harbor was the US belief that it would be irrational for Japan to attack the economically much more powerful US. SEE ALSO: fundamental attribution error, empathy


Mundus vult decipi, ergo decipiatur (“the world wants to be deceived, so let it be deceived”) -- Biases, cognitive and affective, facilitate deception and lies – not just in our age of post-truth politics.

Narcissism -- excessive admiration of oneself. Often related to intelligence failure. SEE ALSO: hedgehogs and foxes, sociopaths vs psychopaths

Omission bias -- tendency to favor inaction over action; harmful omissions are judged less harshly than harmful acts. Very prevalent in hierarchical bureaucracies. In part explains the performance of the Red Army during the opening stages of Barbarossa.

Outcome bias -- judging a decision by its outcome rather than by its quality at the time it was made (a selection effect). SEE ALSO: selection bias, survivor bias

Outgroup homogeneity bias – perceive members of the out-group as homogenous and in-group as diverse. SEE ALSO: in-group bias, cognitive economy

Overattribution -- overestimating the effect of one’s own actions on others’ behaviour. EXAMPLE: Stalin believed that mobilising Soviet troops would provoke a German attack. SEE ALSO: spotlight effect

Overconfidence -- subjective confidence in judgments, decisions, ability is greater than their objective accuracy. SEE ALSO: narcissism, hindsight effect

Overjustification effect -- once rewards/ compensation is received, motivation decreases. Sometimes an appeal to altruism is more effective than offering compensation. SEE ALSO: extrinsic vs intrinsic motivation

Planning fallacy -- predictions about how long a future task will take are too optimistic (complexity?). EXAMPLES: the Bush administration’s optimistic assessment of how long and how easily post-war Afghanistan/ Iraq could be stabilised; Germany underestimating the USSR in 1941. SEE ALSO: complexity

Pre-mortem -- assume an outcome, particularly one that one deems improbable, and ask what would need to happen for it to occur. SEE ALSO: post-mortem

Probability -- Average or probabilistic forecasts are not all that relevant given high stakes (poker > chess). In intelligence analysis, probability is a degree of subjective belief. SEE ALSO: central limit theorem, unbiased estimator etc. etc. etc.

Probabilities, biases --
Base rate fallacy (or base rate neglect) -- tendency to attach greater weight to event-specific information than to underlying base rates (see the worked sketch below);
Berkson’s paradox;
Conjunction fallacy (or Linda problem) -- belief that the conjunction of two events is more likely than one of them alone;
Defender’s fallacy -- dismissing evidence because many others share the matching characteristic;
Disjunction fallacy (CARDS) -- judging a disjunctive statement to be less probable than one of its components;
Gambler’s fallacy -- presumption that past outcomes of independent random events influence future ones;
Multiple comparison fallacy (birthday paradox);
Prosecutor’s fallacy -- confusing P(A|B) and P(B|A);
Regression fallacy -- mistaking mean reversion for a causal effect => KEY ISSUE: when do structural/ regime breaks occur;
Representativeness bias;
Simpson’s paradox (e.g. the Berkeley admissions case);
Will Rogers paradox
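
A worked sketch of the base rate fallacy (and of the prosecutor’s fallacy, which trades on the same confusion). The test accuracy and base rate are illustrative assumptions:

```python
# A test that is 99% sensitive with a 1% false positive rate, applied to a
# condition with a 1-in-1,000 base rate, still yields mostly false positives.
base_rate = 0.001            # P(condition)
sensitivity = 0.99           # P(positive | condition)
false_positive_rate = 0.01   # P(positive | no condition)

p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.3f}")  # ~0.09
# Reading the 99% sensitivity as "a positive test means a 99% chance of the
# condition" confuses P(positive | condition) with P(condition | positive) --
# the prosecutor's fallacy noted above.
```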


Proving a negative -- is it really impossible to do? EXAMPLE: Was Bush’s war on terror effective? SEE ALSO: counterfactuals, absence of evidence, streetlight effect

Pygmalion effect -- high expectations of a person improves her performance (and vice versa). 

Reference class problem -- determining the probability of a single event when the event can be part of many different reference categories. SEE ALSO: probability

Repetition bias -- willingness to believe what one has been told most often and/ or by the greatest number of sources. SEE ALSO: availability heuristics, groupthink

Risk -- Risk is ignorance that can be quantified. Uncertainty cannot. It is important to distinguish between ontological (or fundamental) and epistemological uncertainty. SEE ALSO: probability, uncertainty

Salience bias -- predisposes analysts towards data that are more prominent or emotionally striking. SEE ALSO: availability heuristics, recency bias

Science -- “advances one funeral at a time” (Max Planck).

Selection bias -- failure to randomise, so the sample is not representative of the population. SEE ALSO: survivorship bias

Self-serving bias -- our failures are situational, our successes are our responsibility. SEE ALSO: fundamental attribution error, structuration, structure/ agency, group/ individual

Signal-to-noise ratio -- Roberta Wohlstetter's classic explanation of Pearl Harbor intelligence failure. SEE ALSO: probability

Situational logic -- intelligence analysts and foreign policy crisis managers are forced to work out what is going on and/ or is going to happen. Theories and comparative history may be of some, but often limited, help. SEE ALSO: abduction

Spotlight effect -- overestimate how much others pay attention to us. But remember, even schizophrenics have enemies (quote attributed to Henry Kissinger). SEE ALSO: drunkard's search

Status quo bias -- prefer things to stay the same. SEE ALSO: endowment effect

Stereotyping -- belief that members of a group will have certain characteristics despite having no information about the particular individual. Mobilised especially aggressively during wartime. SEE ALSO: cognitive economy, in-group bias, outgroup homogeneity bias

Sunk cost fallacy (or escalation of commitment) – tendency to keep investing in an effort to recoup prior losses. CHALLENGE: too influenced by what went before. EXAMPLE: Nixon/ Kissinger can do what LBJ could not. SEE ALSO: loss aversion, status quo bias

Survivorship bias -- failure to recognise that the sample is the product of a selection process. CHALLENGE: biases our conclusions. EXAMPLE: the Wald problem (where to armor WWII bombers). SEE ALSO: selection bias

Uncertainty -- unquantifiable ignorance; comes in epistemological and ontological form. SEE ALSO: probability, risk

Value trade-off -- avoid affectively painful trade-offs related to one’s beliefs. CHALLENGE: biases conclusions. EXAMPLE: Robert Jervis points out that most people who opposed the Iraq war as unnecessary also believed that the post-war situation would be a mess, while those supporting it believed that post-war stabilization would be easy. Both opponents and proponents avoided potentially painful value trade-offs, even though the two propositions are logically unrelated. SEE ALSO: cognitive economy, coherence, consistency

Von Restorff effect -- distinctive items are remembered better than their surroundings; more generally, memories are representations, not exact copies. CHALLENGE: tendency to misremember

Wild card/ black swan -- Low probability/ high impact event. CHALLENGE: tendency to underestimate the frequency with which such events occur. SEE ALSO: probability, estimating averages

Wednesday, March 17, 2021

Germany Between a Rock and a Hard Place (2021)

The Biden administration has just issued its Interim National Security Strategic Guidance. The guidance document states the need to “build back better at home” and acknowledges that “international economic policies must serve all Americans” – a theme often referred to as “foreign policy for the middle class”. While the interim guidance does not preclude cooperation with China in selected policy areas, it is unambiguous in considering China a strategic competitor. The prospect of intensifying China-US geopolitical and (geo)economic competition is bad news for Germany, which has high value trading and investment relationships with both countries.

Thursday, March 11, 2021

Sino-US Competition and Waltz’s Third Image (2021)

A third-image explanation can account for why Sino-US competition has arisen and why it is here to stay (Waltz 1959). It so happens that domestic-level factors in both the US and China also point towards continuing and intensified competition.

Conflicting strategic objectives

For more than a century, US grand strategy has consisted of preventing the emergence of a hegemonic power on either side of the Eurasian landmass. The emergence of a hegemonic power in Europe or East Asia would have the potential to threaten US security and it might lead to the economic exclusion of the US from major overseas markets. Whenever a state threatened to dominate Europe or East Asia, the US ended up going to war. This was the case in WWI, WWII and the Cold War. 

China, on the other hand, considers the extensive US military presence and US alliances in Asia a potential threat to its security as well as its domestic political stability. China’s increasing integration into the international economy over the past four decades has translated into increased dependence on overseas trade, creating hitherto non-existent economic and political vulnerabilities, particularly with respect to strategic, essential imports like energy. Not surprisingly, China is eager to mitigate these vulnerabilities (e.g. BRI, blue water navy, RCEP) and, thanks to its economic rise, it is now in a position to increasingly contest the US military presence in Asia. China sees the US presence in the region as providing Washington with the means to destabilize China, economically and politically, while China’s increasing power and assertiveness have the US worried about its own strategic position in East Asia. In short, conflicting strategic interests and a classic security dilemma strongly favor Sino-US competition, if not confrontation.


Increasing security and economic competition

China’s military modernization is increasingly challenging the US position in Asia. In addition to its investment in asymmetric capabilities, China’s maritime strategy of “near seas defense and far seas protection” as well as the construction of a blue water navy is not surprisingly considered a threat by Washington. But the construction of a blue water navy to defend sea lanes is only logical given China’s increased dependence on international trade. So is its so-called string of pearls policy that is meant to support China’s international trade and sea lane protection. Meanwhile, US military doctrine and concepts such as offshore control heighten Chinese security concerns about open sea access and chokepoints. Beijing is challenging the territorial and especially maritime status quo in the East and South China Seas in order to push out its security perimeter. In other words: China and the US confront a classic security dilemma.

It is not surprising that security competition has spilled over into the economic and technological realm, undermining further any residual trust that may have existed between China and the US. Washington has not been shy about weaponizing economic interdependence by, for example, limiting Chinese access to key US technologies or imposing tariffs on Chinese imports. Being in the weaker position, China’s response has been relatively muted (or proportional). But it has spurred China to address the risks and vulnerabilities arising from economic interdependence through diversification (e.g. EU-China investment treaty), regional economic integration (RCEP, BRI) and greater emphasis on indigenous economic and technological capabilities (e.g. Made in China 2025, “dual circulation”). All of these policies aim to reduce China’s economic-technological vulnerability in the context of intensifying Sino-US security competition. Technological competition has kicked into full gear and both sides are proactively seeking to limit their dependence on each other in terms of other critical goods (e.g. rare earth, semiconductors). Security competition is negatively affecting bilateral economic relations. High politics beats low politics.

US domestic attitudes towards China have hardened

Only a little more than a decade ago, senior US government officials were hoping that China – benefitting from the existing international economic and political order – would emerge as a “responsible stakeholder”. It was believed (or hoped) that economic modernization would lead to domestic political liberalization, which – via democratic peace theory – might in turn allow China to rise peacefully. This has not come to pass.

In fact, it is next to impossible to find any senior government official, member of congress or think tank policy wonk in Washington today who is not a China hawk. For a start, virtually all the Biden administration officials occupying senior positions in the national security and economic realm are firmly critical of China. This is not surprising. Most of them served in the Obama administration and experienced first-hand how the US-China relationship deteriorated. In particular, President Xi reneging on his 2015 promise to President Obama not to militarize the disputed artificial islands in the South China Sea is often cited as a wake-up call. Then again, ultimately the drivers of competition are structural.

Biden administration officials are not ideological China bashers, in spite of the administration’s renewed emphasis on a value-based US foreign policy. They accept strategic competition as a fact of life but strive for “competition without catastrophe”. They also believe that it is necessary to engage and seek cooperation with China in selected areas (e.g. climate, public health), but they accept that the overall relationship is and will remain competitive, even adversarial. Across the administration, across the Washington political class and across the think tank world, there is virtual unanimity with respect to the need to compete with, and even confront, China. 

Not only is a hawkish China policy one of the very few issues that command bipartisan support among the political class these days; according to Pew Research, public opinion is also taking an increasingly negative view of China. A recent Pew poll showed that three-quarters of Americans have a negative view of China. Although some of the negative polling is due to the pandemic, popular opinion is unambiguously anti-China. Few elections will be won by being “soft” on China. Corporate America has – if not turned against China – become more China-skeptical. Limited access to Chinese markets, Chinese industrial, competition and technology policies disadvantaging foreign companies and, more generally, greater Chinese competitiveness have led many (but not all) sectors to become warier of China. US domestic grievances abound. In short, both international-systemic factors and domestic political dynamics point to a more competitive and hawkish US China policy.