Monday, March 22, 2021

Biases & Evidence .... Fragment (2021)

“In many ways, thinking about how to do science is thinking about how not to fall afoul of cognitive and affective biases as well as social prejudices” (Richards Heuer)

"Analysts should be self-conscious about their reasoning processes. They should think about how they make judgments and reach conclusions, not just about the judgments and conclusions themselves" (Richards Heuer)

Here is a (disjointed, fragmented, and unedited) list of things the seasoned analyst, as well as the more casual observer of the social and political world, should avoid and/or be aware of.

Abduction -- process of getting to the best available conclusion in light of a given set of observations. COMMENT: Difficult to say what constitutes “best”. SEE ALSO: induction, deduction

Absence of evidence ≠ evidence of absence -- Just because there is no evidence for a hypothesis does not mean that the hypothesis is false. EXAMPLE: Rumsfeld’s response to the question of why no WMD were found in post-invasion Iraq. SEE ALSO: streetlight effect or drunkard's search, dog that didn't bark, proving a negative

Affinity bias -- favorably biased towards people like us, including their views. SEE ALSO: in-group bias, groupthink, outgroup homogeneity bias

Ambiguity effect -- dislike of ambiguity. CHALLENGE: Analysts settle too quickly on an answer instead of retaining two or more competing hypotheses. “The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time and still retain the ability to function” (F. Scott Fitzgerald). SEE ALSO: cognitive closure, coherence, consistency, belief bias, confirmation bias, availability bias, recency bias

Analogies -- explaining by way of actual, but not necessarily relevant, similarities. EXAMPLE: (Infamous) Munich analogy. SEE ALSO: models, theories, cognitive economy

Anchoring -- heavy reliance on the first piece of available information. CHALLENGE: Resist going with the first plausible hypothesis. SEE ALSO: framing, explore and exploit

Availability bias -- belief that what can be easily recalled is in fact the case. EXAMPLE: Anthony Eden in 1956, recalling Germany’s militarization of the Rhineland. SEE ALSO: recency bias, streetlight effect and drunkard's search

Authority bias -- lend credence to what a person in authority says. Tends to increase in potency during times of crisis when cognitive confusion and uncertainty are particularly high. SEE ALSO: Milgram experiment, in-group bias, groupthink

Backfire effect -- evidence that disproves a belief has the paradoxical effect of strengthening it. Hardly rational, sadly very common. SEE ALSO: cognitive dissonance, Dunning-Kruger effect

Bandwagon effect -- adopt the beliefs of the majority. SEE ALSO: in-group bias, groupthink, affinity bias

Bayes’ theorem -- the holy grail! The holy grail? SEE ALSO: evidence, hypothesis
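
For concreteness, here is a minimal sketch (in Python, with purely illustrative numbers) of what Bayesian updating amounts to: a prior belief in a hypothesis is revised in light of how likely the evidence is under the hypothesis and under its negation.

```python
# Minimal illustration of Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E).
# All numbers below are illustrative assumptions, not taken from the text.

def posterior(prior_h: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H|E) given a prior and the likelihood of the evidence under H and not-H."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# An analyst starts with a 30% prior that a hypothesis is true and judges the
# new evidence three times more likely under H than under not-H.
print(posterior(prior_h=0.30, p_e_given_h=0.60, p_e_given_not_h=0.20))
# -> 0.5625: the evidence raises the probability of H, but nowhere near certainty.
```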

Belief bias -- subjective sense of plausibility overrides objective evidence. EXAMPLE: Stalin in 1941 just before the German invasion; maybe US military before Pearl Harbor. SEE ALSO: cognitive dissonance, confirmation bias, coherence


Bikeshedding -- mistaken focus on trivial issues (like rearranging the deck chairs on the Titanic). Very characteristic of bureaucracies. The news media and politicians knowingly/unknowingly exploit it.

Blind spot bias -- recognizing biases and mistakes in others, but not oneself. SEE ALSO: fundamental attribution error

Bootstrapping -- estimating the variability of a statistic by repeatedly resampling (with replacement) from the observed sample. SEE ALSO: selection bias
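
A minimal sketch, with made-up data, of what bootstrapping does in practice: resample the observed sample with replacement many times to gauge how much a statistic (here, the mean) could plausibly vary.

```python
# Bootstrap sketch: approximate the sampling distribution of the mean by resampling.
# The data values and the number of resamples are illustrative assumptions.
import random

data = [2.3, 1.9, 3.1, 2.8, 2.2, 3.5, 1.7, 2.9]  # hypothetical observations
n_resamples = 10_000

boot_means = []
for _ in range(n_resamples):
    resample = [random.choice(data) for _ in data]   # draw with replacement
    boot_means.append(sum(resample) / len(resample))

boot_means.sort()
lo = boot_means[int(0.025 * n_resamples)]
hi = boot_means[int(0.975 * n_resamples)]
print(f"bootstrap 95% interval for the mean: [{lo:.2f}, {hi:.2f}]")
```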

Bounded rationality – funny that it should have taken Herbert Simon to point out something that is obvious to most people. SEE ALSO: cognitive economy

Clustering illusion (or oversensitivity to consistency) -- seeing patterns in random data. May explain paranoia? SEE ALSO: coherence, cognitive economy, ambiguity effect

Cognitive closure -- preference for cognitive closure and certainty. Explains preference for stories, coherence, simplicity, and seeing patterns. CHALLENGE: when an explanation feels psychologically satisfying, it should be high time to challenge it. SEE ALSO: cognitive economy

Cognitive dissonance -- state of having inconsistent beliefs or thoughts or attitudes. CHALLENGE: Decision-makers seek to avoid it. Good analysts must not avoid it. SEE ALSO: coherence, confirmation bias, belief bias

Coherence -- holding beliefs that cohere rather than contradict each other. Cognitively soothing; uses up less human computing power. CHALLENGE: to be handled with caution. SEE ALSO: cognitive economy, consistency

Commitment bias -- sticking to an idea or a position even though it is wrong. It probably was not very rational for the belligerents in WWI to keep fighting after the autumn of 1914. SEE ALSO: sunk cost fallacy, escalation of commitment, loss aversion, status quo bias, endowment effect

Confirmation bias -- tendency to interpret incoming evidence as supporting the initial hypothesis; assimilation of facts. CHALLENGE: important to question confirming evidence as intensively as disconfirming evidence. Or as Keynes quipped: “When the facts change, I change my mind. What do you do, sir?” SEE ALSO: Bayes’ theorem, Festinger’s aliens

Curse of knowledge -- assuming that because we know something, everybody else knows it too. SEE ALSO: mirror imaging, empathy

Declinism -- romanticize the past and view the future negatively. May in part help explain populism and nativism in light of multiculturalism. SEE ALSO: impact bias

Deduction – deriving hypotheses or conclusions from premises. CHALLENGE: How to decide when to reject the deduced hypotheses? SEE ALSO: induction, abduction

Disposition effect -- tendency to sell assets that have appreciated and hold on to assets that have depreciated. SEE ALSO: loss aversion, sunk cost fallacy

Distinction bias -- things appear more different when viewed together. A lot depends on the baseline. SEE ALSO: outgroup homogeneity bias, framing

Dog that didn’t bark -- available evidence is consistent with more than one hypothesis. CHALLENGE: Don’t settle on the first explanation that accounts for the available data. There was after all a person in the stable ... but the dog knew the person. SEE ALSO: absence of evidence, streetlight effect, drunkard's search, Rumsfeldian known unknowns

Drunkard’s search -- only looking for evidence where you can find evidence (easily). CHALLENGE: what evidence would you expect, and could it have gone missing? Big issue in evolutionary biology and paleontology, and in intelligence analysis, where evidence may be being concealed. SEE ALSO: Rumsfeldian unknown unknowns, streetlight effect

Dunning-Kruger effect -- the less you know, the more confident you are. Experts are in fact more knowledgeable than laymen. But Tetlock’s foxes are more knowledgeable than hedgehogs. Being educated does in fact not make one more likely to be correct about climate change. Being educated is different from being an expert. SEE ALSO: knowledge illusion

Endowment effect -- value things that we own more than other things. SEE ALSO: Ikea effect, loss aversion, status quo bias

Evidence -- argument in support of a conclusion; something that bears on a hypothesis. SEE ALSO: Bayes, inference

Explanation -- What is an explanation? An answer to a why question? To a what question? Something that merely feels like an explanation? A far from trivial question! SEE ALSO: coherence, consistency

Explanatory bias -- tendency to attribute events to causes. SEE ALSO: cognitive economy

Fallacy of identity -- the assumption that big effects must have big causes (and vice versa).

False consensus -- belief that more people agree with us than is actually the case. SEE ALSO: spotlight effect

Frame of reference -- changing the frame can help fit the evidence … and generate amazing explanations. Look at Einstein or heliocentrism. Bayesians are skeptical. SEE ALSO: structure of scientific revolutions, explanatory bias

Framing effect -- derive different conclusions from the same information depending on how it is presented. Probably the reason why bankers and lawyers dress well. CHALLENGE: Negative frames tend to stick (e.g. “Sleepy Joe”, “Crooked Hillary”). SEE ALSO: negative partisanship

Fundamental attribution error -- attribute others’ behaviour to dispositional and one’s own to situational factors. PARADOXICALLY: people also overestimate their impact on others’ behaviour. This appears logically somewhat inconsistent with the belief that the other’s behaviour is driven by dispositional factors. EXAMPLE: Japanese and German expansionism pre-WWII is often attributed to dispositional rather than situational factors. SEE ALSO: empathy, mirror imaging, self-serving bias


Groupthink -- desire for conformity and harmony in a group. Explains the Bay of Pigs disaster. SEE ALSO: in-group bias, bandwagon effect

Halo effect -- tendency to be influenced by previous judgment/ performance when evaluating somebody or something. SEE ALSO: coherence, consistency

Hard/easy effect (or discriminability effect) -- tendency to be overconfident about hard tasks and underconfident about easy ones. EXAMPLE: The Bush administration underestimated how difficult the post-invasion occupation of Afghanistan and Iraq would be. SEE ALSO: knowledge illusion

Hindsight bias -- tendency to see past events as more predictable than they were; a memory bias. EXAMPLE: students misremembered their own predictions about Nixon’s visit to China once the outcome was known. SEE ALSO: self-serving bias

Hot-hand fallacy -- belief that a streak in a random process will continue in the future. SEE ALSO: clustering, coherence, consistency

Hyperbolic discounting -- value the present more than the future, more than we should. SEE ALSO: endowment effect

Hypothesis – often based on assumptions. In 1941 both Stalin and Admiral Kimmel based their hypotheses on the wrong assumptions. SEE ALSO: Bayes, selection bias

Illusion of control -- again, people don’t like randomness. SEE ALSO: explanatory bias

Illusion of explanatory depth -- mistaken belief that one understands the world at a deeper level than is the case. Check out the so-called bicycle problem. SEE ALSO: Dunning-Kruger effect

Illusory correlation -- perceiving a relationship between variables when none exists in the data. SEE ALSO: clustering illusion, hot-hand fallacy

Illusory truth effect -- believe something that is repeated sufficiently often. SEE ALSO: cognitive economy

Illusory validity -- overconfident in our own forecasts. SEE ALSO: overconfidence bias, hard/ easy effect

Impact bias (or durability bias) – overestimate length and / or intensity of future emotional state. SEE ALSO: declinism

Inattentional blindness -- failure to notice a fully visible but unexpected object when attention is engaged elsewhere. EXAMPLE: the “invisible gorilla” experiment. SEE ALSO: cognitive economy, Von Restorff effect

Induction -- inference from particular cases to the general case (if only in probabilistic terms). SEE ALSO: deduction, abduction and inference-to-the-best explanation

Inference -- conclusion reached on the basis of evidence or reasoning. SEE ALSO: induction, deduction, abduction

In-group favoritism -- tendency to favor members of one's own group. SEE ALSO: authority bias, fundamental attribution error

Knowledge illusion -- mistaken belief that we know more than we do, in part because much of “our” knowledge in fact resides in other people’s heads. SEE ALSO: Dunning-Kruger effect

Known knowns, known unknown & unknown unknowns -- A Rumsfeld classic, but one that raises serious epistemological issues. Interestingly, the memo the then Secretary of Defense wrote talked about “unknown knowns” … which is also worth contemplating as an epistemological concept. SEE ALSO: wild card, streetlight effect

Law of large numbers -- as the sample size grows, the sample mean approaches the mean of the entire population. SEE ALSO: central limit theorem
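
A quick simulation, with an arbitrary seed and illustrative sample sizes, showing the law in action for fair coin flips: small samples can stray far from the true probability (the "law of small numbers" trap in the next entry), large ones hug it closely.

```python
# The running mean of fair coin flips drifts towards the true probability (0.5)
# as the sample grows. Seed and sample sizes are illustrative assumptions.
import random

random.seed(1)  # for reproducibility
flips = [random.random() < 0.5 for _ in range(100_000)]

for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, sum(flips[:n]) / n)
```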

Law of small numbers -- belief that small samples resemble population from which they are drawn. SEE ALSO: cognitive economy

Loss aversion -- value avoiding losses more than making gains. EXAMPLE: Japan, feeling US geoeconomic pressure and framing it as a loss, doubled down and attacked Pearl Harbor. SEE ALSO: framing effect, endowment effect, sunk costs

Mirror imaging -- assume others think the same way we do. One reason for the intelligence failure that was Pearl Harbor was the US belief that it would be irrational for Japan to attack the economically much more powerful US. SEE ALSO: fundamental attribution error, empathy


Mundus vult decipi, ergo decipiatur (“the world wants to be deceived, so let it be deceived”) -- Biases, cognitive and affective, facilitate deception and lies – not just in our age of post-truth politics.

Narcissism -- excessive admiration of oneself. Often related to intelligence failure. SEE ALSO: hedgehogs and foxes, sociopaths vs psychopaths

Omission bias -- judging harmful inaction as more acceptable than equally harmful action; a preference for doing nothing. Very prevalent in hierarchical bureaucracies. In part explains the performance of the Red Army during the opening stages of Barbarossa.

Outcome bias -- judging a decision by its outcome rather than by its quality at the time it was made (a selection effect). SEE ALSO: selection bias, survivorship bias

Outgroup homogeneity bias – perceive members of the out-group as homogenous and in-group as diverse. SEE ALSO: in-group bias, cognitive economy

Overattribution -- overestimating the influence of one’s own actions on others’ behaviour. EXAMPLE: Stalin believed that mobilising Soviet troops would provoke a German attack. SEE ALSO: spotlight effect

Overconfidence -- subjective confidence in judgments, decisions, ability is greater than their objective accuracy. SEE ALSO: narcissism, hindsight effect

Overjustification effect -- once rewards/ compensation is received, motivation decreases. Sometimes an appeal to altruism is more effective than offering compensation. SEE ALSO: extrinsic vs intrinsic motivation

Planning fallacy -- predictions about how long a future task will take are too optimistic (complexity?). The Bush administration’s optimistic assessment of how long it would take, and how easy it would be, to stabilise post-war Afghanistan and Iraq. Germany underestimating the USSR in 1941. SEE ALSO: complexity

Pre-mortem -- assume an outcome, particularly one that one deems improbable, and ask what would need to happen for it to occur. SEE ALSO: post-mortem

Probability -- Average or probabilistic forecasts are not all that relevant when the stakes are high (think poker rather than chess). In intelligence analysis, probabilities are degrees of subjective belief. SEE ALSO: central limit theorem, unbiased estimator, etc.

Probabilities, biases --
Base rate fallacy (or base rate neglect) – tendency to attach greater weight to event-specific information than to the underlying base rates;
Berkson’s paradox;
Conjunction fallacy (or Linda problem) – belief that the conjunction of two things is more likely than either of them alone;
Defender’s fallacy -- dismissing evidence on the grounds that many other individuals would also match it;
Disjunction fallacy (CARDS) – judging a disjunctive statement as less probable than any of its components;
Gambler’s fallacy -- presumption that past outcomes of a random process influence future ones;
Multiple comparison fallacy (birthday paradox);
Prosecutor’s fallacy -- confusing P(A|B) and P(B|A);
Regression fallacy -- attributing ordinary mean reversion to a specific cause => KEY ISSUE: when do structural/regime breaks occur;
Representativeness bias;
Simpson’s paradox (Berkeley admissions);
Will Rogers paradox
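
By way of illustration, a short worked example of the base rate fallacy with assumed numbers: a “90% reliable” report about a rare event still leaves the event unlikely, because the base rate dominates.

```python
# Base rate fallacy, worked with illustrative numbers: the source is right 90% of
# the time, but the reported event has a base rate of only 1%.
base_rate = 0.01                  # P(event)
p_report_given_event = 0.90       # true positive rate
p_report_given_no_event = 0.10    # false positive rate

p_report = (p_report_given_event * base_rate
            + p_report_given_no_event * (1 - base_rate))
p_event_given_report = p_report_given_event * base_rate / p_report
print(f"P(event | report) = {p_event_given_report:.2f}")
# -> roughly 0.08: despite the "90% reliable" source, the event remains unlikely.
```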


Proving a negative -- is it really impossible to do? EXAMPLE: Was Bush’s war on terror effective? SEE ALSO: counterfactuals, absence of evidence, streetlight effect

Pygmalion effect -- high expectations of a person improves her performance (and vice versa). 

Reference class problem -- determining the probability of a single event when this event can be part of many different reference categories. SEE ALSO: probability

Repetition bias -- willingness to believe what one has been told most often and/ or by the greatest number of sources. SEE ALSO: availability heuristics, groupthink

Risk -- Risk is ignorance that can be quantified. Uncertainty cannot. It is important to distinguish between ontological (or fundamental) and epistemological uncertainty. SEE ALSO: probability, uncertainty

Salience bias -- predisposes analysts towards data that are more prominent or emotionally striking. SEE ALSO: availability heuristics, recency bias

Science -- “advances one funeral at a time” (Max Planck).

Selection bias -- failure to randomise, so that the sample is not representative of the population. SEE ALSO: survivorship bias

Self-serving bias -- our failures are situational, our successes are our responsibility. SEE ALSO: fundamental attribution error, structuration, structure/agency, group/individual

Signal-to-noise ratio -- the difficulty of picking out the relevant indicators (signal) from the mass of irrelevant information (noise); Roberta Wohlstetter's classic explanation of the Pearl Harbor intelligence failure. SEE ALSO: probability

Situational logic -- intelligence analysts and foreign policy crisis managers are forced to work out what is going on and/or is going to happen. Theories and comparative history may be of some, but often limited, help. SEE ALSO: abduction

Spotlight effect -- overestimate how much others pay attention to us. But remember, even schizophrenics have enemies (quote attributed to Henry Kissinger). SEE ALSO: drunkard's search

Status quo bias -- prefer things to stay the same. SEE ALSO: endowment effect

Stereotyping -- belief that members of a group will have certain characteristics despite having no information about the particular individual. Mobilised especially aggressively during wartime. SEE ALSO: cognitive economy, in-group bias, outgroup homogeneity bias

Sunk cost fallacy (or escalation of commitment) – tendency to keep investing in order to recoup what has already been lost. CHALLENGE: being too influenced by what went before. EXAMPLE: Nixon/Kissinger believing they could do what LBJ could not. SEE ALSO: loss aversion, status quo bias

Survivorship bias -- failure to recognise that the sample is the product of a selection process. CHALLENGE: biases our conclusions. EXAMPLE: Wald’s WWII bomber problem. SEE ALSO: selection bias

Uncertainty -- unquantifiable ignorance; comes in epistemological and ontological form. SEE ALSO: probability, risk

Value trade-off -- avoid affectively painful trade-offs related to one’s beliefs. CHALLENGE: biases conclusions. EXAMPLE: Robert Jervis points out that most people who opposed the Iraq war as unnecessary also believed that the post-war situation would be a mess, while those supporting it believed that post-war stabilization would be easy. Both opponents and proponents avoided potentially painful value trade-offs, even though the two propositions are logically unrelated. SEE ALSO: cognitive economy, coherence, consistency

Von Restorff effect (or isolation effect) -- items that stand out from their context are better remembered; more generally, memories are representations, not exact copies. CHALLENGE: tendency to misremember

Wild card/ black swan -- Low probability/ high impact event. CHALLENGE: tendency to underestimate the frequency with which such events occur. SEE ALSO: probability, estimating averages
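
A rough simulation of why such events get underestimated, under the assumption (purely illustrative) that the heavy-tailed alternative to the thin-tailed normal world is a Student's t distribution with 3 degrees of freedom.

```python
# Compare how often "extreme" (beyond 4 standard deviations) outcomes occur under a
# thin-tailed normal model versus a heavy-tailed t(3) model. Sample size and the
# choice of t(3) are illustrative assumptions.
import random

random.seed(42)
n = 200_000

normal_extremes = sum(abs(random.gauss(0, 1)) > 4 for _ in range(n))

def student_t3() -> float:
    # A t(3) draw: a standard normal divided by the square root of an
    # independent chi-square(3) variate over its degrees of freedom.
    chi2_3 = sum(random.gauss(0, 1) ** 2 for _ in range(3))
    return random.gauss(0, 1) / (chi2_3 / 3) ** 0.5

t_extremes = sum(abs(student_t3()) > 4 for _ in range(n))
print(f"|x| > 4 -- normal: {normal_extremes} in {n}, heavy-tailed t(3): {t_extremes} in {n}")
# The heavy-tailed world produces orders of magnitude more "impossible" events.
```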