Saturday, September 26, 2020

Some thoughts on the law of unintended consequences (2020)

The story may be apocryphal. Se non è vero, è ben trovato (if it is not true, it is well invented). The British colonial government in India was concerned about venomous cobras. It therefore decided to offer locals a bounty for every cobra they caught. Initially, the number of cobras fell. But then business-minded locals began to breed cobras in order to keep collecting the reward. As soon as the British government found out what was going on, it discontinued the policy. With cobras no longer fetching a reward, the breeders released them, and the cobra problem ended up worse than before. The policy intervention inadvertently modified incentives and changed behaviour in a way that produced a perverse outcome. Unintended consequences need not be perverse, however. They can also be adverse, in the sense of causing costs unrelated to the original objective or intended outcome. And they can be beneficial (e.g. the invisible hand).

Robert Merton (1936) termed this phenomenon the “law of unintended consequences” (or, more precisely, wrote about “The Unanticipated Consequences of Purposive Social Action”). Other examples include a seatbelt mandate that leads to an increase in traffic-related deaths: drivers feel safer and therefore drive more recklessly (so-called risk compensation). The law of unintended consequences often pops up in cases of quantification-based management (Muller 2018). What is measured and what is meant to be achieved are rarely fully congruent, and the focus on a (quantifiable) target may lead people to modify their behaviour so as to meet the quantitative target rather than the intended one. This is also known as Goodhart’s law: when a measure becomes a target, it ceases to be a good measure.
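
As a purely illustrative sketch (not from the original text), the gap between a measured proxy and the intended outcome can be made concrete in a few lines of Python. The effort budget, the weights and the “gaming” channel are all hypothetical assumptions:

```python
# Hypothetical toy model of Goodhart's law: a proxy metric rewards both
# genuine work and "gaming", so maximising the metric need not advance
# the intended objective. All numbers are illustrative.

EFFORT_BUDGET = 10.0  # total effort an agent can allocate

def metric(genuine: float, gaming: float) -> float:
    # The measured score counts gamed output more generously than real work.
    return 1.0 * genuine + 1.5 * gaming

def true_value(genuine: float, gaming: float) -> float:
    # Only genuine work advances the intended objective.
    return 1.0 * genuine

allocations = {
    "target-oriented": (EFFORT_BUDGET, 0.0),  # all effort on genuine work
    "metric-oriented": (0.0, EFFORT_BUDGET),  # all effort on gaming the measure
}

for name, (genuine, gaming) in allocations.items():
    print(f"{name:15s}  metric = {metric(genuine, gaming):5.1f}  "
          f"true value = {true_value(genuine, gaming):5.1f}")
```

Once the metric itself is what gets optimised, the agent that scores highest on the measure contributes nothing to the underlying goal, which is the point of Goodhart's dictum.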

Merton attributed the failure to anticipate unintended consequences to a variety of causes: (1) ignorance, (2) error in analysis, (3) immediate interests overriding long-term interests, (4) basic values prohibiting action that would prevent adverse longer-term outcomes and (5) self-defeating prophecies (where the prediction of a problem prompts action that prevents the problem from occurring). More broadly, unintended consequences may be unavoidable in the context of complex systems, where outcomes cannot be controlled (“nature”). Unintended consequences may also be due to errors of analysis. A failure to evaluate counterfactuals, to take into account the targeted party’s perceptions and interests and/or to anticipate second- and third-round effects can lead to unintended, albeit in principle avoidable, consequences (“epistemic”). Relatedly, unintended consequences may also be due to psychological biases, including self-deception and plain stupidity (“psychology”). Again, correcting for such biases should in principle help avoid unintended consequences or at least make them more predictable.


Normal accident theory, for instance, postulates that in complex systems characterised by non-linearity and tight coupling, accidents and mistakes are unavoidable (Perrow 1999). This view has not gone unchallenged. That said, complex systems do exhibit behaviour that is difficult, even impossible, to predict with any degree of probability. While not a magic fix, a sensitivity to initial conditions and an acknowledgement that non-linearities exist may make decision-makers more aware of the potential pitfalls of certain courses of action. By limiting over-confidence, it may also help reduce psychological sources of mistakes. If the system is truly complex, however, even all of this will fail to help decision-makers avoid unintended consequences.
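
To make the abstract point about non-linearity concrete, here is a minimal sketch (not from the original text) using the logistic map, a standard toy example of a non-linear system in which two almost identical starting points quickly diverge:

```python
# Minimal sketch of sensitivity to initial conditions: the logistic map
# x_{t+1} = r * x_t * (1 - x_t) in its chaotic regime (r = 4.0).
# Two trajectories that differ only in the sixth decimal place of the
# starting value soon bear no resemblance to each other.

def logistic_trajectory(x0: float, r: float = 4.0, steps: int = 30) -> list:
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)  # a "measurement error" of one millionth

for t in range(0, 31, 5):
    print(f"t={t:2d}  a={a[t]:.6f}  b={b[t]:.6f}  gap={abs(a[t] - b[t]):.6f}")
```

The gap roughly doubles each step, so even a tiny error in knowing the starting conditions makes the medium-term outcome effectively unpredictable, which is the sense in which complex systems defeat forecasting.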

World War I may be attributed to complexity, non-linearities and tight coupling. The action of a single individual (a Serbian nationalist assassinating the heir to the Austro-Hungarian throne and his wife) triggered a string of tightly coupled events that ultimately led to the death of some 20 million people. Relatedly, the so-called “security dilemma” may be thought of in systemic, though perhaps not complexity, terms. Here the consequences of action are unintended, but relatively easy to anticipate: if country A increases its military power, opposing country B will do so too, in turn leading country A to increase its military power further.
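
The action-reaction spiral just described can be sketched with a simple Richardson-style arms race model; the reaction and fatigue coefficients below are purely illustrative assumptions, not estimates:

```python
# Sketch of a Richardson-style action-reaction dynamic (the security dilemma).
# Each country increases its arms level in proportion to the other side's
# arms, damped by the cost ("fatigue") of maintaining its own arsenal.
# With reaction stronger than fatigue, the spiral escalates.

def arms_race(steps: int = 10, reaction: float = 0.6, fatigue: float = 0.2,
              a0: float = 1.0, b0: float = 1.0) -> list:
    """Return the sequence of (A, B) arms levels over time."""
    a, b = a0, b0
    history = [(a, b)]
    for _ in range(steps):
        # Both sides update simultaneously on the basis of last period's levels.
        a, b = (a + reaction * b - fatigue * a,
                b + reaction * a - fatigue * b)
        history.append((a, b))
    return history

for t, (a, b) in enumerate(arms_race()):
    print(f"t={t:2d}  A={a:7.2f}  B={b:7.2f}")
```

Neither side intends an arms race, yet as long as each side's reaction to the other outweighs the restraining cost of its own arsenal, the levels keep ratcheting upwards.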

“Blowback” is an example of maybe ignorance, maybe immediate interests overriding longer-term considerations, causing unintended and perhaps unanticipated consequences. The US decision to support fundamentalist militants fighting the Soviet occupation of Afghanistan reflected a desire to weaken the USSR. However, when the occupation ended, the militants set their sights on destabilising US-allied governments in the region (Johnson). Maybe policymakers did not anticipate such a turn of events. Maybe they favoured short-term expediency over longer-term and admittedly somewhat uncertain adverse consequences.

The increase in China-US tensions seems to have led to greater animosity and a greater sense of vulnerability on both sides (Jaeger 2020). This much, while perhaps unintended, could have been anticipated. But Sino-US tensions have also arguably spilled over into Sino-Indian relations and led to a significant increase in bilateral frictions. While perhaps inevitable, this is leading to a (tighter) coupling between Sino-US and Sino-Indian relations, thereby giving rise to the geopolitical construct of the Indo-Pacific. One can only speculate why Beijing is willing to adopt a more assertive stance vis-à-vis India (if this is indeed what is happening), given that an unintended but easy-to-anticipate consequence is a closer US-Indian strategic partnership.

US support for China’s international economic integration led to the emergence of China as a strategic competitor. While China’s economic development was perhaps anticipated, its speed and impact almost certainly were not. China’s emergence as a geopolitical rival was neither intended nor anticipated. The expectation (or hope?) that China might become a responsible stakeholder may be interpreted as stupidity, as an error of analysis or as an example of immediate interests overriding longer-term strategic considerations. The latter is perhaps more applicable to Nixon’s opening to China than to the post-Cold War US policy of encouraging China’s integration into the global economy.

The Truman administration considered launching a pre-emptive nuclear war against the USSR. An unintended but foreseeable consequence of not launching such an attack was the emergence of the USSR as a nuclear power. This would be a clear-cut example of basic values prohibiting a policy and thereby leading to an unintended but anticipated and, from the US point of view, adverse outcome – at least in material and power terms.

Or take the Munich Agreement. Prime Minister Chamberlain’s decision to cede the Sudeten areas to Germany in 1938 averted a military conflict. Only detailed historical research can reveal why Britain decided not to stand its ground. The decision is often chalked up to stupidity. After all, it improved Germany’s geo-strategic position tremendously. Then again, the decision bought Britain time to ramp up its armament production and – which would prove absolutely vital – strengthen its air force and air defence system (Kennedy 1992). Maybe the Munich Agreement is an example of an unintended consequence: Chamberlain believed it would bring “peace for our time”, yet it strengthened Germany and war came anyway. But it may also have been an expedient yet deliberate decision to avoid war in the short term while fully recognising that it would strengthen Germany considerably. In that case, the decision brought about consequences that were unintended but (largely) anticipated.

Unintended and unanticipated consequences can often be attributed to cognitive biases. The US neither anticipated nor intended its limited military intervention in Vietnam to escalate into a full-scale war, let alone one it would lose. The US had just defeated Nazi Germany and Imperial Japan; it was not a war-weary and economically weak France. The unintended consequence of losing the war was simply not anticipated. The US may have felt it had no choice but to intervene, given its belief in the domino theory. This, however, only explains why the US got involved, not why it failed to anticipate the unintended consequence of defeat. The possibility of defeat was never seriously evaluated.

Pearl Harbor was the unintended and unanticipated consequence of a more hawkish US policy towards Japan. Tightening economic sanctions and moving the Pacific Fleet to Pearl Harbor were not intended to provoke a Japanese attack on the US naval base (though Washington may have sought to provoke Japan into attacking US forces in the Philippines). Here the unintended consequences were due neither to short-term expediency nor to self-deception, but may instead be attributed to cognitive biases and/or an inability to put oneself in the other party’s shoes. Ultimately, cognitive and group biases led to the US failure to anticipate that Japan, instead of being deterred by hawkish US policies, would respond by launching a pre-emptive attack against the US.

None of this is meant to suggest that scholars and practitioners are invariably incapable of dealing with the “law of unintended consequences”. Think, for example, of the reluctance of Northern European EU member states to agree to permanent financial transfers during the euro crisis a decade ago. The Northerners did – correctly, one may argue – anticipate moral hazard and its potentially negative longer-term consequences for euro area stability. Or maybe the domestic politics of some Northern European states simply did not allow them to agree to permanent financial transfers. Again, this is for historians to figure out.

It is extremely important to think about “purposeful social action” (aka policies and policy decisions) in systemic terms and in terms of the law of unintended consequences. This is something that comes more naturally to ecologists, biologists and engineers, and maybe to certain stripes of sociologists. Despite concepts like systems and bureaucracies featuring in prominent International Relations theories, the concept of unintended consequences is somewhat under-appreciated, under-studied and, importantly, under-taught. Students of international politics and foreign policy practitioners would do well to familiarise themselves with complexity theory, unintended consequences and cognitive biases. This should allow scholars to provide better explanations and policymakers to make better decisions – or at least decisions whose consequences are less unintended and/or better anticipated.