
3 September 2022

Do Not Trust Your Gut: How to Improve Strategists’ Decision Making

James M. Davitch

INTRODUCTION

Military strategists often make decisions based on instinct and emotion rather than careful deliberation. However, as Daniel Kahneman’s book Thinking, Fast and Slow explains, most humans do as well because of mental limitations called cognitive biases.[1] Kahneman argues the mind’s tendency to make snap decisions rather than to proceed with caution can inhibit effective judgment. Cognitive biases may cause strategists to overlook salient yet inconvenient information and waste time pursuing solutions to the wrong problems. Unfortunately, the list of known cognitive biases is extensive and growing.[2] Faced with an overwhelming number of challenges to decision making, a strategist might question whether some biases are harmful specifically in a military planning environment and if there are any techniques to address them. This essay argues that, yes, some specific biases may directly affect military planning. Further, it argues that while there are historical examples of their negative influence, they can be mitigated through certain techniques.[3]

Each section below focuses on one of four cognitive limitations. The essay will describe each bias, relate it to an example in military history, and conclude with steps to mitigate it. This essay illustrates through historical analogies why confirmation bias, fundamental attribution error, anchoring bias, and representative bias are detrimental to a military strategist’s decision-making process.[4] These biases can cause strategists to privilege facts that confirm previously held beliefs, automatically attribute nefarious motivations to others’ actions, fixate on initial information, and draw incorrect associations between dissimilar events. Examples from the Cuban Missile Crisis, the Korean War, Operation Iraqi Freedom, and World War I provide historical context for these cognitive limitations. The mitigation steps include engaging in active open-mindedness, empathy, consideration of the opposite, and the so-called “what-if” technique. The common theme throughout the mitigation steps in this essay is that strategists would benefit from exercising patience and intellectual humility in their deliberations.

System 1 versus System 2

The recommendations described in this essay draw heavily on behavioral psychology research, especially Kahneman’s description of how the brain makes decisions. Kahneman presents a struggle in the mind between two modes of thinking that he calls System 1 and System 2. During System 1 thinking, the mind operates “automatically and quickly, with little or no effort.”[5] Most of the time this process works well enough, allowing one to proceed without a second thought. The cognitive process often fails, however, when System 1 suggests intuitive answers to complicated questions. This is because in System 1 thinking, one’s mind is prone to take shortcuts that sacrifice mental rigor for expediency. Most of the time these shortcuts, which behavioral psychologists call biases and heuristics, are innocuous and involve low stakes decision making where instinctive judgment suffices. However, when evaluating possibilities during strategic planning for military operations, one’s instinctive judgment, laden with unconscious biases, may be detrimental.

Kahneman contrasts System 1 with its mental partner, System 2. System 2 “allocates attention to the effortful mental activities that demand it.”[6] System 2 thinking requires focus, and because it is often easier to make a snap decision than it is to concentrate, the mind sacrifices System 2 in favor of System 1 most of the time.[7] For the strategist, that means most decisions resemble knee-jerk reflexes rather than carefully considered conclusions. In the parlance of behavioral psychology, humans often privilege their intuitive judgment in conditions of uncertainty. Put differently, a strategist may tend to trust their gut (i.e., System 1), even in situations where they probably should not. Junior strategists are liable to this weakness, but, perhaps counterintuitively, seniority and experience increase the tendency. A large body of literature shows that the combination of authority and overconfidence is even more toxic for decision making because senior strategists may possess the influence junior strategists lack.[8]

As an additional complication, there is no way to switch off System 1. It constantly, though unconsciously, suggests answers to deal with the many decisions one makes every day. Most of the time, for routine decision making, this is perfectly fine. However, when complexity increases, one is more prone to fall into cognitive traps that can lead to poor outcomes. Therefore, part of mitigating these problems involves slowing down one’s decision making to engage the thoughtfulness inherent in System 2 and resisting the easy answer offered by System 1. Unfortunately for the military strategist, doing so is much easier said than done, especially with respect to confirmation bias.

CONFIRMATION BIAS

Human minds crave order, and they try to minimize the discomfort of uncertainty by suggesting ways to make sense of chaos and disorder. One of the ways they do this is by encouraging us to accept information that confirms preexisting views or ideas. One often sees what one wants to see, to the exclusion of other relevant factors such as a valid but contradictory viewpoint.[9] This is called confirmation bias, and it becomes problematic when it leads strategists to expend less mental effort on a problem or question than it warrants (i.e., when the strategist impulsively accepts the System 1 answer). The deleterious effects of confirmation bias may alter a strategist’s perception of reality, leading to neglect of the fundamental problem one must address.

Confirmation Bias in the Cuban Missile Crisis

Unfortunately, examples of confirmation bias abound in military history. One instance occurred prior to the 1962 Cuban Missile Crisis, when the United States failed to respond to the introduction of Soviet military equipment in Cuba.[10] On September 19, 1962, the Central Intelligence Agency stated it was not likely the Soviet Union would put nuclear missiles on the island.[11] In October, less than a month later, analysis of pictures taken from U-2 reconnaissance aircraft showed the Soviet Union had done exactly that.[12] However, the October surveillance photos were not the first piece of evidence that the Soviet Union was militarizing Cuba. Some parts of the U.S. intelligence community had observed dozens of shipments of conventional weapons and military personnel preceding the delivery of nuclear weapons and predating the October photo analysis.[13] Confirmation bias contributed to analysts’ neglect of the deployment of conventional weapons and to the surprise of the U.S. national security enterprise at the deployment of nuclear-armed ballistic missiles.[14]

The fear inspired by the Cuban Missile Crisis (Bettmann/Corbis)

Throughout 1962, the Soviet government repeatedly denied any desire to militarize Cuba. The Soviet foreign minister privately assured President Kennedy that Soviet Premier Khrushchev would not do anything to complicate American domestic matters before the congressional elections in November. Secretary of State Dean Rusk believed what the Soviet Union was saying, publicly and in private.[15]

The lack of overhead imagery complicated the problem.[16] On August 29, 1962, a U-2 overflew Cuba, and subsequent imagery analysis revealed defensive weapons (i.e., surface-to-air missiles) and probably Soviet personnel, but no offensive missiles on the ground.[17] It would take six more weeks for the next U-2 to fly over Cuba. When it did, on October 14, it was too late.

The August U-2 photos showing only defensive weapons, combined with Moscow’s repeated denials, fed a confirmation bias toward the belief that no nuclear buildup was forthcoming. In his memoirs, presidential adviser Clark Clifford wrote that the state of mind within the intelligence community rejected the possibility of offensive missiles in Cuba.[18] Though some argued otherwise, including the director of the CIA, by and large the U.S. intelligence community believed what it wanted to believe and privileged evidence that supported that belief.

How to Mitigate Confirmation Bias

Since people often seek and readily accept confirming evidence for beliefs they already hold, the trick to dealing with confirmation bias is to actively seek out disconfirming evidence. Humans want to believe that what they think is true actually is true, so the strategist must convince his or her mind to un-believe it.

The first step, as with most cognitive debiasing strategies, is simply to slow down. Many times, military professionals are in a rush to judgment, mainly to fix a problem and move on to the next. However, to prevent confirmation bias, a good technique is to consciously delay one’s decision and ask what it would take for the opposite viewpoint to be true. The next step is to exercise humility and acknowledge there may be other points of view worth considering before reaching a final verdict. This method is part of a larger concept called active open-mindedness.

Actively open-minded thinking refers to the consideration of all evidence prior to a decision.[19] The main problem confirmation bias presents is that, when evaluating evidence, one may consider only the evidence one wants to believe is true. Therefore, strategists should flip the evidence on its head and try to disprove it, asking what it would take to show that what they believe to be true is actually wrong. The same line of questioning is useful when evaluating someone else’s claim or assessment and can minimize the effects of overconfidence: “What evidence would you have to see to make you change your mind?”[20] This keeps strategists open to alternative possibilities, which is important when dealing with other biases like the fundamental attribution error.
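To make that habit of flipping the evidence more concrete, the brief Python sketch below is one hypothetical way to audit a body of evidence for how diagnostic it really is. The data structure and example items are illustrative assumptions, not drawn from the sources cited above; the point it encodes is that evidence consistent with both a belief and its opposite cannot confirm either.

```python
# A minimal, hypothetical sketch of auditing evidence for diagnosticity.
from dataclasses import dataclass


@dataclass
class Evidence:
    description: str
    fits_current_belief: bool  # consistent with what the analyst already believes
    fits_opposite: bool        # consistent with the opposite conclusion


def audit(items: list[Evidence]) -> None:
    """Label each item by whether it could actually separate a belief from its opposite."""
    for item in items:
        if item.fits_current_belief and item.fits_opposite:
            verdict = "not diagnostic: consistent with both views, so it proves neither"
        elif item.fits_current_belief:
            verdict = "supports the current belief"
        elif item.fits_opposite:
            verdict = "disconfirming: seeing this should force a change of mind"
        else:
            verdict = "irrelevant to either view"
        print(f"- {item.description}: {verdict}")


# Hypothetical items loosely modeled on the 1962 example above, where the
# current belief is "no nuclear buildup is forthcoming."
audit([
    Evidence("Public Soviet denials of offensive intent", True, True),
    Evidence("Imagery showing only surface-to-air missiles", True, True),
    Evidence("Imagery of ballistic-missile sites under construction", False, True),
])
```

In this toy audit only the last item is capable of changing a mind, and it is precisely the kind of evidence confirmation bias discourages one from seeking.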

FUNDAMENTAL ATTRIBUTION ERROR

Fundamental attribution error suffers from a confusing name but is nevertheless a common cognitive pitfall. It refers to an individual’s tendency to attribute another’s actions to something fundamental about that person, like their background, while attributing one’s own behavior to factors beyond one’s control.[21] Fundamental attribution error is partially about assigning blame, but it is also the tendency to ascribe to others what one would be less likely to attribute to oneself. Writing about this bias, the CIA noted that it occurs when the behavior of others is attributed to some fixed nature, while one’s own behavior is explained as a product of the situation in which one finds oneself.[22] President George W. Bush once said, “Too often we judge other groups by their worst examples, while judging ourselves by our best intentions.”[23] This hints at the hypocrisy inherent in the fundamental attribution error. One may observe the error in action when interpreting another’s behavior by heavily weighting personal characteristics, such as where someone is from, their social class, or their gender, and lightly weighting situational factors.[24] Unacknowledged fundamental attribution errors may warp a strategist’s understanding of the situation they face and delay attention to the correct problem.

Fundamental Attribution Error in the Korean War

The way that some American political leaders evaluated U.S. and Chinese actions during the Korean War provides an example of how the fundamental attribution error can cloud a strategist’s judgment. In October 1950, U.N. forces had recovered from North Korea’s summer surprise attack. As they began their advance north, the U.S.-led U.N. coalition had a tricky geopolitical needle to thread. How could they defeat North Korea without drawing China into the conflict?

U.S. President Harry Truman at his desk in the Oval Office with Secretary of State Dean Acheson, 1950 (National Archives)

However, not all members of President Truman’s cabinet felt there was a risk of alarming Beijing. Secretary of State Dean Acheson felt there was little danger of provoking China because, from his perspective, America’s intentions were benign. Said Acheson, “No possible shred of evidence could have existed in the minds of the Chinese Communists about the non-threatening intentions of the forces of the United Nations.”[25] While the U.S. felt its decisions were nonthreatening, that is not how they were interpreted in Beijing.[26]

Therefore, it was with astonishment that some in the U.S. government received news in late October 1950 that Chinese forces had covertly crossed the Yalu River and attacked U.N. forces. Senior leaders in Washington were “incapable of interpreting the Chinese intervention as a reaction to a threat. Instead, the Americans interpreted the Chinese reaction as an expression of fundamental hostility toward the United States.”[27] John Lewis Gaddis notes the Chinese leadership likely viewed advancing Allied forces as a threat to their regime. Fundamental attribution error in this case manifested itself in the U.S. appraisal of the Chinese response as hostile by nature rather than born of the situation (i.e., the U.N. forces’ advance toward the Chinese border). The initial U.S. judgment of China as a congenitally hostile and belligerent actor, and its appraisal of its own actions as benign and righteous, may have colored subsequent interactions and prolonged the stalemate until the 1953 armistice.[28]

How to Mitigate Fundamental Attribution Error

Fighting fundamental attribution error is about exercising empathy and subordinating one’s arrogance enough to take the other side’s point of view into consideration. Thus, it is beneficial for the strategist to operate in System 2, because System 1 is rarely empathetic. As with confirmation bias, one can attempt to overcome fundamental attribution error by considering others’ viewpoints as well as alternative explanations for observed evidence. The human mind often does not, because it is easier to rush to judgment.

An important consequence of fundamental attribution error is how it affects one’s ability to evaluate the adversary. As described in joint doctrine, understanding the adversary is a crucial step in planning military operations.[29] Doing this analytical task well allows the military strategist to intelligently predict what the adversary is going to do next. Improperly evaluating the adversary’s point of view can lead to an inaccurate estimate of possible enemy courses of action.[30] Additionally, as Kahneman writes, “If people are often poorly equipped to explain the behavior of their adversaries, they are also bad at understanding how they appear to others.”[31] This echoes the contention of Robert Jervis that “[i]t is rare for actors to realize that what matters in sending a message is not how you would understand it, but how others will understand it.”[32]

With fundamental attribution error, the Golden Rule is a good guide—treat others how you would like to be treated. A software researcher in Silicon Valley put it this way:

Me ten years ago, on seeing a poorly designed interface: “Wow, what idiot designed this?”

Me, today: “What constraints were the team coping with that made this design seem like the best possible solution?” Empathy trumps fundamental attribution errors.[33]

ANCHORING BIAS

Anchoring bias, or the anchoring effect, refers to the human tendency to attribute outsized influence to the first piece of information one encounters. That information then has the propensity to influence subsequent estimates and discussions.[34] Military budgeting, for example, can suffer from anchoring bias because discussions about future fiscal requirements often begin from the previous year’s figure (the anchor point) rather than from the actual need or requirement. If the budget is known to be roughly $700 million, that figure itself has power in anchoring conversations about future budgets. Discussions then fixate on the $700 million figure and on how much to add to or subtract from it, rather than on warfighting requirements. Sometimes this is a logical method for thinking about the future, but not when the anchor or reference point stifles creative thinking about alternative solutions.
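As a purely numerical illustration of that dynamic, the short sketch below models the anchoring-and-adjustment pattern with made-up figures: a $700 million anchor, a $900 million assessed requirement, and an assumed adjustment of only ten percent of the remaining gap each budget cycle. None of these numbers comes from an actual budget.

```python
# Hypothetical anchoring-and-adjustment sketch; all figures are invented
# for illustration and do not reflect any real budget.
ANCHOR = 700.0        # last year's figure, in $ millions (the anchor point)
REQUIREMENT = 900.0   # assessed warfighting requirement, in $ millions
ADJUSTMENT = 0.10     # assumed fraction of the remaining gap closed each cycle

estimate = ANCHOR
for year in range(1, 6):
    estimate += ADJUSTMENT * (REQUIREMENT - estimate)
    print(f"Year {year}: anchored estimate ${estimate:,.0f}M "
          f"versus requirement ${REQUIREMENT:,.0f}M")
```

Even after five cycles the estimate remains closer to the anchor than to the assessed requirement, which is the sense in which the $700 million figure, rather than the warfighting need, drives the conversation.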

Psychologists have found that specialization in a particular area may make this cognitive bias more pronounced. In research on experienced professionals, Northcraft and Neale showed that “expertise in the subject matter does not seem to mitigate anchoring to any significant extent.”[35] Thus, military strategists may be especially susceptible to the anchoring effect precisely because of their deep subject matter expertise.

Anchoring Bias in Operation Iraqi Freedom

Major Blair Williams wrote in a 2010 edition of Military Review that U.S. military planners were slow to adjust to changing realities after the onset of fighting in Iraq during the mid-2000s. Despite warnings prior to Operation Iraqi Freedom that the planned number of ground forces in Iraq was too low, the average number of U.S. troops from 2003 to mid-2007 remained around 138,000.[36] Historians such as Andrew Bacevich believe Secretary of Defense Rumsfeld was attempting to execute a military design concept called the Revolution in Military Affairs whereby the U.S. would fight conflicts and prevail by virtue of technological superiority rather than mass. Troop numbers did not increase until President Bush surged forces in 2007. Despite Iraq being on the verge of a civil war throughout the middle part of the 2000s, decision makers were tied to their initial estimate of the necessary number of ground troops. Thus, military and civilian defense professionals are no exception to the reality that people who are experts in a field can fall victim to the effects of the anchoring bias. Major Williams summarized the situation by stating, “The anchoring phenomenon kept the value closer to the initial value than it should have been.”[37]

U.S. troops pull down a statue of Saddam Hussein in central Baghdad (Goran Tomasevic/Reuters)

The insidiousness of these biases stems from the fact that they are not mutually exclusive; their effects are additive. Once an organization is anchored to a particular figure, it may become more susceptible to confirmation bias, accepting more readily the information that conforms to previously held beliefs. For example, Gordon and Trainor noted the White House believed the manpower requirement for Iraq’s post-conflict stability operations would be minimal because Iraqis would “do the work of Phase IV themselves.”[38] Therefore, intelligence analysis and assessments supporting the light-footprint planning assumption confirmed the White House’s existing inclination to avoid nation building.[39] The low troop figure provided a permission structure for accepting evidence that confirmed the existing bias and anchored subsequent planning assumptions. Likewise, Jervis described several instances of this interplay between confirmation bias and White House decision making during the prelude to the 2003 Iraq War.[40] Since anchoring and confirmation bias can be mutually reinforcing, it is imperative that strategists, especially those faced with planning amidst the fog and friction of war, learn techniques to mitigate their effects.

How to Mitigate Anchoring Bias

System 1 tries to create a world that justifies why the anchor number is correct. It tries to make one’s perception true so the brain can deal with the next issue. The first step to mitigate the brain’s inclination to anchor on the first figure it encounters is, again, to slow down. After the strategist has successfully fought the urge to accept the suggestions of System 1, he or she can begin to consider the opposite.

The consider-the-opposite strategy works exactly as it sounds: individuals are induced to contemplate the possible outcomes at odds with their prevailing belief. The means can vary, but one study found that, given explicit instructions, test subjects were able to retrain their thought processes to avoid rash judgments.[41] In a strategy team, it may be effective for the team leader to set the expectation that subordinate team members will show the results of their consider-the-opposite methodology, as sketched below. This technique is designed to fight the brain’s desire to make something seem true by forcing one to consider alternatives or alternative explanations. Multiple studies have shown that test subjects who consciously consider the opposite are less susceptible to anchoring bias because they take the time to weigh the possibility of the opposite outcome.[42] A consider-the-opposite strategy may “disrupt the fast heuristic processing of System 1 and activate System 2, which requires more cognitive efforts and information elaboration.”[43] The consider-the-opposite strategy helps render the initial anchoring figure irrelevant, because it often is.
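One hypothetical way a team leader might build that expectation into a planning cell’s routine is sketched below. The function name, threshold, and example inputs are illustrative assumptions rather than any documented procedure; the point is simply that an estimate is not accepted for debate until its author has recorded concrete reasons the opposite could be true.

```python
# Illustrative consider-the-opposite gate; names and thresholds are assumptions.
def review_estimate(claim: str,
                    reasons_opposite_could_be_true: list[str],
                    minimum_reasons: int = 2) -> bool:
    """Accept a claim for discussion only if the opposite was genuinely considered."""
    if len(reasons_opposite_could_be_true) < minimum_reasons:
        print(f"Sent back: '{claim}' (consider-the-opposite step incomplete)")
        return False
    print(f"Accepted for debate: '{claim}'")
    for reason in reasons_opposite_could_be_true:
        print("  counterpoint considered:", reason)
    return True


# Hypothetical usage, echoing the budgeting example above.
review_estimate(
    "Next year's budget should be last year's $700M plus a small increase",
    ["The assessed requirement may exceed $700M by a wide margin",
     "The $700M anchor reflects last year's missions, not next year's"],
)
```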

REPRESENTATIVE BIAS

Representative bias might more aptly be called the déjà vu bias, and it is a close cousin of another heuristic Kahneman calls availability bias. A key feature of both is the tendency to associate a new event with previous occurrences that seem analogous. This bias, just like the others discussed in this essay, is the brain’s attempt to quickly categorize new information: System 1 rapidly searches for past instances similar to the present situation. Associating uncertainty with a familiar situation can be comforting because familiarity suggests that similar tools may be used to address it. Problems arise when the circumstances at hand are unlike previous situations, despite System 1’s suggestion to treat them as the same.[44]

Representative Bias in World War I

While World War I is replete with examples of poor judgment, the case of the Austro-Hungarian Empire’s mobilization for war is especially notable, primarily because of its military chief of staff’s failure to adequately predict and prepare for war with Russia. Instead, he relied on his recent memory of past experiences as a guide for the future. Austro-Hungarian field marshal and chief of the general staff Count Franz Conrad von Hötzendorf demonstrated perhaps the most spectacular failure of leadership of the entire war. According to Mark Grotelueschen, Conrad failed for two reasons: he failed to plan for the right war and, once engaged, he failed to fight the war effectively.[45] This example focuses only on his pre-war mistakes.

Franz Graf Conrad von Hötzendorf, Austro-Hungarian general, Chief of the General Staff of the Austro-Hungarian Army (Hermann Torggler/Wikimedia)

Prior to World War I, Austria-Hungary faced challenges from two countries on its borders: Russia and Serbia. The former represented a much larger and more lethal threat to the Habsburg Empire than the latter. Conrad was the chief strategist behind the war plans devised to deal with both countries. However, he never tasked his subordinates to create a plan to fight both countries at the same time, which is exactly what Austria-Hungary required in 1914. While separate war plans existed at the start of the war, no combined plan did. Nevertheless, Conrad repeatedly pushed civilian politicians in the Austro-Hungarian Empire for war, specifically against Serbia, despite the likelihood that Russia would intervene on Serbia’s behalf. His bellicosity stemmed partially from outdated mobilization estimates; he believed Russia would take too long to mobilize to interfere with a campaign against Serbia. Conrad failed primarily because he was unable to separate his perception of the situation (that Russia would be slow to mobilize) from the reality that Russia would enter the war quickly and in force. His susceptibility to representative bias, judging the coming war by seemingly analogous past experience rather than by the probabilities of the situation at hand, contributed to the Austro-Hungarian Empire’s eventual demise.

How to Mitigate Representative Bias

In 2009, the Central Intelligence Agency produced a tradecraft primer designed to improve intelligence analysis for the agency and similar intelligence organizations. This handbook provides a host of structured analytic techniques that, if used properly, may improve one’s ability to “challenge judgments, identify mental mindsets, stimulate creativity, and manage uncertainty” in order to “wrestle with difficult questions.”[46] Some have argued that structured analytic techniques have not been rigorously tested, and others have suggested ways of doing so.[47,48] Nevertheless, several of the methods have merit, providing individuals with approaches for engaging with uncertainty more deliberately.

One technique called what-if analysis may be very helpful for strategists dealing with representative bias.[49] The what-if technique suggests that one should start with the end state and then attempt to provide the logical pathway that led to that conclusion. Representative bias may be prevalent when one uses past experiences as a guide for the present. By thinking backwards, what-if analysis allows one to avoid letting the past influence the present and instead accept a future condition as a given. Then one can apply analytical thinking to identify why hypothetical events in an imagined future transpired the way they did.

Had Conrad approached the problem with more humility in the face of uncertainty and applied what-if analysis, he might have considered both the best- and worst-case outcomes. For Conrad and the Austro-Hungarians, the best case was the future Conrad wanted: a war with either Russia or Serbia, but not both. He might then have worked backward through the events that would lead to the worst-case scenario, the one that ultimately occurred. Had he engaged in a more critical, open-minded approach to strategic decision making, he might have recognized that the worst case was a much more likely eventuality and that his personal judgment, clouded by representative bias, was in error.
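The sketch below shows one hypothetical way a planning cell might capture such a what-if exercise as structured data, using the Conrad example. The class and field names are illustrative assumptions, not taken from the CIA primer; the point is that the exercise fixes an assumed end state and reasons backward from it, rather than forward from past experience.

```python
# Illustrative what-if exercise structure; class and field names are assumptions.
from dataclasses import dataclass, field


@dataclass
class WhatIfExercise:
    assumed_end_state: str
    backward_pathway: list[str] = field(default_factory=list)  # latest event first


exercise = WhatIfExercise(
    assumed_end_state="Austria-Hungary is at war with Serbia and Russia simultaneously",
    backward_pathway=[
        "Russia enters the war quickly and in force",
        "Russia mobilizes far faster than the outdated pre-war estimates assumed",
        "Russia intervenes on Serbia's behalf",
        "Austria-Hungary opens hostilities against Serbia",
    ],
)

print("Assumed end state:", exercise.assumed_end_state)
for step in exercise.backward_pathway:
    print("  <-", step)
```

Reading the pathway backward from the assumed end state makes the worst case concrete enough to weigh against the hoped-for best case.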

CONCLUSION

Human brains operate mostly on autopilot, and the cognitive shortcuts they rely on to get through the day often inhibit the ability to make good decisions. In low-stakes situations, like deciding what to eat or what to wear, the negative impact of cognitive biases is negligible. But in the military, where the stakes are higher and effective decision making can mean the difference between mission success and failure, the consequences of biases corrupting that process can be calamitous.

This essay described how confirmation bias, fundamental attribution error, anchoring bias, and representative bias are detrimental to a military strategist’s cognitive process. It argued that these biases can cause strategists to privilege facts that confirm their beliefs, automatically attribute nefarious motivations to others’ actions, fixate on initial information, and draw incorrect associations between dissimilar events. The essay used examples from the Cuban Missile Crisis, the Korean War, Operation Iraqi Freedom, and World War I to provide historical context. Through these vignettes and many others, one may appreciate how some of the most consequential military events hinged on the influence of unconscious biases at critical junctures.

Effective decision making in complex environments is incredibly challenging even without the limitations that humans bring via System 1 thinking. To avoid undermining the decision-making process and compounding the difficulty, it may be useful for strategists to employ the critical thinking techniques described above. These methods—including engaging in active open-mindedness, empathy, consideration of the opposite, and the what-if technique—are derived from decades of behavioral psychology research. They may help mitigate cognitive bias and allow one’s brain to transition from System 1 to System 2 when the situation requires.

Engaging in System 2 thinking mainly requires two things: the ability to slow one’s rush to judgment and the humility to acknowledge that one’s instinctive answer may not be correct. Thus, critical thought benefits from both patience and intellectual humility. Military strategists should thoughtfully consider their cognitive limitations as well as the range of possible outcomes in pursuit of political goals and in support of civilian leaders. Strategists who devote attention to thinking about thinking, and to learning from the mistakes of the past, may improve their ability to plan for the future.
