Policy Briefs from International Security

Crossing the Rubicon: The Perils of Committing to a Decision

Bottom Lines

  • The Commitment Trap. When policymakers shift from deciding among several options to carrying out a chosen option, they "cross the Rubicon," triggering an "implemental mind-set" and a set of powerful psychological biases, including overconfidence.
  • War and Peace. In international politics, implemental mind-sets increase the attraction of war and promote risky military plans.
  • Surviving the Rubicon. Psychological biases are subconscious and difficult to counter, so policymakers must try to anticipate the effects of implemental mind-sets ahead of time and alter the way they make decisions to avoid costly mistakes.

Crossing the Rubicon

When Julius Caesar crossed the Rubicon River in 49 B.C., he broke an ancient law forbidding any general to enter Italy with an army—thus making war with Rome inevitable. Ever since, "crossing the Rubicon" has come to symbolize a point of no return, when the time for deliberation is over and action is at hand. When decisionmakers cross the Rubicon, or stop debating which of several options to pursue and start implementing a chosen policy, the psychological effects can shape the political world in powerful ways—including the outbreak of war.

We found a puzzling phenomenon in many wars throughout history, from World War I in 1914 to the Iraq War in 2003: as conflict drew near, people's confidence in victory increased. This is surprising because the mere fact that war is closer in time should not affect the probability of winning. If anything, people on the brink of conflict should be more wary about the outcome, not less, as the risks of war loom large. But when leaders move closer to the abyss, they seem to become more eager to take the leap.

The "Rubicon model" of decisionmaking in psychology can explain this puzzle. All policymakers will have experienced the shift from weighing which of several options to pursue on a given issue to putting the selected option into practice. They may be less aware, however, that this shift causes dramatic changes in the way the human brain receives and processes information.

Before making a decision, people maintain a "deliberative" mind-set, in which they weigh the costs, benefits, and risks of different options in a reasonably fair manner. Following a decision, however—after crossing the Rubicon—they switch into an "implemental" mind-set that triggers a set of powerful psychological biases.

  • People adopt a kind of tunnel vision, focusing intensely on the task at hand and ignoring incoming information—especially if it questions the wisdom of the course of action they are pursuing.
  • They process information in a selective and biased manner, reinforcing the chosen option.
  • They create self-serving illusions about their effectiveness as decisionmakers, believing that they are more skillful than they really are.
  • They become vulnerable to what psychologists call the "illusion of control," the tendency to believe that events can be controlled, even if they are inherently uncontrollable.
  • They become overconfident about the outcome of events, generating unwarranted expectations of success.

In combination, these biases undermine rational decisionmaking and can have potentially dramatic consequences for international politics and beyond.

Key Findings

In everyday life, implemental mind-sets can be useful because they help people focus on the task at hand and avoid being distracted by alternatives or doubts. In the high-stakes realm of international politics, however, implemental mind-sets can be extremely dangerous, making bad policies seem good and turning good policies bad.

A leader's belief that conflict is looming can trigger an implemental mind-set, which inflates the leader's confidence in victory and encourages provocative or escalatory policies that can push states into war. And once decisionmakers have settled on a military solution, they will be reluctant to reassess this choice and step back from the brink. Even if the facts on the ground change and suggest that a hawkish policy should be abandoned, leaders can become trapped by a mind-set that favors war. In addition, implemental mind-sets encourage risky military planning. Newly optimistic about the outcome, leaders will become even more committed to the selected war plan, even if evidence mounts that it is likely to fail.

In 1914, for example, when policymakers believed that war was near, many became more confident that their country would win. As a result, they adopted provocative and reckless policies. Austria-Hungary, France, Germany, and Russia all pursued ambitious offensive strategies and failed to prepare for a long struggle—after all, the war would supposedly be won by the time the leaves turned brown.

Implications for Policymakers

It is crucial for policymakers in any arena to understand that the simple act of making a decision can trigger overconfidence and closed-mindedness—limiting rational judgments and the ability to find compromise. This is true even in the domain of war and peace, where the stakes are highest.

In principle, implemental mind-sets could be advantageous, helping leaders to strive harder, ignore distractions, and persist in the face of adversity, potentially increasing the probability of victory in war. There is, however, a fine line between boldness and excessive risk taking. In the complex world of warfare, implemental mind-sets may produce hazardous overconfidence, drawing leaders into campaigns against enemies that are stronger than they thought and promoting overly optimistic war plans.

The Rubicon theory of war offers several key lessons for policymakers. First, decisionmakers must guard against implemental mind-sets. The rush of confidence as conflict approaches may be exhilarating, but it can be a dangerous delusion. Because psychological biases are difficult for individuals to resist, or even recognize, leaders must reform the way they make decisions to block the negative effects of implemental mind-sets.

For example, after adopting a policy, decisionmakers should resist the temptation to marginalize any skeptics. Indeed, it may be advisable for someone to deliberately play the role of "devil's advocate" and question optimistic appraisals of likely outcomes. Following the 2002–03 decision to invade Iraq, U.S. war planners were extremely overconfident about the prospects for stabilizing the country. Skeptical voices were sidelined or excluded. If senior officials had anticipated the shift to implemental mind-sets and the associated overconfidence, a "devil's advocate" would have helped to challenge shaky assumptions behind the strategy.

Second, leaders must consider the enemy's perspective. Even if one manages to avoid implemental biases on one's own side, an opponent may still adopt an implemental mind-set and become more overconfident and reckless as a result—potentially undermining deterrence or negotiation and dragging both sides into war. Implemental mind-sets make brinkmanship even more dangerous than scholars thought.

Third, planning military campaigns after leaders are already committed to war can be a perilous undertaking, because implemental mind-sets contaminate the formation of strategy. Instead, leaders should plan for conflict well ahead of time, before they enter the danger zone of implemental biases. The Iraq War might have been less costly if the United States had executed OPLAN 1003-98—Gen. Anthony Zinni's 1999 plan for the invasion of Iraq with 400,000 troops.

Conclusion

When people shift from deliberating alternative policies to implementing a chosen policy, they cross a psychological Rubicon and adopt an implemental mind-set that utterly changes the way they think. Implemental mind-sets undermine accurate assessments, encourage risky and overoptimistic policies, and increase the danger of being drawn into unnecessary or overly costly wars. The effect is magnified if both sides suffer from implemental mind-sets, as rival states become locked in a dangerous cycle of mutual misperception and overconfidence.

Here we have focused on the application of implemental mind-sets to war, but they are likely to be powerful—and dangerous—in many other political arenas as well, from election campaigns to the economy. Leaders must understand that the world will look very different when they reach the far bank of the Rubicon—and the seemingly short path to success can prove a costly mirage.


Related Resources

Gollwitzer, Peter M. "Mindset Theory of Action Phases." In Handbook of Theories of Social Psychology, edited by Paul A.M. Van Lange, Arie W. Kruglanski, and E. Tory Higgins. London: Sage, 2011.

Gollwitzer, Peter M., and Paschal Sheeran. "Implementation Intentions and Goal Achievement: A Meta-analysis of Effects and Processes." Advances in Experimental Social Psychology 38 (2006): 69–119.

Johnson, Dominic D.P. Overconfidence and War: The Havoc and Glory of Positive Illusions. Cambridge, Mass.: Harvard University Press, 2004.

Johnson, Dominic D.P., and James Fowler. "The Evolution of Overconfidence." Nature (in press).

Tierney, Dominic. How We Fight: Crusades, Quagmires, and the American Way of War. New York: Little, Brown, 2010.

Walt, Stephen M. "Wishful Thinking: Top 10 Examples of the Most Unrealistic Expectations in Contemporary U.S. Foreign Policy." Foreign Policy, April 29, 2011, http://www.foreignpolicy.com/articles/2011/04/29/wishful_thinking.


Dominic D.P. Johnson is Reader in Politics and International Relations at the University of Edinburgh. Dominic Tierney is Associate Professor of Political Science at Swarthmore College. They are the coauthors of Failing to Win: Perceptions of Victory and Defeat in International Politics (Cambridge, Mass.: Harvard University Press, 2006).


International Security is America’s leading peer-reviewed journal of security affairs. It provides sophisticated analyses of contemporary, theoretical, and historical security issues.

International Security is edited at Harvard Kennedy School’s Belfer Center for Science and International Affairs and is published by The MIT Press.

For more information about this publication, please contact the International Security publications coordinator at 617-495-1914.

Statements and views expressed in this policy brief are solely those of the authors and do not imply endorsement by Harvard University, the Harvard Kennedy School, or the Belfer Center for Science and International Affairs.

Recommended citation

Johnson, Dominic D.P., and Dominic Tierney. “Crossing the Rubicon: The Perils of Committing to a Decision.” Policy Brief, September 2011.