A Sense of the Enemy: The High-Stakes History of Reading Your Rival's Mind
Authors: Zachary Shore
Tags: #History, #Modern, #General
The most troubling aspect of the code was its notion that the Soviets perceived compromise as weakness. “When an opponent begins to talk of making some concessions or offers them, it should be recognized that this is a sign of weakness on his part. Additional and perhaps major gains can be made by continuing to press the opponent under these circumstances.” George acknowledged that observations such as these lack “operational content,” but while they cannot be readily applied in concrete situations, they probably represent “a valuable part of the cognitive and affective makeup of a good Bolshevik.” These maxims beg the question: Do any statesmen fail to press an advantage? The larger issue, of course, is that enemies are frequently assumed to view attempts at compromise as signs of weakness. With this assumption as a starting point for understanding the enemy, compromise solutions are rapidly excluded.
Whenever an enemy is thought to view compromise as weakness, the prescription must always be at least steely resolve and at worst intransigence. Standing up to aggression is of course crucial. The problem is not with the prescription; the problem is with the analysis. The assumption is that the enemy always intends to be aggressive—to push until the other side yields. Therefore, all of the enemy’s actions are interpreted as such, and a compromise solution is precluded.
Despite the problematic aspects of Leites’s concept, many political psychology scholars have embraced his operational code as a useful means for understanding foreign policy decision-making. From studies in the 1970s by Margaret Hermann[18] and Ole Holsti,[19] to more recent works by Stephen G. Walker and Mark Schafer, scholars have sought to unravel the codes of statesmen from Bill Clinton and Tony Blair to Saddam Hussein and Kim Il Sung. At times, political psychologists have even sought to analyze the frequency and usage patterns of particular words that leaders employ in their public pronouncements.[20]
These scholars acknowledge that speech writers and other government officials typically craft major policy addresses, not the leaders themselves. Nonetheless, those who dissect word usage believe that their methods can reveal hidden clues to a leader’s worldview and future actions. Part of the impetus behind these projects is the desire to demonstrate that the beliefs of individual statesmen matter in policy formation—a fact that few historians would dispute but one that some international relations theorists cannot accept.
All statesmen have particular leadership styles, and these styles can and often do affect foreign policy decision-making. Leadership style is one factor among many that influences policy formation and outcomes. The same is true of worldviews. All statesmen have them, and their views impact policy. Beneath the superficial level of leadership style, and deeper down below the realm of worldviews, lies a single or small set of core drivers—the motivations most vital to a leader. These are the ambitions that propelled that person to seek and maintain a leadership role. They are the raison d’être of leadership, and they are specific to each statesman. At times of crisis, when the stakes are highest, a statesman’s underlying drivers are revealed. Strategic empathy is the skill one needs to spot and comprehend them.
Crowe, Kennan, Leites, and many other policy analysts all committed the fundamental attribution error, convincing themselves and the statesmen they served that they truly grasped the enemy mind.[21]
Scholarship on differing cultural norms can certainly provide insights into a group’s behavior, but it cannot serve as a reliable guide to understanding the underlying drivers of individual foreign leaders.
These analysts simultaneously employed the continuity heuristic in formulating their policy recommendations. Crowe looked to prior German actions as the best guide to future behavior: “For there is one road which, if past experience is any guide to the future, will most certainly not lead to any permanent improvement of relations with any Power. . .” He concluded that because of an essentially aggressive German nature and Germany’s rising power, British concessions could never be fruitful.[22]
Similarly, in his long telegram, Kennan argued that the Soviets’ pattern of past behavior clearly indicated the course of their future behavior. The type of analysis that both men used was strikingly similar.
The root problem with the continuity heuristic is that it identifies a behavior pattern, such as productivity or aggressiveness, without clarifying why that behavior exists. In contrast, the pattern-break heuristic focuses our attention on what underlies that behavior. It suggests why the enemy was aggressive in the first place by spotlighting what is most important to that individual or group.
The Empty Couch
The continuity heuristic is, of course, not simply applied to groups: Germans, Bolsheviks, or others. Statesmen and their advisors have often made similar assumptions about individuals. In 1965, the U.S. Central Intelligence Agency (CIA) established a division to assess the psychology of foreign leaders. It was a daring project, for it required psychoanalyzing individuals without the subjects ever being present.
At first, this group was housed within the CIA’s division of medical services, but soon it migrated to the Directorate of Intelligence, the part of the agency that deals with analysis (as opposed to operations).
The Center for the Analysis of Personality and Political Behavior recruited highly trained psychologists and other experts in the behavioral sciences to scrutinize foreign leaders’ biographies. The teams of psychobiographers were tasked with inspecting a leader’s early childhood and later life experiences, all in the hope of drafting a composite picture of that person’s character. With good reason, presidents and principals (the heads of American national security departments) wanted to know what made foreign leaders tick.
For twenty-one years, Jerrold Post headed this division. In his book about psychobiography, he explains that his Center focused on the key life events that shaped each leader. Post makes the assumption behind the Center’s work explicit:
Moreover, one of the purposes of assessing the individual in the context of his or her past history is that the individual’s past responses under similar circumstances are, other things being equal, the best basis for predictions of future behavior.[23]
As I argued above, scrutinizing past behavior does not tell you what you truly need to know. It cannot reveal someone’s underlying drivers. At best it can provide reasonable predictions only if future conditions are sufficiently similar to prior conditions. Unfortunately, in international affairs, the most crucial decisions are typically made under dramatically new settings, when old patterns are being upended and standard procedures overthrown. At such times, what statesmen need is heuristics for discerning their opponent’s underlying drivers—the things that the other side wants most. Psychobiographies can be helpful in many realms, such as determining an individual’s negotiating style or understanding his personal quirks, but they are less valuable when statesmen need to anticipate an enemy’s likely actions under fresh circumstances.
The history of twentieth-century conflicts has been marked by the inability to gain a clear sense of one’s enemies. Grasping the other side’s underlying drivers has been among the most challenging tasks that leaders have faced. The analysts discussed above were by no means fools. They were smart, sober-minded students of international affairs, but they sometimes lacked an essential component of policymaking: a deep appreciation for what drives one’s enemies. Much of their difficulty stemmed from two flawed assumptions: first, that the other side possessed a rigid, aggressive nature; second, that past behavior was the best predictor of future actions. Both assumptions not only proved untenable, they also helped to create a dynamic out of which conflict was more likely to flow.
If the twentieth century saw frequent cases of the continuity heuristic, the twenty-first has begun with its own form of mental shortcuts: an excessive faith in numbers. Modern advances in computing, combined with increasingly sophisticated algorithms, have produced an irrational exuberance over our ability to forecast enemy actions. While mathematical measures can offer much to simplify the complex realm of decision-making, an overweighting of their value without recognizing their limitations will result in predictions gone horribly awry. The crux of those constraints rests upon our tendency to focus on the wrong data. And although that mental error is not new to the modern era, it has been magnified by modernity’s advances in technology. Our endless longing to use technology to glimpse the future might be traced back to the start of the 1600s, when a small boy wandered into an optics shop, fiddled with the lenses, and saw something that would change the world.
The Quant’s Prediction Problem
No one knows precisely how Hans Lippershey came upon the invention. One legend holds that some children wandered into his spectacle shop, began playing with the lenses on display, and suddenly started to laugh. Tiny objects far away appeared as though they were right in front of them. The minuscule had become gigantic. Though the truth of that tale is doubtful, the story of the telescope’s invention remains a mystery. We know only that four centuries ago, on October 2, 1608, Hans Lippershey received a patent for a device that is still recognizable as a modern refracting telescope.[1]
Not long after Lippershey’s patent, the device found its way to Pisa, where it was offered to the duchy for sale. Catching wind of this new invention, Galileo Galilei quickly obtained one of the instruments, dissected its construction, and redesigned it to his liking.[2]
Galileo intended it, of course, for stargazing, but his loftier intentions were not shared by the Pisans. This new tool had immediate and obvious military applications. Any commander who could see enemy ships at great distance or opposing armies across a battlefield would instantly gain a distinct advantage. That commander would, in effect, be looking forward in time, and, with that literal foresight, he could predict aspects of the enemy’s actions. The telescope offered its owner a previously unimaginable advantage in battle. It brought the invisible to light. It altered the perception of time. It presented a genuine glimpse into the future, beyond what the naked eye could see. We don’t know whether Lippershey, Galileo, or some other crafty inventor made the first sale of a telescope to a military, but when he did, that exchange represented one of the earliest mergers of Enlightenment science with the business of war. From that moment on, modern science has been searching for ways to extend its gaze into the future, and militaries have been eager to pay for it.
In the seventeenth century, merely gaining an early glimpse of the enemy’s actions was enough to advantage one side over the other. By the twentieth century, strategists needed much more. They needed greater predictive power for anticipating enemy moves. Technology alone could not, and still cannot, fill that gap. Strategists have always needed to develop a sense of the enemy, but the craving for more concrete, reliable predictions has left militaries easily seduced by science. Lately, that longing has led them to focus on the wrong objective: predicting the unpredictable.
The Numbers That Count
The rush is on to quantify as much as possible and let the algorithms tell us what the future holds. While this method offers obvious advantages, it is not without serious pitfalls. In many realms of prediction, we often go astray when we focus on the facts and figures that scarcely matter, as Nate Silver has shown in his thoughtful, wide-ranging study, The Signal and the Noise. Silver is America’s election guru. He has rocketed to prominence for his successful forecasts of U.S. primary and general election results. In his book, Silver concentrates on those predictions reliant on large, sometimes massive, data sets—so-called “big data.” Silver himself dwells mainly in the realm of number crunchers. He quantifies every bit of data he can capture, from baseball players’ batting averages to centuries of seismologic records, from poker hands to chessboard arrangements, and from cyclone cycles to election cycles. In short, if you can assign a number to it, Silver can surely crunch it.
After four years of intensive analysis, Silver concludes that big data predictions are not actually going very well. Whether the field is economics or finance, medical science or political science, most predictions are either entirely wrong or else sufficiently wrong as to be of minimal value. Worse still, the wrongness of so many predictions, Silver says, tends to proliferate throughout academic journals, blogs, and media reports, further misdirecting our attention and thwarting good science. Silver contends that these problems mainly result from our tendency to mistake noise for signals. The human brain is wired to detect patterns amidst an abundance of information. From an evolutionary perspective, the brain developed ways of quickly generalizing about both potential dangers and promising food sources. Yet our brain’s wiring for survival, the argument goes, is less well-suited to the information age, when too much information inundates us every day. We cannot see the signal in the noise, or, more accurately put, we often fail to connect the relevant dots in the right way.