Cybersecurity and Cyberwar
Peter W. Singer and Allan Friedman

Another challenge for the offense is that the outcome of a cyberattack can be highly uncertain. You may be able to get inside a system or even shut it down, but that is only part of the story of what makes a good offense. The actual effect on your target is hard to predict, and any damage assessment can be difficult to estimate.

Nor is the defender so helpless in cybersecurity. The attackers may have the luxury of choosing the time and place of their attack, but they have to make their way through a “cyber kill chain” of multiple steps if they actually want to achieve their objectives. Charles Croom is a retired US Air Force lieutenant general who once led the Defense Information Systems Agency (DISA), the agency that services the IT needs of the entire US military, and is now Lockheed's vice president for cybersecurity solutions. As he explains, “The attacker has to take a number of steps: reconnaissance, build a weapon, deliver that weapon, pull information out of the network. Each step creates a vulnerability, and all have to be completed. But a defender can stop the attack at any step.”
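
To see the logic of Croom's point in rough numbers, here is a minimal back-of-the-envelope sketch in Python (the stage names and probabilities are purely illustrative assumptions, not figures from any real assessment): if an intrusion has to clear every stage of the kill chain, its overall odds are roughly the product of its per-stage odds, so hardening even a single stage drags down the whole chain.

    # Illustrative sketch only: a multi-stage attack succeeds only if every
    # stage succeeds, so (assuming rough independence) its overall odds are
    # the product of the per-stage odds. All numbers here are made up.
    stages = {
        "reconnaissance": 0.9,
        "weaponization": 0.9,
        "delivery": 0.8,
        "exploitation": 0.8,
        "exfiltration": 0.7,
    }

    def attack_success_probability(stage_probs):
        """Chance the attacker clears every stage of the chain."""
        p = 1.0
        for prob in stage_probs.values():
            p *= prob
        return p

    print(f"Baseline chain: {attack_success_probability(stages):.2f}")       # ~0.36

    # Hardening just one stage (say, better filtering at delivery) drags
    # the whole chain's odds down with it.
    hardened = dict(stages, delivery=0.3)
    print(f"Delivery hardened: {attack_success_probability(hardened):.2f}")  # ~0.14

Even with these toy numbers, blocking a single step more than halves the attacker's overall chance of getting all the way through, which is the sense in which the defender only needs to succeed once along the chain.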

And, as we saw earlier, defenders who are losing in the cyber realm don't have to restrict the game to just that domain. They can try to impose some other costs on the attacker, whether they be some kind of economic or diplomatic costs, traditional military action, or even a cyber counterattack. Rather than just sitting there defenseless, they can take action either to deter the attack or reduce the benefits from it.

The most important lesson we have learned in traditional offense-defense balances, and now in cybersecurity, is that the best defense actually is a good defense. Regardless of which side has the advantage, any steps that raise the capabilities of the defense make life harder on the offense and limit the incentives for attacks in the first place. In cybersecurity, these include any and all measures that tighten network security and aid in forensics to track back attackers.

Beyond the best practices we'll explore in Part III that weigh risks and make individuals and important infrastructure more secure, there is also the potential for new technology that can continue to function even if compromised. The idea is to build systems where the parallel for measuring offense and defense isn't war, but biology. When it comes to the number of bacteria and viruses in our bodies, our human cells are actually outnumbered by as much as 10 to 1. But the body has built up a capacity of both resistance and resilience, fighting off what is most dangerous and, as Vint Cerf puts it, figuring out how to “fight through the intrusion.” No computer network will mimic the human body perfectly, but DARPA and other groups are working on “intelligent” computer security networks that learn and adapt to resist cyberattacks. The defense would start to outsmart an adversary and turn the tables on them. Just the mere existence of such a system would always sow doubt in the offense that the attack is going to work.

The final question, though, is whether an offensive advantage (if it is even possible) actually does have to doom a system to instability and risk. Some are now arguing that the real problem in cyber is not that the offense may have an advantage, but that it isn't talked about enough, which fails to warn all sides of the risks they would face if it were ever used. “We've got to step up the game; we've got to talk about our offensive capabilities and train to them; to make them credible so that people know there's a penalty to this,” said James Cartwright, the four-star Marine Corps general who led much of the initial US strategy in cyber issues until his retirement in 2011. “You can't have something that's a secret be a deterrent. Because if you don't know it's there, it doesn't scare you.” (Two years later, this quote took on far greater resonance, when General Cartwright was alleged to have been the source of leaks to the media that revealed the US role in building Stuxnet, the first true use of a cyberweapon.)

A New Kind of Arms Race: What Are the Dangers of Cyber Proliferation?

In 280 BC, King Pyrrhus of Epirus invaded Italy with an army of 25,000 men, horses, and war elephants. At the Battle of Asculum, his army soundly defeated the Romans, but at the loss of a large portion of his force. When one of his officers congratulated him on the win, a despondent Pyrrhus supposedly responded, “One more such victory and we shall be utterly ruined.”

This idea of a “Pyrrhic victory” has come to describe accomplishments that seem to offer great benefit but ultimately sow the seeds of defeat. Many now describe Stuxnet in a similar way. The development and use of a cyber weapon seriously damaged Iran's nuclear program in a way that avoided direct military confrontation for several years. But by proving that it could be done, the episode also perhaps opened the door to a new kind of arms race.

Over the last decade, the idea of building and using cyber weapons moved from science fiction to concept, and now to reality. Much of the work is naturally shrouded in secrecy, but most estimates are that this new arms race is quite global. As one report put it, “By one estimate, more than one hundred nations are now amassing cybermilitary capabilities. This doesn't just mean erecting electronic defenses. It also means developing ‘offensive’ weapons.”

The capabilities of these nations, though, greatly differ. Just as in traditional military clout, Burundi's cyber power pales compared to that of the United States or China. McAfee, a Santa Clara, California, computer security firm, for instance, estimates that there are only around twenty countries that actually have “advanced cyberwar programs” that could build something comparable to a new Stuxnet-like weapon.

Michael Nacht, the former US Assistant Secretary of Defense for Global Strategic Affairs, told us how all this work impacts global politics: “An arms race is already going on in cyberspace and it is very intense.” The irony is that, just as in past arms races where nations rushed to construct everything from battleships to nuclear weapons, the more states compete to build up their capabilities, the less safe they feel. As we saw, the United States and China are perhaps the two most important players in this game, but both are deeply on edge about the threat they see from the other. This is perhaps the true hallmark of an arms race.

What sets this twenty-first-century cyber arms race apart from the past is that it is not just a game of states. As we've explored again and again, what makes cyberspace so simultaneously positive and problematic from a policy standpoint is that it is populated by both public and private actors. So when it comes to arms races within it, there are new wrinkles of decentralization and scale.

While the impact of individuals is often overstated in cybersecurity (the best types of malware often require the cooperation of multiple experts, skilled in a variety of areas, rather than the popular trope of a single teenaged hacker in his parents' basement), the cyber realm is one in which small groups can potentially generate enormous consequences. In software programming, businesses like Google and Apple have found that the productivity difference between a good and an elite programmer can be several orders of magnitude. The same goes for those who program malware. Nonstate actors all the way down to individuals are now key players in a major arms race, something that hasn't happened before.

Ralph Langner, the cybersecurity expert who discovered Stuxnet, for example, discussed with us how he would rather have ten experts of his own choosing than all the resources of the US Cyber Command at his disposal. While Ralph was slightly exaggerating to make a point, the fact is that small groups or organizations can be meaningful in a manner unimaginable in earlier times. New malware can be extremely harmful on a global scale and yet can be developed and deployed by only a few people.

The key is not just these groups' power but their ability to share it, what arms control experts call proliferation. Unlike with battleships or atomic bombs, those same groups or individuals can, if they wish, almost instantaneously communicate knowledge of how to create any new capability to millions of others. For example, it may have taken the combined efforts of a team of experts almost a year to build Stuxnet, but within weeks of its discovery an Egyptian blogger had posted an online how-to guide to building this new cyber weapon.

This new kind of cyber proliferation can take two paths. One is just to try to use the new capability “as is” by making a direct copy. This wouldn't seem like such a big problem, as good defenses would plug any gap identified and exploited by the use of a new weapon like Stuxnet. Except many pieces of malware turn out to be more than one-time-only weapons because their potential targets are irresponsible and fail to adapt their defenses. Part of Langner's original motivation to go public about Stuxnet was to encourage potential targets in the West to adopt the vendor patches needed to prevent future exploitation. Yet a full year after Stuxnet was first revealed to the world, Langner and other security experts were lamenting that a number of major public infrastructure companies had still not plugged the vulnerabilities that Stuxnet attacked.

The more problematic proliferation path, however, is via inspiration. Each construction and use of a new type of cyber weapon lowers the bar for the rest of the crowd. Stuxnet had a complex infection package that included new zero-day attacks, as well as a novel payload that attacked SCADA controllers, but its beauty (and the lesson for others) was in how the different parts of this complex attack worked together. Some of the copycats were fairly simple. Duqu, for example, was a worm discovered in the wild soon after Stuxnet that used very similar Microsoft Windows–exploiting code. Many took to calling it “son of Stuxnet,” with the idea that it must be the next version designed by the same team. However, while there are key similarities, experts have also noticed key differences and now believe that it was more a case of inspiration than evolution. As Ralph Langner describes this new kind of proliferation problem:

Son of Stuxnet is a misnomer. What's really worrying are the concepts that Stuxnet gives hackers. The big problem we have right now is that Stuxnet has enabled hundreds of wannabe attackers to do essentially the same thing. Before, a Stuxnet-type attack could have been created by maybe five people. Now it's more like 500 who could do this. The skill set that's out there right now, and the level required to make this kind of thing, has dropped considerably simply because you can copy so much from Stuxnet.

The booming underground black market for creating and distributing malware, in which transnational criminal groups buy and sell specialized cyber capabilities, makes this proliferation even smoother and more worrisome.

This combination is what makes the cyber realm so different when it comes to arms races. It's not just that the ideas behind the weapons spread globally in mere microseconds, but that turning a blueprint into action does not require the kind of large-scale human, financial, or physical resources one used to need. To make a historical comparison, building Stuxnet the first time may have required an advanced team that was the cyber equivalent of the Manhattan Project. But once it was used, it was like the Americans didn't just drop this new kind of bomb on Hiroshima, but also kindly dropped leaflets with the design plan so anyone else could also build it, with no nuclear reactor required.

Are There Lessons from Past Arms Races?

“We are here to make a choice between the quick and the dead.… If we fail, then we have damned every man to be the slave of fear. Let us not deceive ourselves; we must elect world peace or world destruction.”

In June 1946, Bernard Baruch, the personal representative of President Truman, made this speech to the United Nations as part of an amazing offer that history little remembers. Despite the fact that the United States was the only nation with nuclear weapons at the time, it offered to turn over all its nuclear bombs to the United Nations. Baruch's condition was that all other nations also agree not to build them and open themselves up to inspection. It seemed a noble gesture, but the Soviets (who wouldn't be able to figure out how to build nuclear bombs for another three years) balked. They demanded that the United States instead first give up its weapons and only afterward should the world develop a system of controls. They were also deeply suspicious of the UN, feeling that it was too US-dominated to be trusted (how things have changed!). With the two superpowers at loggerheads, the Baruch plan fell apart. Instead, a nuclear arms race would shape the next 50 years of global politics, a time in which over one hundred thousand atomic bombs would be built and the world would almost be destroyed several times over, as during close calls like the Cuban Missile Crisis.

While today's emerging cyber arms races are far from identical to the Cold War, there are still lessons that can be learned from it. Or, to paraphrase Mark Twain, while history may not always repeat itself, “It does rhyme.”

One of the most instructive lessons is that the initial periods of a burgeoning arms race are often the most dangerous. These early days have a dark combination. The possessors of the new technology see themselves as having a unique advantage but one that is fleeting, creating a “use it or lose it” mentality. It is also the period in which the technology and its consequences are least understood, especially by senior leaders. In the Cold War, for example, probably the scariest time was not the Cuban Missile Crisis, but the late 1940s and 1950s, when the real-world versions of Dr. Strangelove were taken seriously, arguing that nuclear war was not only survivable but winnable. This was a period that saw everything from General Douglas MacArthur's 1951 demand to be given sole discretion to drop atomic bombs on mainland China to perhaps one of the most outrageous nuclear concepts of all, Project A-119. When the Soviets launched the Sputnik satellite into space in 1957, the US Air Force proposed that a nuclear missile be shot at the moon, just to demonstrate that the United States could also do exciting things in space.
