Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon

42. Bill Gertz, “Computer-Based Attacks Emerge as Threat of Future, General Says,” Washington Times, September 13, 2011.

43. Joe P. Hasler, “Investigating Russia’s Biggest Dam Explosion: What Went Wrong,” Popular Mechanics, February 2, 2010.

44. “Pipeline Rupture and Subsequent Fire in Bellingham, Washington, June 10, 1999,” published by the National Transportation Safety Board, 2002, available at ntsb.gov/doclib/reports/2002/PAR0202.pdf.

45. “Pacific Gas and Electric Company Natural Gas Transmission Pipeline Rupture and Fire,” National Transportation Safety Board, September 9, 2010, available at ntsb.gov/investigations/summary/PAR1101.html.

46. J. David Rogers and Conor M. Watkins, “Overview of the Taum Sauk Pumped Storage Power Plant Upper Reservoir Failure, Reynolds County, MO,” presented at the 6th International Conference on Case Histories in Geotechnical Engineering, Arlington, VA, August 11–16, 2008, available at web.mst.edu/~rogersda/dams/2_43_rogers.pdf.

47. Emitt C. Witt III, “December 14th, 2005 Taum Sauk Dam Failure at Johnson’s Shut-Ins Park in Southeast Missouri,” National Oceanic and Atmospheric Administration, available at crh.noaa.gov/lsx/?n=12_14_2005.

48. Lyndsey Layton, “Metro Crash: Experts Suspect System Failure, Operator Error in Red Line Accident,” Washington Post, June 23, 2009.

49. Graeme Baker, “Schoolboy Hacks into City’s Tram System,” Telegraph, January 11, 2008.

50. From author interview, August 2012.

51. A YouTube video of the simulation can be seen online at youtube.com/watch?v=kc_ijB7VPd8. Or see links to Davis’s presentation slides and two other smart meter simulations at ioactive.com/services_grid_research.html.

52. NERC has cyber security regulations that utilities are supposed to follow, but they apply only to bulk electric systems (defined as facilities and systems that operate at or above 100 kilovolts), and compliance doesn’t guarantee a system won’t get hacked. Security is an evolving condition, not a static one, and can change anytime new equipment is installed or configurations are changed.

53. US-Canada Power System Outage Task Force, “Final Report on the August 14th Blackout in the United States and Canada,” April 2004, available at https://reports.energy.gov/BlackoutFinal-Web.pdf.

54. Kevin Poulsen, “Software Bug Contributed to Blackout,” SecurityFocus.com, February 11, 2004, available at securityfocus.com/news/8016.

55. Rebecca Smith, “U.S. Risks National Blackout from Small-Scale Attack,” Wall Street Journal, March 12, 2014.

56. The scenario was similar to a real-life incident that occurred at a Coors bottling plant in 2004, when an employee mistakenly changed the settings on a system responsible for greasing the bearings on a bottling line. Instead of greasing the bearings every twenty minutes, he set it to grease them every eight hours, and eventually the bottling line seized up.

57. Justin Blum, “Hackers Target US Power Grid,” Washington Post, March 11, 2005.

58. Florida Power and Light, “FPL Announces Preliminary Findings of Outage Investigation,” February 29, 2008, available at fpl.com/news/2008/022908.shtml.

59. From an undated DHS slide presentation obtained through a FOIA request made by the author. The slide presentation is titled “Control Systems Vulnerability—Aurora.”

60. The figure comes from a cost assessment developed for the Aurora test and released by DHS in response to the author’s FOIA request.

61. As an example of what can happen when the coupling on a turbine is damaged: in 2011, a steam turbine generator at a power plant in Iranshahr, Iran, exploded, and the blast was attributed to a coupling failure. The explosion was so forceful that investigators couldn’t even find the power turbine after the accident. Couplings need to be inspected regularly for signs of wear and lubricated to maintain operations and prevent accidents. The plant in Iranshahr had three oil burners in the room where the generator was installed, which likely exacerbated the explosion when it occurred. The explosion could indeed have been the result of a badly maintained coupling or faulty installation, but some at the time thought it might have been the result of sabotage on par with the Aurora attack.

62. Author interview, August 2012.

63. Ibid.

64. 60 Minutes, “Cyber War: Sabotaging the System,” original air date November 8, 2009, CBS.

CHAPTER 10
PRECISION WEAPON

Ralph Langner sat in his Hamburg office and watched as his two engineers fed a stream of artful lies to the Stuxnet code they had installed on their test machine. Langner, an expert in the arcane field of industrial-control-system security, had been working with his colleagues for days to identify and re-create the precise conditions under which the stubborn code would release its payload to their PLC, but it was proving to be more difficult than they’d expected.

Days earlier, Langner’s team had set up a computer with the Siemens Step 7 software installed and connected it to a Siemens PLC they happened to have on hand. They also installed a network analyzer to watch data as it passed between the Step 7 machine and the PLC. Unlike the Symantec researchers, Langner and his team worked with PLCs all the time and knew exactly what kind of traffic should pass between the Step 7 machine and the PLC; as a result, they assumed it would be easy to spot any anomalies in the communication. But when they initially infected their Step 7 system with Stuxnet, nothing happened. Stuxnet, they discovered, as others had before, was on the hunt for two specific models of Siemens PLC—the S7-315 and S7-417—and they didn’t have either of these models on hand.

So they installed a Windows debugger on their test machine to observe the steps Stuxnet took before releasing its payload and devised a way to trick the code into thinking it had found its target. Stuxnet ran through a long checklist pertaining to the target’s configuration, each item seemingly more specific than the last. Langner and his colleagues didn’t know what exactly was on the checklist, but they didn’t need to know. As Stuxnet queried their system for each item on the list, they fed it a series of manufactured responses until they landed on the answers Stuxnet wanted to hear. It was a crude, brute-force method of attack that took several days of trial and error. But when they finally got the right combination of answers and ran the code through its paces one last time, they saw exactly what the Symantec researchers had described: Stuxnet injected a series of rogue code blocks into their PLC. “That’s it,” Langner recalls thinking. “We got the little motherfucker.”
1
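
By way of illustration, a minimal Python sketch of that trial-and-error approach might look like the following. Every name, query, and expected value here is hypothetical; the actual analysis was done by watching live Stuxnet code in a Windows debugger, not against a stand-in like this.

from itertools import product

# Hypothetical candidate answers for each configuration query the code is
# assumed to make; the real checklist was unknown to the analysts.
CANDIDATE_ANSWERS = {
    "plc_model":    ["S7-315", "S7-417", "S7-300"],
    "cpu_type":     ["315-2DP", "417-4", "unknown"],
    "has_profibus": [True, False],
}

def payload_released(answers):
    """Stand-in for watching the malware in a debugger: reports True only when
    every fabricated answer matches what this dummy payload is waiting for."""
    expected = {"plc_model": "S7-315", "cpu_type": "315-2DP", "has_profibus": True}
    return answers == expected

def brute_force_trigger():
    """Try every combination of fabricated responses until the payload fires."""
    keys = list(CANDIDATE_ANSWERS)
    for combo in product(*(CANDIDATE_ANSWERS[k] for k in keys)):
        answers = dict(zip(keys, combo))
        if payload_released(answers):
            return answers
    return None

if __name__ == "__main__":
    print("Combination that triggered the payload:", brute_force_trigger())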

They only noticed the rogue code going into the PLC because the blocks of code were slightly larger than they should have been. Before infecting their Step 7 system with the malware, they had transferred blocks of code to the PLC and captured them with the analysis tool to record their basic size and characteristics. After infecting the machine with Stuxnet, they transferred the same blocks of code again and saw that they had suddenly grown.
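
A rough sketch of that kind of before-and-after comparison, using made-up block names and byte strings rather than real Step 7 captures, might look like this:

def baseline_sizes(clean_capture):
    """Record the size of each code block as captured on the uninfected system."""
    return {name: len(data) for name, data in clean_capture.items()}

def blocks_that_grew(baseline, infected_capture):
    """Return the names of blocks whose transferred size exceeds the clean baseline."""
    return [
        name for name, data in infected_capture.items()
        if len(data) > baseline.get(name, len(data))
    ]

# Illustrative captures: the same block transfers recorded before and after
# infecting the Step 7 machine. "OB1" has gained 48 extra bytes.
clean    = {"OB1": b"\x00" * 512, "OB35": b"\x00" * 256}
infected = {"OB1": b"\x00" * 512 + b"\x90" * 48, "OB35": b"\x00" * 256}

print("Blocks larger than baseline:", blocks_that_grew(baseline_sizes(clean), infected))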

They couldn’t yet see what the Stuxnet code was doing to the PLC, but the injection itself was big news. It was way beyond anything they’d ever warned customers about and way beyond anything they expected to see in the first known attack against a PLC.

WHEN SYMANTEC HAD revealed on August 17 that Stuxnet was bent on sabotaging PLCs, it might have seemed to Chien and Falliere that no one was listening. But six thousand miles away, Langner was sitting in his small office in a leafy suburb of Hamburg, reading Symantec’s words with great interest. Langner had been warning industrial clients for years that one day someone would devise a digital attack to sabotage their control systems, and now, it appeared, the day had finally arrived.

Langner was the owner of a three-man boutique firm that specialized in the security of industrial control systems. It was the only thing his company did. He had no interest in general computer security and couldn’t care less about announcements warning of the latest viruses and worms infecting PCs. Even zero-day exploits held no allure for him. So when Stuxnet first made headlines in the technology press and became the subject of extensive chatter on security forums, he paid it little notice. But when Symantec wrote that Stuxnet was sabotaging Siemens PLCs, Langner was immediately intrigued.

Symantec didn’t reveal what Stuxnet was doing to the PLCs, only that it was injecting code into the so-called ladder logic of the PLC—whether that meant bringing the PLC to its knees, or worse, the antivirus firm didn’t say.
2
But it struck Langner that thousands of Siemens customers, including many of his own clients, were now facing a potential killer virus and were waiting anxiously for Siemens or Symantec to tell them what exactly Stuxnet was doing to their PLCs. But, oddly, after making their startling announcement, the Symantec researchers had gone quiet.

Langner suspected the researchers had hit a wall due to their lack of expertise with PLCs and industrial control systems. But curiously, Siemens had also gone silent. This was strange, Langner thought. It was, after all, Siemens controllers that were being attacked; the company had an obligation to analyze the malevolent code and tell customers what it might be doing to their systems. But after a couple of brief announcements in July, the German company had gone mum.
3

Langner was incensed. Although Stuxnet appeared to be targeting only Siemens Step 7 machines, no one really knew what the malicious code was capable of doing or if it might be laced with bugs that could damage other PLCs. And there was one more important concern: the vulnerability that let Stuxnet inject its malicious code into the ladder logic of a Siemens PLC also existed in other controllers.
4
Samples of Stuxnet were already available for download on the internet; any random hacker, criminal extortionist, or terrorist group could study the code and use it as a blueprint to devise a more wide-scale and destructive attack against other models of PLCs.

This made the silence of two other parties even more perplexing—the CERT-Bund, Germany’s national computer emergency response team; and ICS-CERT in the United States. Both organizations were tasked with helping to secure critical infrastructure systems in their respective countries, but neither party had said much about Stuxnet. There was no talk in an ICS-CERT alert about injecting ladder logic into the Siemens PLCs, or even any mention of sabotaging them. There was also nothing at all about the dangers that Stuxnet presented for future attacks.
5
The silence of German authorities was even stranger, since Siemens controllers were installed in almost every German plant or factory Langner could name.

Langner talked it over with his two longtime engineers, Ralf Rosen and Andreas Timm. None of them had any experience reverse-engineering viruses or worms, but if no one else was going to tell them what Stuxnet was doing to the Siemens PLCs, then they would have to take it apart themselves. It would mean days of doing pro bono work squeezed in between other assignments from paying customers, but they concluded they didn’t have a choice.

Langner and his colleagues made an odd but effective team. In a profession sometimes characterized by frumpy, pale, and ponytailed engineers, Langner, a vigorous fifty-two-year-old with short dark hair minus any gray, sported crisp business suits and finely crafted leather shoes. He had piercing blue eyes in a vacation-tanned face and the trim, toned frame of a seasoned mountaineer—the by-product of ski excursions in the Alps and rugged hikes in the hills. If Langner’s dapper appearance didn’t set him apart from the pack, his brusque and bold manner did. He had a reputation for being an outspoken maverick, and often made provocative statements that riled his colleagues in the control-system community. For years he had railed about security problems with the systems, but his blunt, confrontational manner often put off the very people who most needed to listen. Rosen and Timm, by contrast, were both graybeard engineers in their forties who had a more relaxed approach to dress and fitness and took a quieter, more backseat role to Langner’s conspicuous one.

Although the three of them seemed in many ways mismatched, there was probably no better team suited to the task of examining Stuxnet. Timm had worked for Langner as a control-system expert for at least a decade, and Rosen for three years longer than that. During that time, they’d amassed extensive knowledge about industrial control systems in general, and Siemens controllers in particular. Siemens, in fact, was a longtime customer. The company bought software products from Langner’s firm, and he and his engineers sometimes trained Siemens employees on their own systems. There were probably only a handful of Siemens employees who knew the Siemens systems better than they did.

Langner’s path to ICS security had been a circuitous one, however. He was a certified psychologist by training, something seemingly far removed from the world of control systems. But it was his psychology background that actually led to his present career. In the 1970s, while studying psychology and artificial intelligence at the Free University of Berlin, he began writing software to do statistical analysis of data collected from experiments. He also wrote a program that modeled human decision-making patterns to arrive at psychiatric diagnoses.

But it was a driver program he wrote to connect his home computer to the university’s mainframe that ended up launching his ICS career. In college, Langner owned an early-generation PC that lacked the computational power needed to conduct statistical analysis. Whenever he wanted to crunch data collected from one of his experiments, he had to travel to the campus and plug his computer into the college mainframes. Langner hated the commute to campus, so he studied the protocols needed to communicate with the servers remotely and wrote a driver program that let him dial in via modem from home.
