Author: Bruce Schneier
PRIVACY BREACHES
In 1995, the hacker Kevin Mitnick broke into the network of an Internet company called
Netcom and stole 20,000 customer credit card numbers. In 2004, hackers broke into
the network of the data broker ChoicePoint, stole data on over 100,000 people, and
used it to commit fraud. In late 2014, hackers broke into Home Depot’s corporate networks
and stole over 60 million credit card numbers; a month later, we learned about a heist
of 83 million households’ contact information from JPMorgan Chase. Two decades of
the Internet, and it seems as if nothing has changed except the scale.
One reasonable question to ask is: how well do the Internet companies, data brokers,
and our government protect our data? In one way, the question makes little sense: in the US, anyone willing to pay for data can get it, so how well it is guarded against theft is almost beside the point. In some cases, criminals have legally purchased data and used it to commit fraud.
Cybercrime is older than the Internet, and it’s big business. Numbers are hard to
come by, but the cost to the US is easily in the tens of billions of dollars. And
with that kind of money involved, the business of cybercrime is both organized and
international.
Much of this crime involves some sort of identity theft, which is the fancy Internet-era
term for impersonation fraud. A criminal hacks into a database somewhere, steals your
account information and maybe your passwords, and uses them to impersonate you to
secure credit in your name. Or he steals your credit card number and charges purchases
to you. Or he files a fake tax return in your name and gets a refund that you’re later
liable for.
This isn’t personal. Criminals aren’t really after your intimate details; they just
want enough information about your financial accounts to access them. Or sufficient
personal information to obtain credit.
A dozen years ago, the risk was that the criminals would hack into your computer and
steal your personal data. But the scale of data thefts is increasing all the time.
These days, criminals are more likely to hack into large corporate databases and steal
your personal information, along with that of millions of other people. It’s just
more efficient. Government databases are also regularly hacked. Again and again we
have learned that our data isn’t well-protected. Thefts happen regularly, much more
often than the news services report. Privacy lawyers I know tell me that there are
many more data vulnerabilities and breaches than get reported—and many companies never
even figure out that their networks have been hacked and their data is being stolen.
It’s actually amazing how bad corporate security can be. And because institutions
legitimately have your data, you often have no recourse if they lose it.
Sometimes the hackers aren’t after money. Californian Luis Mijangos was arrested in
2010 for “sextortion.” He would hack into the computers of his female victims, search
them for sexual and incriminating photos and videos, surreptitiously turn on the camera
and take his own, then threaten to publish them if they didn’t provide him with more
racy photos and videos. People who do this are known as “ratters,” for RAT, or remote
access Trojan. That’s the piece of malware they use to take over your computer. The
most insidious RATs can turn your computer’s camera on without turning the indicator
light on. Not all ratters extort their victims; some just trade photos, videos, and
files with each other.
It’s not just hackers who spy on people remotely. In Chapter 7, I talked about a school
district that spied on its students through their computers. In 2012, the Federal
Trade Commission successfully prosecuted seven rent-to-own computer companies that
spied on their customers through their webcams.
While writing this book, I heard similar stories from two different people. A few
years after a friend—or a friend’s daughter—applied to colleges, she received a letter
from a college she’d never applied to—different colleges in each story. The letter
basically said that the college had been storing her personal data and that hackers
had broken in and stolen it all; it
recommended that she place a fraud alert with the major credit bureaus.
In each instance, the college had bought the data from a broker back when she was
a high school senior and had been trying to entice her to consider attending. In both
cases, she hadn’t even applied to the college. Yet the colleges were still storing
that data years later. Neither had secured it very well.
As long as our personal data sloshes around like this, our security is at risk.
In 1993, the Internet was a very different place from what it is today. There was no
electronic commerce; the World Wide Web was in its infancy. The Internet was a communications
tool for techies and academics, and we used e-mail, newsgroups, and a chat protocol
called IRC. Computers were primitive, as was computer security. For about 20 years,
the NSA had managed to keep cryptography software out of the mainstream by classifying
it as a munition and restricting its export. US products with strong cryptography
couldn’t be sold overseas, which meant that US hardware and software companies put
weak—and by that I mean easily breakable—cryptography into both their domestic and
their international products, because that was easier than maintaining two versions.
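
To give a sense of what "easily breakable" meant in practice, here is a rough back-of-the-envelope calculation in Python. It assumes the roughly 40-bit key lengths the export rules allowed and a guessing rate of ten million keys per second; both figures are illustrative assumptions, not measurements from the period.

    # Back-of-the-envelope arithmetic behind "easily breakable": compare the
    # keyspace of an assumed 40-bit export-grade key with 56-bit and 128-bit keys.
    GUESSES_PER_SECOND = 10_000_000  # assumed: 10 million trial decryptions per second

    for bits in (40, 56, 128):
        keyspace = 2 ** bits
        seconds = keyspace / GUESSES_PER_SECOND
        years = seconds / (365 * 24 * 3600)
        print(f"{bits:>3}-bit key: {keyspace:.2e} keys, about {years:.3g} years to try them all")

    # At that rate a 40-bit key falls in roughly 30 hours, while a 128-bit key
    # would take on the order of 10**24 years.
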
But the world was changing. Cryptographic discoveries couldn’t be quashed, and the
academic world was catching up to the capabilities of the NSA. In 1993, I wrote my first book, Applied Cryptography, which made those discoveries accessible to a more general audience. It was a big deal, and I sold 180,000 copies in two editions. Wired magazine called it “the book the National Security Agency wanted never to be published,”
because it taught cryptographic expertise to non-experts. Research was international,
and non-US companies started springing up, offering strong cryptography in
their products. One study from 1993 found over 250 cryptography products made and
marketed outside the US. US companies feared that they wouldn’t be able to compete,
because of the export restrictions in force.
At the same time, the FBI started to worry that strong cryptography would make it
harder for the bureau to eavesdrop on the conversations of criminals. It was concerned
about e-mail, but it was most concerned about voice encryption boxes that could be
attached to telephones. This was the first time the FBI used the term “going dark”
to describe its imagined future of ubiquitous encryption. It was a scare story with
no justification to support it, just as it is today—but lawmakers believed it. They
passed the CALEA law I mentioned in Chapter 6, and the FBI pushed for a ban on any cryptography that lacked a backdoor.
Instead, the Clinton administration came up with a solution: the Clipper Chip. It was an encryption system with surveillance access for the FBI and NSA built in. The encryption algorithm was alleged to be strong enough to prevent eavesdropping,
but there was a backdoor that allowed someone who knew the special key to get at the
plaintext. This was marketed as “key escrow” and was billed as a great compromise;
trusted US companies could compete in the world market with strong encryption, and
the FBI and NSA could maintain their eavesdropping capabilities.
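
To make the key escrow idea concrete, here is a minimal Python sketch, assuming the third-party cryptography package and using AES-GCM as a stand-in for the actual (classified) Skipjack algorithm and its LEAF field; everything here is hypothetical and conceptual, not a description of how the Clipper Chip was really built.

    # Conceptual sketch of key escrow: the message is encrypted with a session
    # key, and a copy of that session key, wrapped under an escrow key, travels
    # alongside the ciphertext. Whoever holds the escrow key can read the traffic.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    ESCROW_KEY = AESGCM.generate_key(bit_length=256)  # held by the escrow agent

    def escrowed_encrypt(session_key: bytes, plaintext: bytes) -> dict:
        nonce = os.urandom(12)
        ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, None)
        # The "escrow field": the session key encrypted under the escrow key.
        leaf_nonce = os.urandom(12)
        wrapped_key = AESGCM(ESCROW_KEY).encrypt(leaf_nonce, session_key, None)
        return {"nonce": nonce, "ciphertext": ciphertext,
                "leaf_nonce": leaf_nonce, "wrapped_key": wrapped_key}

    def escrow_recover(msg: dict) -> bytes:
        # What an agency holding ESCROW_KEY can do: unwrap the session key and
        # decrypt without ever knowing the communicating parties' key.
        session_key = AESGCM(ESCROW_KEY).decrypt(msg["leaf_nonce"], msg["wrapped_key"], None)
        return AESGCM(session_key).decrypt(msg["nonce"], msg["ciphertext"], None)

    session_key = AESGCM.generate_key(bit_length=256)  # shared by the two parties
    msg = escrowed_encrypt(session_key, b"a private conversation")
    assert escrow_recover(msg) == b"a private conversation"

The backdoor is simply the existence of ESCROW_KEY: whoever holds it can read all the traffic, and the scheme's security collapses to how well that one key is protected.
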
The first encryption device with the Clipper Chip built in was an AT&T secure phone.
It wasn’t a cell phone; this was 1993. It was a small box that plugged in between
the wired telephone and the wired handset and encrypted the voice conversation. For
the time, it was kind of neat. The voice quality was only okay, but it worked.
No one bought it.
In retrospect, it was rather obvious. Nobody wanted encryption with a US government
backdoor built in. Privacy-minded individuals didn’t want it. US companies didn’t
want it. And people outside the US didn’t want it, especially when there were non-US
alternatives available with strong cryptography and no backdoors. The US government
was the only major customer for the AT&T devices, and most of those were never even
used.
Over the next few years, the government tried other key escrow initiatives, all designed
to give the US government backdoor access to all encryption, but the market soundly
rejected all of those as well.
The demise of the Clipper Chip, and key escrow in general, heralded the death of US
government restrictions on strong cryptography. Export controls were gradually lifted
over the next few years, first on software in 1996 and then on most hardware a few
years later. The change came not a moment too soon. By 1999, over 800 encryption products
from 35 countries other than the US had filled the market.
What killed both the Clipper Chip and crypto export controls was not consumer demand for privacy. Rather, it was the threat of foreign competition and demands from US industry. Electronic commerce needed strong cryptography, and
even the FBI and the NSA could not stop its development and adoption.
GOVERNMENT SURVEILLANCE COSTS BUSINESS
Those of us who fought the crypto wars, as we call them, thought we had won them in
the 1990s. What the Snowden documents have shown us is that instead of dropping the
notion of getting backdoor government access, the NSA and FBI just kept doing it in
secret. Now that this has become public, US companies are losing business overseas
because their non-US customers don’t want their data collected by the US government.
NSA surveillance is costing US companies business in three different ways: people
fleeing US cloud providers, people not buying US computer and networking equipment,
and people not trusting US companies.
When the story about the NSA’s getting user data directly from US cloud providers—the
PRISM program—broke in 2013, the businesses involved faced a severe public relations backlash.
Almost immediately, articles appeared noting that US cloud companies were losing business
and their counterparts in countries perceived as neutral, such as Switzerland, were
gaining. One survey of British and Canadian companies from 2014 found that 25% of
them were moving their data outside the US, even if it meant decreased performance.
Another survey of companies found that NSA revelations made executives much more concerned
about where their data was being stored.
Estimates of how much business will be lost by US cloud providers vary. One 2013 study
by the Information Technology and Innovation Foundation puts the loss of revenue at $22 to $35 billion over three years; that's
10% to 20% of US cloud providers’ foreign market share. The Internet analysis firm
Forrester Research believes that’s low; it estimates three-year losses at $180 billion
because some US companies will also move to foreign cloud providers.
US computer and networking companies are also taking severe hits. Cisco reported 2013
fourth quarter revenue declines of 8% to 10%. AT&T also reported earnings losses,
and had problems with its European expansion plans. IBM lost sales in China. So did
Qualcomm. Verizon lost a large German government contract. There’s more. I have attended
private meetings where large US software companies complained about significant loss
of foreign sales. Cisco’s CEO John Chambers wrote to the Obama administration, saying
that the NSA's hacking of US equipment “will undermine confidence in our industry and
in the ability of technology companies to deliver products globally.”
Chambers’s comments echo the third aspect of the competitiveness problem facing US
companies in the wake of Snowden: they’re no longer trusted. The world now knows that
US telcos give the NSA access to the Internet backbone and that US cloud providers
give it access to user accounts. The world now knows that the NSA intercepts US-sold
computer equipment in transit and surreptitiously installs monitoring hardware. The
world knows that a secret court compels US companies to make themselves available
for NSA eavesdropping, and then orders them to lie about it in public. Remember the
Lavabit story from Chapter 5?
All of this mistrust was exacerbated by the Obama administration’s repeated reassurances
that only non-Americans were the focus of most of the NSA’s efforts. More than half
of the revenue of many cloud companies comes from outside the US. Facebook’s Mark
Zuckerberg said it best in a 2013 interview: “The government response was, ‘Oh don’t
worry, we’re not spying on any Americans.’ Oh, wonderful: that’s really helpful to
companies trying to serve people around the world, and that’s really going to inspire
confidence in American internet companies.”
To be fair, we don’t know how much of this backlash is a temporary blip because NSA
surveillance was in the news, and how much of it will be permanent. We know that several
countries—Germany is the big one—are trying to build a domestic cloud infrastructure
to keep their national data out of the NSA’s hands. German courts have recently ruled
against data
collection practices by Google, Facebook, and Apple, and the German government is
considering banning all US companies that cooperate with the NSA. Data privacy is
shaping up to be the new public safety requirement for international commerce.
It’s also a new contractual requirement. Increasingly, large US companies are requiring
their IT vendors to sign contracts warranting that there are no backdoors in their
IT systems. More specifically, the contractual language requires the vendors to warrant
that there is nothing that would allow a third party to access their corporate data.
This makes it harder for IT companies to cooperate with the NSA or with any other
government agency, because it exposes them to direct contractual liability to their
biggest and most sophisticated customers. And to the extent they cannot sign such
a guarantee, they’re going to lose business to companies who can.
We also don’t know what sort of increase to expect in competitive products and services
from other countries around the world. Many firms in Europe, Asia, and South America
are stepping in to take advantage of this new wariness. If the 1990s crypto wars are
any guide, hundreds of non-US companies are going to provide IT products that are
beyond the reach of US law: software products, cloud services, social networking sites,
networking equipment, everything. Regardless of whether these new products are actually
more secure—other countries are probably building backdoors in the products they can
control—or even beyond the reach of the NSA, the cost of NSA surveillance to American
business will be huge.