
It may seem as if I am contradicting myself. On one hand, I am advocating for individual
privacy over ubiquitous surveillance. On the other, I am advocating for
government and corporate transparency over institutional secrecy. The reason I say
yes to both lies in the existing power imbalance between people and institutions.
Institutions naturally wield more power than people. Institutional secrecy increases
institutional power, and that power differential grows. That’s inherently bad for
personal liberty. Individual privacy increases individual power, thereby reducing
that power differential. That’s good for liberty. It’s exactly the same with transparency
and surveillance. Institutional transparency reduces the power imbalance, and that’s
good. Institutional surveillance of individuals increases the power imbalance, and
that’s bad.

Transparency doesn’t come easily. The powerful do not like to be watched. For example,
the police are increasingly averse to being monitored. All over the US, police harass
and prosecute people who videotape them, and some jurisdictions have ruled it illegal.
Cops in Chicago have deliberately obscured cameras, apparently attempting to conceal
their own behavior. The San Diego Police Department denies all requests for police
videos, claiming that they’re part of ongoing investigations. During the 2014 protests
in Ferguson, Missouri, after the police killed an unarmed black man, police routinely
prevented protesters from recording them, and several reporters were arrested for
documenting events. Los Angeles police even went so far as to sabotage court-mandated
voice recorders in their patrol cars.

Governments and corporations routinely resist transparency laws of all kinds. But
the world of secrecy is changing. Privacy-law scholar Peter Swire writes about a declining
half-life of secrets. What he observed is that, in general, secrets get exposed sooner
than they used to. Technology is making secrets harder to keep, and the nature of
the Internet makes secrets much harder to keep long-term. The push of a “send” button
can deliver gigabytes across the Internet in an instant, and each year a single thumb
drive can hold more data than the last. Both governments and organizations need to assume that their
secrets are more likely to be exposed, and sooner, than ever before.
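
Swire’s metaphor can be made concrete with a back-of-the-envelope model (the numbers here are illustrative, not his): if secrets decay with a half-life of h years, the chance that a given secret survives t years is

    P(t) = 2^{-t/h}

With a 25-year half-life, a secret has roughly a 76 percent chance of surviving a decade (2^{-10/25} ≈ 0.76); shrink the half-life to 5 years and that chance drops to 25 percent (2^{-10/5} = 0.25).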

One of the effects of a shrinking half-life for secrets is that their disclosure is
more damaging. One of Snowden’s documents indicated that the NSA spied on the cell
phone of German chancellor Angela Merkel. The document is undated, but it’s obviously
from the last few years. If that
document had become public 20 years from now, the reaction in Germany would have been
very different from the public uproar that occurred in 2013, when Merkel was still
in office and the spying was current news rather than history.

Cultural changes are also making secrets harder to keep. In the old days, guarding
institutional secrets was part of a lifelong culture. The intelligence community would
recruit people early in their careers and give them jobs for life. It was a private
men’s club, one filled with code words and secret knowledge. The corporate world,
too, was filled with lifers. Those days are gone. Many jobs in intelligence are now
outsourced, and there is no job-for-life culture in the corporate world anymore. Workforces
are flexible, jobs are outsourced, and people are expendable. Moving from employer
to employer is now the norm. This means that secrets are shared with more people,
and those people care less about them. Recall that five million people in the US have
a security clearance, and that a majority of them are contractors rather than government
employees.

There is also a greater belief in the value of openness, especially among younger
people. Younger people are much more comfortable with sharing personal information
than their elders. They believe that information wants to be free, and that security
comes from public knowledge and debate. They have said very personal things online,
and have had embarrassing photographs of themselves posted on social networking sites.
They have been dumped by lovers in public online forums. They have overshared in the
most compromising ways—and they survived intact. It is a tougher sell convincing this
crowd that government secrecy trumps the public’s right to know.

These technological and social trends are a good thing. Whenever possible, we should
strive for transparency.

OVERSIGHT AND ACCOUNTABILITY

In order for most societies to function, people must give others power over themselves.
Ceding power is an inherently risky thing to do, and over the millennia we have developed
a framework for protecting ourselves even as we do this: transparency, oversight,
and accountability. If we know how people are using the power we give them, if we
can assure ourselves that
they’re not abusing it, and if we can punish them if they do, then we can more safely
entrust them with power. This is the fundamental social contract of a democracy.

There are two levels of oversight. The first is strategic: are the rules we’re imposing
the correct ones? For example, the NSA can implement its own procedures to ensure
that it’s following the rules, but it should not get to decide what rules it should
follow. This was nicely explained by former NSA director Michael Hayden: “Give me
the box you will allow me to operate in. I’m going to play to the very edges of that
box. . . . You, the American people, through your elected representatives, give me
the field of play and I will play very aggressively in it.” In one sense he’s right;
it’s not his job to make the rules. But in another he’s wrong, and I’ll talk about
that in Chapter 13.

In either case, we need to get much better at strategic oversight. We need more open
debate about what limitations should be placed on government and government surveillance.
We need legislatures conducting meaningful oversight and developing forward-looking
responses. We also need open, independent courts enforcing laws rather than rubber-stamping
agency practices, routine reporting of government actions, a vibrant public-interest press
and watchdog groups analyzing and debating the actions of those who wield power, and—yes—a
legal framework for whistleblowing. And a public that cares. I’ll talk about that
in Chapter 13, too.

The other kind of oversight is tactical: are the rules being followed? Mechanisms
for this kind of oversight include procedures, audits, approvals, troubleshooting
protocols, and so on. The NSA, for example, trains its analysts in the regulations
governing their work, audits systems to ensure that those regulations are actually
followed, and has instituted reporting and disciplinary procedures for occasions when
they’re not.

Different organizations provide tactical oversight of one another. The warrant process
is an example of this. Sure, we could trust police forces to conduct searches only
when they’re supposed to, but instead we require them to bring their requests before
a neutral third party—a judge—who ensures they’re following the rules before issuing
a court order.

The key to robust oversight is independence. This is why we’re always suspicious of
internal reviews, even when they are conducted by a vigorous advocate. This was a
key limitation of the chief privacy officer position at the DHS. Consider Mary Ellen
Callahan, who held that job from 2009 to 2012. She was
a great advocate for privacy, and recommended that the agency cancel several programs
because of privacy concerns. But she reported to Janet Napolitano, the DHS secretary,
and all she could do was make suggestions. If Callahan had been outside of DHS, she
would have had more formal regulatory powers. Oversight is much better conducted by
well-staffed and knowledgeable outside evaluators.

You can think of the difference between tactical and strategic oversight as the difference
between doing things right and doing the right things. Both are required.

Neither kind of oversight works without accountability. Those entrusted with power
can’t be free to abuse it with impunity; there must be penalties for abuse. Oversight
without accountability means that nothing changes, as we’ve learned again and again.
Or, as risk analyst Nassim Taleb points out, organizations are less likely to abuse
their power when people have skin in the game.

It’s easy to say “transparency, oversight, and accountability,” but much harder to
make those principles work in practice. Still, we have to try—and I’ll get to how
to do that in the next chapter. These three things give us the confidence to trust
powerful institutions. If we’re going to give them power over us, we need reassurance
that they will act in our interests and not abuse that power.

RESILIENT DESIGN

Designing for resilience is an important, almost philosophical, principle in systems
architecture. Technological solutions are often presumed to be perfect. Yet, as we
all know, perfection is impossible—people, organizations, and systems are inherently
flawed. From government agencies to large multinational corporations, all organizations
suffer imperfections.

These imperfections aren’t just the result of bad actors inside otherwise good systems.
Imperfections can be mundane, run-of-the-mill bureaucratic drift. One form of imperfection
is mission creep. Another comes from people inside organizations focusing on the narrow
needs of that organization and not on the broader implications of their actions. Imperfections
also come from social change: changes in our values over
time. Advancing technology adds new perturbations into existing systems, creating
instabilities.

If systemic imperfections are inevitable, we have to accept them—in laws, in government
institutions, in corporations, in individuals, in society. We have to design systems
that expect them and can work despite them. If something is going to fail or break,
we need it to fail in a predictable way. That’s resilience.

In systems design, resilience comes from a combination of elements: fault-tolerance,
mitigation, redundancy, adaptability, recoverability, and survivability. It’s what
we need in the complex and ever-changing threat landscape I’ve described in this book.
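
To make those elements concrete, here is a minimal sketch in Python of two of them, redundancy and predictable failure. Everything in it is hypothetical: the replica names, the randomized fault injection, and the fallback behavior are stand-ins, not a description of any real system.

    # A minimal sketch of redundancy with predictable, recoverable failure.
    import random

    REPLICAS = ["replica-a", "replica-b", "replica-c"]  # hypothetical endpoints

    class AllReplicasFailed(Exception):
        """Every replica was tried; the system fails in one well-defined way."""

    def fetch_from_replica(name):
        # Stand-in for a real network call; it fails at random to simulate
        # the inevitable imperfections described above.
        if random.random() < 0.5:
            raise ConnectionError(name + " unavailable")
        return "data from " + name

    def resilient_fetch():
        errors = []
        for name in REPLICAS:  # redundancy: never depend on a single component
            try:
                return fetch_from_replica(name)
            except ConnectionError as exc:
                errors.append(exc)  # record the fault and keep going
        raise AllReplicasFailed(errors)  # one predictable failure mode

    try:
        print(resilient_fetch())
    except AllReplicasFailed:
        print("degraded mode: serving stale cached data")  # recoverability

The point is not the code itself but the stance it encodes: the design assumes components will fail and specifies exactly what happens when they do.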

I am advocating for several flavors of resilience for both our systems of surveillance
and our systems that control surveillance: resilience to hardware and software failure,
resilience to technological innovation, resilience to political change, and resilience
to coercion. An architecture of security provides resilience to changing political
whims that might legitimize political surveillance. Multiple overlapping authorities
provide resilience to coercive pressures. Properly written laws provide resilience
to changing technological capabilities. Liberty provides resilience to authoritarianism.
Of course, full resilience against any of these things, let alone all of them, is
impossible. But we must do as well as we can, even to the point of assuming imperfections
in our resilience.

ONE WORLD, ONE NETWORK, ONE ANSWER

Much of the current surveillance debate in the US is over the NSA’s authority, and
whether limiting the NSA somehow empowers others. That’s the wrong debate. We don’t
get to choose a world in which the Chinese, Russians, and Israelis will stop spying
if the NSA does. What we have to decide is whether we want to develop an information
infrastructure that is vulnerable to all attackers, or one that is secure for all
users.

Since its formation in 1952, the NSA has been entrusted with dual missions. First,
signals intelligence, or SIGINT, involved intercepting the communications of America’s
enemies. Second, communications security, or COMSEC, involved protecting American
military—and some
government—communications from interception. It made sense to combine these two missions,
because knowledge about how to eavesdrop is necessary to protect yourself from eavesdropping,
and vice versa.

The two missions were complementary because different countries used different communications
systems, and military personnel and civilians used different ones as well. But as
I described in Chapter 5, that world is gone. Today, the NSA’s two missions are in
conflict.

Laws might determine what methods of surveillance are legal, but technologies determine
which are possible. When we consider what security technologies we should implement,
we can’t just look at our own countries. We have to look at the world.

We cannot weaken the enemy’s networks while still protecting our own.
The same vulnerabilities used by intelligence agencies to spy on each other are used
by criminals to steal your financial passwords. Because we all use the same products,
technologies, protocols, and standards, we either make it easier for everyone to spy
on everyone, or harder for anyone to spy on anyone. It’s liberty versus control, and
we all rise and fall together. Jack Goldsmith, a Harvard law professor and former
assistant attorney general under George W. Bush, wrote, “every offensive weapon is
a (potential) chink in our defense—and vice versa.”

For example, the US CALEA law requires telephone switches to enable eavesdropping.
We might be okay with giving police in the US that capability, because we generally
trust the judicial warrant process and assume that the police won’t abuse their authority.
But those telephone switches are sold worldwide—remember the story about cell phone
wiretapping in Greece in Chapter 11—with that same technical eavesdropping capability.
It’s our choice: either everyone gets that capability, or no one does.
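
A toy sketch shows why. This is not how CALEA-mandated switches actually work; the XOR cipher, the escrow field, and the single master key below are invented for illustration. But it captures the structural point: an intercept capability built into a product answers to whoever holds the key, wherever the product is sold.

    # Toy illustration of a built-in intercept capability (hypothetical design;
    # XOR stands in for a real cipher).
    import os

    MASTER_KEY = b"\x13" * 16  # identical in every unit shipped worldwide

    def xor(data, key):
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def transmit(message):
        session_key = os.urandom(16)
        ciphertext = xor(message, session_key)
        # The intercept feature: the session key, escrowed under the master
        # key, rides along with every message.
        escrow = xor(session_key, MASTER_KEY)
        return ciphertext, escrow

    # Whoever holds MASTER_KEY -- a domestic police force, a foreign agency,
    # or a criminal who extracted it from one unit -- can read all traffic:
    ciphertext, escrow = transmit(b"meet at noon")
    session_key = xor(escrow, MASTER_KEY)
    print(xor(ciphertext, session_key))  # b'meet at noon'

Nothing in the design distinguishes an authorized key-holder from an unauthorized one; the capability is simply there for whoever has the key.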

It’s the same with IMSI-catchers that intercept cell phone calls and metadata. StingRay
might have been the FBI’s secret, but the technology isn’t secret anymore. There are
dozens of these devices scattered around Washington, DC, and the rest of the country
run by who-knows-what government or organization. Criminal uses are next. By ensuring
that the cell phone network remains vulnerable to these devices so that we can use
them to solve crimes, we necessarily allow foreign governments and criminals to use
them against us.
