The Internet Is Not the Answer

While Google wasn’t officially represented in the Augmented Reality Pavilion, there were plenty of early adopters wandering around the Venetian’s fake piazzas and canals wearing demonstration models of Google Glass, Google’s networked electronic eyeglasses. Michael Chertoff, the former US secretary of homeland security, described these glasses, which have been designed to take both continuous video and photos of everything they see, as inaugurating an age of “ubiquitous surveillance.”23

Chertoff is far from alone in being creeped out by Google Glass. Several San Francisco bars have banned Google Glass wearers—known locally as “Glassholes”—from entry. The US Congress has already launched an inquiry into their impact on privacy. And in June 2013, privacy and data officials from seven countries, including Canada, Australia, Mexico, and Switzerland, sent Google CEO Larry Page a letter expressing their discomfort about the impact on privacy of these glasses. Like Chertoff’s, their countries’ fears were of a “ubiquitous surveillance”—a world in which everyone is being watched all the time by devices that collect massive amounts of our most personal health, location, and financial data.24

But it wasn’t only wearables that were on show at CES. To borrow the corporate language of Intel CEO Krzanich, it was the “broad ecosystem of life” that was being networked by all these new electronic devices spewing out those zettabytes of data that, according to Patrick Tucker, are now making anonymity impossible.25

The Internet of Things had arrived in Las Vegas. Quite literally, everything at CES was becoming networked; everything was being reinvented as a smart, connected device. There were smart ovens, smart clothing, smart thermostats, smart air conditioners, smart lighting systems, and smartphones, of course, all designed to capture data and distribute it on the network. One part of the show was dedicated to smart televisions—devices much more intelligent than most TV shows themselves. Indeed, South Korean electronics giant LG’s connected televisions are so intelligent that they are already logging our viewing habits in order to serve us up targeted ads.26

Another part of CES was dedicated to the connected car—automobiles that are so all-seeing they know our speed, our location, and whether or not we are wearing our seat belt. According to the consultancy Booz, the market for connected cars is about to explode, with demand expected to quadruple between 2015 and 2020 and generate revenues of $113 billion by 2020.27

But even today’s connected car is a data machine, with the onboard cameras of Mercedes-Benz’s new S-Class saloon already generating 300 gigabytes of data per hour about the car’s location and speed and the driver’s habits.28

And then there’s Google’s driverless car, an artificially intelligent, networked car driven by software called Google Chauffeur. The idea of driverless cars might sound as science-fictional as that of augmented reality glasses—but Nevada and Florida have already passed laws permitting their operation, and the sight of trial versions of Google’s automated cars driving themselves up and down Route 101 between San Jose and San Francisco is no longer uncommon. While driverless cars undoubtedly have enormous potential benefits, particularly in terms of safety and convenience, not to mention the environmental benefits of much lighter and thus more energy-efficient vehicles, Google’s pioneering role in them is deeply problematic. The software that powers these cars, Google Chauffeur, is essentially the automotive version of Google Glass: a “free” product designed to track everywhere we go and to feed all that data back to the main Google database so that it can connect the dots of our lives. As the Wall Street Journal columnist Holman Jenkins notes about these so-called autonomous vehicles, “they won’t be autonomous at all,” and they may “pose a bigger threat to privacy than the NSA ever will.”29

After all, if Google links the data collected from its driverless cars with the data amassed by the rest of its ubiquitous products and platforms—such as the smartphone it is developing that uses 3-D sensors to automatically map our physical surroundings so that Google always knows where we are30—then you have a surveillance architecture that exceeds anything Erich Mielke, in his wildest imagination, ever dreamed up.

Tim Berners-Lee invented the Web in order to help him remember his colleagues at CERN. “The Web is more a social creation than a technical one,” he explains. “I designed it for a social effect—to help people work together—and not as a technical toy. The ultimate goal of the Web is to support and improve our weblike existence in the world. We clump into families, associations, and companies. We develop trust across the miles and distrust around the corner.”31

But when Berners-Lee invented the Web in 1989, he never imagined that this “social creation” could be used so repressively, both by private companies and governments. It was George Orwell who, in 1984, invented the term “Big Brother” to describe secret policemen like Erich Mielke. And as the Internet of Things transforms every object into a connected device—50 billion of them by 2020 if we are to believe Patrik Cerwall’s researchers at Ericsson, with five and a half zettabytes of data being produced by 2015—more and more observers are worrying that the twentieth-century Big Brother is back in a twenty-first-century networked guise, dressed in a broad ecosystem of wearables. They fear a world resembling that exhibition at the Venetian, in which row after row of nameless, faceless data gatherers wearing all-seeing electronic glasses watch our every move.

Big Brother seemed ubiquitous at the Venetian. Reporting on CES, the Guardian’s Dan Gillmor warned that networked televisions that “watch us” are “closing in on Orwell’s nightmarish Big Brother vision.”32 Even industry executives are fearful of the Internet of Things’s impact on privacy, with Martin Winterkorn, the CEO of Volkswagen, warning in March 2014 that the connected car of the future “must not become a data monster.”33

But there is one fundamental difference between the Internet of Things and Erich Mielke’s twentieth-century Big Brother surveillance state, one thing distinguishing today’s networked society from Orwell’s 1984. Mielke wanted to create crystal man against our will; in today’s world of Google Glass and Facebook updates, however, we are choosing to live in a crystal republic where our networked cars, cell phones, refrigerators, and televisions watch us.

The Panopticon

“On Tuesday I woke up to find myself on page 3 of the Daily Mail,” wrote a young Englishwoman named Sophie Gadd in December 2013. “That may be one of the worst ways to start the day, after falling out of bed or realizing you’ve run out of milk. My appearance was not the result of taking my clothes off, but the consequence of a ‘Twitter Storm.’”34

A final-year history and politics undergraduate at the University of York, Gadd had inadvertently become part of a Twitter storm when, while on vacation in Berlin, she tweeted a painting of the eighteenth-century Russian czarina Catherine the Great from the Deutsches Historisches Museum. In her tweet, Gadd suggested that the face in the painting, completed in 1794 by the portrait painter Johann Baptist Lampi, bore an uncanny resemblance to that of the British prime minister David Cameron.

“Within hours,” Gadd explains, “it had been retweeted thousands of times,” with the tweet eventually becoming a major news story in both the Daily Mail and the Daily Telegraph. “This experience has certainly taught me a few things about viral social media,” Gadd says, including the observations—which have already been made by many other critics, including Dave Eggers in The Circle, his 2013 fictional satire of data factories like Google and Facebook—that “the Internet is very cynical” and “nothing is private.”35

Gadd’s experience was actually extremely mild. Unlike other innocents caught up in an all-too-public tweet storm, she didn’t lose her job, have her reputation destroyed by a vengeful online mob, or end up in jail. The same month that Sophie Gadd woke up to find herself on page 3 of the Daily Mail, for example, a PR executive named Justine Sacco tweeted: “Going to Africa. Hope I don’t get AIDS. Just Kidding. I’m white!” Sacco published it as she was about to board a twelve-hour flight from London to Cape Town. By the time she arrived in South Africa, her tweet had been retweeted only three thousand times, but she had become such a source of global news that paparazzi were there to snap her image as she stumbled innocently off her plane. Labeled the Internet’s public enemy number one for her stupid tweet, Sacco lost her job and was even called a “f****** idiot” by her own father.36 Sacco will now forever be associated with this insensitive but hardly criminal tweet. Such is the nature and power of the Internet.

“When you only have a small number of followers, Twitter can feel like an intimate group of pub friends,” Sophie Gadd notes about a social Web that is both unforgetting and unforgiving. “But it’s not. It’s no more private than shouting your conversations through a megaphone in the high street.”37

The dangers of the crystal republic predate George Orwell’s 1984 and twentieth-century totalitarianism. They go back to the enlightened despotism of Catherine II of Russia, the subject of Johann Baptist Lampi’s portrait hanging in Berlin’s Deutsches Historisches Museum, the David Cameron look-alike painting that had landed Sophie Gadd on page 3 of the Daily Mail.

The Italian-born Lampi hadn’t been the only late-eighteenth-century European to go to Russia to enjoy Catherine the Great’s largesse. Two English brothers, Samuel and Jeremy Bentham, also spent time there, gainfully employed by Catherine’s autocratic regime. Samuel worked for Count Grigory Potemkin, one of Catherine’s many lovers, whose name has been immortalized by the “Potemkin villages” of fake industrialization he built to impress her. Potemkin gave Bentham the job of managing Krichev, his hundred-square-mile estate on the Polish border that boasted fourteen thousand male serfs.38 And it was here that Samuel and his brother Jeremy—who joined him at Krichev in 1786 and is best known today as the father of the “greatest happiness” principle—invented the idea of what they called the “Panopticon,” or the “Inspection House.”

While Jeremy Bentham—who happened to have graduated from the same Oxford college as Tim Berners-Lee—is now considered the author of the Panopticon, he credits his brother Samuel with its invention. “Morals reformed—health preserved—industry invigorated—instruction diffused—public burthens lightened—Economy seated, as it were, upon a rock—the Gordian knot of the poor law not cut, but untied—all by a simple idea in Architecture!” Jeremy Bentham wrote triumphantly in a letter from Krichev to describe this new idea.

What Jeremy Bentham called a “simple idea in Architecture” reflected his brother’s interest in disciplining the serfs on Potemkin’s Krichev estate. Borrowing from the Greek myth of Panoptes, a giant with a hundred eyes, the Panopticon—intended to house a large institution like a prison, a school, or a hospital—was a circular structure designed to allow a single watchman to observe everyone in the building. This threat of being watched, Jeremy Bentham believed, represented “a new mode of obtaining power of mind over mind.” The Panopticon was a “vividly imaginative” fusion of architectural form with social purpose, the architectural historian Robin Evans explains. And that purpose was discipline. The more we believed we were being watched, Jeremy and Samuel Bentham reasoned, the harder we would work and the fewer rules we would break. Michel Foucault thus described the Panopticon as a “cruel, ingenious cage.” It was “a microcosm of Benthamite society,” according to one historian, and “an existential realization of Philosophical Radicalism,” according to another.39

As the founder of Philosophical Radicalism, a philosophical school better known today as utilitarianism, Jeremy Bentham saw human beings as calculating machines driven by measurable pleasure and pain. Society could best be managed, Bentham believed, by aggregating all these pleasures and pains in order to determine the greatest collective happiness. In the words of the British legal philosopher H. L. A. Hart, Bentham was a “cost-benefit expert on the grand scale.”40 And the nineteenth-century Scottish thinker Thomas Carlyle criticized Bentham as a philosopher focused on “counting up and estimating men’s motives.” Half a century before his compatriot Charles Babbage designed the first programmable computer, Bentham was already thinking about human beings as calculating machines. And the Panopticon—which he spent much of his life futilely trying to build—was a “simple idea in Architecture” that enabled everything and everyone to be watched and measured.
