
The rabbi also banned the reading of other people's private mail.

A millennium after Rabbeinu Gershom, other people's business was sheathed in fiber-optic cable. Ancient human urges to snoop had lost none of their voltage, but the prohibitions and social inhibitions were dissolving. In a virtual instant, forward-thinking businesses data-mined, data-scraped, data-aggregated. As these became exalted missions, digital culture erased social and legal boundaries that had been honored, however imperfectly, for centuries. Commercial surveillance was built into the ecology of everyday life.

Like nothing else since humans first stood upright, the World Wide Web has allowed people to connect and learn and speak out. Its dominant architecture has also made it possible for businesses and governments to fill giant data vaults with the ore of human existence—the appetites, interests, longings, hopes, vanities, and histories of people surfing the Internet, often unaware that every one of their clicks is being logged, bundled, sorted, and sold to marketers. Together, they amount to nothing less than a full-psyche scan, unobstructed by law or social mores.


“Facebook holds and controls more data about the daily lives and social interactions of half a billion people than 20th-century totalitarian governments ever managed to collect about the people they surveilled,” Eben Moglen, a technologist, historian, and professor of law at Columbia University, observed in 2010.

That shirtless, drunken picture? The angry note to a lover, the careless words about a friend, the so-called private messages? Facebook has them all. And its marketers know when it is time to pop an advertisement for a florist who can deliver make-up flowers, or for a private investigator.

Students at MIT figured out the identities of closeted gay people by studying what seemed like neutral information disclosed on Facebook. At the headquarters of Facebook in Palo Alto, Mark Zuckerberg was said to amuse himself by predicting which Facebook “friends” would eventually hook up, watching how often someone checked a friend's profile and the kinds of messages they were sending.

The uproar over Facebook's privacy policies obscured intrusions on an even grander scale by other powerful forces in society. Everyone had heard of AOL and Microsoft; few were familiar with their subsidiaries. Atlas Solutions, purchased by Microsoft in 2007, told businesses that it deployed “tracking pixels”—a kind of spy cookie that cannot be detected by most filters—to follow Internet users as they looked at billions of web pages per month. These invisible bugs watch as we move across the web, shopping, reading, writing. Our habits are recorded. The pixels live in computer caches for months, waiting to be pulsed. Facebook bought Atlas in 2013, helping it track users when they left the site.
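
The mechanics are simpler than the name suggests. A tracking pixel is typically just a server script that logs each request and answers with an invisible one-by-one image. A minimal sketch in PHP, with hypothetical file names and paths, an illustration of the general technique rather than Atlas's actual code:

    <?php
    // pixel.php: log who requested the pixel, when, from which page,
    // and with which browser. (Hypothetical names; an illustration only.)
    $entry = sprintf(
        "%s\t%s\t%s\t%s\n",
        date('c'),
        $_SERVER['REMOTE_ADDR'],
        isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '-',
        isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '-'
    );
    file_put_contents('/var/log/pixel.log', $entry, FILE_APPEND);

    // Answer with a transparent 1x1 GIF so nothing visible appears.
    header('Content-Type: image/gif');
    echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');

A website embeds it with an image tag one pixel square, such as <img src="http://tracker.example.com/pixel.php" width="1" height="1">, and every visitor to that page is logged by a third party the visitor never chose to visit.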

And virtually unknown to users, AOL's biggest business was never the cheery voice announcing, “You've got mail”; it was the billions of data items its subsidiary, Platform A, mined from Internet users, linking their interests and purchases, zip codes and significant others. The data was stored on servers physically located in giant warehouses near Manassas, Virginia. AOL boasted that it followed “consumer behavior across thousands of websites.”

Facebook was a proxy in a still larger struggle for control over what used to be the marrow of human identity: what we reveal and what we conceal, what we read and what we want. Just as human tissue is inhabited by trillions of bacteria, so, too, our online life is heavily colonized by external forces, invisible bits of code that silently log our desires and interests, and, at times, manipulate them.

An experiment conducted in 2010 by the Wall Street Journal showed how far commercial interests could penetrate personal information, unbeknownst to web users. As part of a remarkable series called “What They Know,” the Journal team set up a “clean” computer with a browser that had not previously been used for surfing. The results: after visiting the fifty most popular websites in the United States, the reporters found that 131 different advertising companies had planted 2,224 tiny files on the computer. These files, essentially invisible, kept track of sites the users visited. This allowed the companies to develop profiles that included age, gender, race, zip code, income, marital status, health worries, purchases, favorite TV shows, and movies. Deleting the tracking files did not eliminate them: some of them simply respawned.

Handling all the data they collected was possible because computing power continued to double every eighteen months to two years, the rate predicted in 1965 by the technologist Gordon Moore. Cheap and prolific by 2010, that power enabled the creation of bare-bones start-ups and the granular monitoring of personal habits. “We can uniquely see the full and complete path that your customers take to conversion, from seeing an ad on TV to searching on their smartphone to clicking on a display ad on their laptop,” a business called Converto boasted on its website in 2013.

The online advertising industry argued that the ability to tailor ads that appeared on a screen to the presumed appetites of the person using the computer was the foundation of the free Internet: the advertising dollars supported sites that otherwise would have no sources of revenue.

Whatever the merits of that argument, it was hard to defend the stealthiness of the commercial surveillance. No national law required that this monitoring be disclosed, much less forbade it. A few halfhearted efforts by the Federal Trade Commission to regulate the monitoring went nowhere. There was no way for people to get their data back.

Or even their thoughts.

In mid-2013, two researchers published a paper entitled “Self-Censorship on Facebook,” reporting that of 3.9 million users studied, 71 percent had at least once started to write something and then declined to post it. That is, they changed their minds. While this might look like prudence, or discretion, or editing, the researchers—both working at Facebook—described it as “self-censorship” and wrote that such behavior was a matter of concern to social network sites. When this happens, they wrote, Facebook “loses value from the lack of content generation.”

The company maintained that users were told it collected not only information they openly shared but also moments when they “view or otherwise interact with things.” That, the company asserted, meant the right to collect the unpublished content itself. “Facebook considers your thoughtful discretion about what to post as bad, because it withholds value from Facebook and from other users. Facebook monitors those unposted thoughts to better understand them, in order to build a system that minimizes this deliberate behavior,” Jennifer Golbeck, the director of the Human-Computer Interaction Lab at the University of Maryland, wrote in Slate.

At every instant, the fluid dynamics of the web—the interactions, the observations, the predations—are logged by servers. That such repositories of the lightning streams of human consciousness existed was scarcely known and little understood. “Almost everyone on Planet Earth has never read a web server log,” Eben Moglen, the Columbia law professor, said. “This is a great failing in our social education about technology. It's equivalent to not showing children what happens if cars collide and people don't wear seat belts.”
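
What a web server log contains is mundane and relentless. A single line in the common “combined” format, with hypothetical values, looks like this:

    203.0.113.7 - - [19/Feb/2010:21:04:12 -0500] "GET /profile/alice HTTP/1.1" 200 5123 "http://www.example.com/search?q=flowers" "Mozilla/5.0 (Windows NT 6.1)"

One line per request: who asked, when, for what, arriving from which page, using which browser. Multiplied by billions of requests a day, those lines become the repositories Moglen was describing.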

—

One day in the 1970s, a man named Douglas Engelbart was walking along a beachfront boardwalk in California when he spotted a group of skateboarders doing tricks.

“You see these kids skateboarding actually jumping into the air and the skateboard stays attached to their feet, and they spin around, land on the skateboard, and keep going,” Engelbart remembered many years later.

For Engelbart, those skateboarders were a way to understand the unpredictability of technology. “I made them stop and do it for me six times, so I could see how they did it. It's very complicated—shifting weight on their feet, and so on. You couldn't give them the engineering and tell them to go out and do that. Fifteen years ago, who could have designed that? And that's all we can say about computers.”

A little-celebrated figure in modern history, Engelbart had spent decades thinking about how computers could be linked together to augment human intelligence, to solve problems of increasing complexity. At a gathering of technologists in 1968, he showed what could happen when computers talked to one another. For the occasion, he demonstrated a device that made it easier for humans to interact with the computer. The device was called the mouse. The cofounder of Apple, Steve Wozniak, said that Engelbart should be credited “for everything we have in the way computers work today.”

The personal computer and the Internet, with their vast democratizing power, were part of Engelbart's vision. He died in July 2013, a few weeks after revelations by a man named Edward Snowden that the United States National Security Agency was collecting spectacular amounts of data. Writing in the Atlantic, Alexis C. Madrigal noted: “We find ourselves a part of a ‘war on terror' that is being perpetually, secretly fought across the very network that Engelbart sought to build. Every interaction we have with an Internet service generates a ‘business record' that can be seized by the NSA through a secretive process that does not require a warrant or an adversarial legal proceeding.”

The business purposes of such data collection are apparent, if unsettling. But what need did governments have for it? Among Western democracies, the stated purpose was piecing together suggestive patterns that might reveal extremists plotting attacks like those carried out on September 11, 2001. The dystopic possibilities of such powers had, of course, been anticipated by George Orwell in 1984, and by the visionary cyberpunk novelist William Gibson in Neuromancer. But fiction was not necessary to see what could be done: in 2010, the network's utility as an instrument of surveillance and suppression had already been realized in, among other places, Syria, Tunisia, Iran, and China.

Its other properties had been demonstrated, too: as the Diaspora guys were making plans for their project in 2010, the Arab Spring was stirring to life, some of it in subversive online communications that were either not noticed or not taken seriously by the regimes that would soon be toppled. The same mechanisms allowed more alert regimes to surveil opponents, or to be led directly to the hiding places of dissidents whose phones, unnoticed in their pockets, beamed out their locations.

By 2010, in just the two years since Raphael Sofaer had entered college, Facebook had grown by 300 million users, almost five new accounts every second of the day. The ravenous hunger for new ways to connect in a sprawling world was not invented by Facebook, but the company was perfectly positioned to meet it, thanks to skill, luck, and the iron will of its young founder, Mark Zuckerberg. A manager in Facebook's growth department, Andy Johns, described going to lunch for the first time with his new boss, one of Zuckerberg's lieutenants.

“I remember asking him, ‘So what kind of users am I going after? Any particular demographics or regions? Does it matter?' and he sternly responded ‘It's fucking land-grab time, so get all of the fucking land you can get.'”

Could four young would-be hackers build an alternative that preserved the rich layers of connection in social networking without collecting the tolls assessed by Facebook? Would anyone support their cause?

When word got out about their project, they were swamped.

In a matter of days, they received donations from thousands of people in eighteen countries; tens of thousands more started to follow their progress on Twitter, and in time, a half million people signed up to get an invitation. That was more weight than the four were ready to carry. On the night that their fund-raising drive exploded, as money was pouring in through online pledges, nineteen-year-old Rafi Sofaer toppled off the even keel where he seemed to live his life. It was too much. They were just trying to build some software. “Make them turn it off!” he implored the others. It couldn't be done.

Four guys hanging around a little club room at NYU suddenly found themselves handed a global commission to rebottle the genie of personal privacy. They were the faces of a movement, a revolution against the settled digital order. Their job was to demonetize the soul.

The problem they set out to solve was hard. That was its attraction. They were young, smart, quick to learn what they did not know, and girded for battle. Suddenly, they had a legion of allies. And expectations. It was delightful, for a while.

PART ONE

To the Barricades

CHAPTER ONE

Sharply turned out in a tailored charcoal suit accented with a wine-red tie, the burly man giving the lecture had enchanted his audience for twenty minutes, one moment summoning John Milton from the literary clouds, the next alluding to the lost continent of Oceania, then wrapping in Bill Gates and Microsoft. Every offhand reference was, in fact, a purposeful stitch in a case for how the entire architecture of the Internet had been warped into a machine for surveilling humans doing what they do: connecting, inquiring, amusing themselves. Then he made the subject concrete.

“It is here,” the speaker said, “of course, that Mr. Zuckerberg enters.”

Seated in an NYU lecture hall in Greenwich Village on a Friday evening, the audience stirred. Most of those attending were not students but members of the Internet Society, the sponsor of the talk. But no one listened more avidly than two NYU students who were seated near the front, Max Salzberg and Ilya Zhitomirskiy.

And they were keen to hear more about “Mr. Zuckerberg.” That, of course, was Mark Zuckerberg, boy billionaire, founder and emperor of Facebook, and a figure already well known to everyone in the crowd that had come to hear a talk by Eben Moglen, a law professor, history scholar, and technologist. The Social Network, a fictionalized feature film about the creation of Facebook, was still eight months away from its premiere. Nevertheless, the name Zuckerberg needed no annotation. And at age twenty-five, he had never gotten an introduction of the sort that Moglen was about to deliver to him in absentia.

“The human race has susceptibility to harm, but Mr. Zuckerberg has attained an unenviable record,” Moglen said. “He has done more harm to the human race than anybody else his age. Because—”

Moglen's talk was being live-streamed, and in an East Village apartment a few blocks away, an NYU senior named Dan Grippi, who had been only half listening, stopped his homework.

“Because,” Moglen continued, “he harnessed Friday night.

“That is, everybody needs to get laid, and he turned it into a structure for degenerating the integrity of human personality.”

Gasps. A wave of laughs. A moment earlier, this had been a sober, if engaging, talk, based on a rigorous analysis of how freedom on the Internet had been trimmed until it bled. As lawyer, hacker, and historian, Moglen possessed a rare combination of visions. He blended an engineer's understanding of the underlying, intricate architecture of the Internet and the evolving web with a historian's panoramic view of how those structures supported, or undermined, democratic principles and a lawyer's grasp of how far the law lagged behind technology. For nearly two decades, Moglen had been the leading consigliere of the free-software movement in the United States, and even if not everyone in the auditorium at New York University was personally acquainted with him, they all knew of him. A master orator, Moglen knew that he had just jolted his audience.

He immediately tacked toward his original thesis, this time bringing along Zuckerberg and Facebook as Exhibit A, saying: “He has, to a remarkable extent, succeeded with a very poor deal.”

Most of the regalia of Facebook, its profile pages and activity streams and so on, were conjured from a commonplace computer language called PHP, which had been created when Mark Zuckerberg was eleven years old. By the time he built the first Facebook in 2004, PHP was already in use on more than 10 million websites; much of the web came to billowing life on computer screens thanks to those same text scripts, always tucked out of sight behind Oz's curtain. Knowing that, Moglen put the terms of the Zuckerberg/Facebook deal with the public in currency that his audience, many of them technologists, could grasp in an instant.

“‘I will give you free web hosting and some PHP doodads, and you get spying for free, all the time.' And it works.”

The audience howled.

“That's the sad part, it works,” Moglen said. “How could that have happened? There was no architectural reason, really.”

As lightning bolts go, this one covered a lot of ground. A mile away in his apartment, Dan listened and thought, what if he's right? The guy who created PHP called it that because he needed some code scripts for his personal home page. Which is sort of what Facebook was: You had a home page that could be played with in certain, limited ways. Post a picture. Comment on a friend's. Read an article that someone liked or hated. Watch a funny cat video. But all these things were possible on any web page, not just Facebook, which was really just a bunch of web pages that were connected to one another. Maybe there was no good technical reason that social networks should be set up the way Facebook was. For Moglen, it was an example—just one, but a globally familiar one—of what had gone wrong with the Internet, a precise instance of what he had been talking about for the first twenty minutes of his speech, in every sentence, even when he seemed to be wandering.
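
Dan's hunch can be made concrete. In spirit, a profile page needs only a few lines of PHP; the names and data below are hypothetical, a sketch of the general idea rather than anything Facebook actually ran:

    <?php
    // profile.php: a "profile" is just a script that pours stored
    // data into an HTML template. (Illustrative values only.)
    $user = array('name' => 'Alice', 'status' => 'Watching cat videos');
    ?>
    <html>
      <body>
        <h1><?php echo htmlspecialchars($user['name']); ?></h1>
        <p><?php echo htmlspecialchars($user['status']); ?></p>
      </body>
    </html>

Nothing in those lines requires one company's servers in the middle; any web server anywhere could run them. That was Moglen's point: the centralization was a business choice, not an architectural necessity.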

Dan was starting to wish he had gone to the talk in person. He knew that Max and Ilya were there. Practically from the moment Moglen had opened his mouth to make what sounded like throw-away opening remarks, they had been galvanized. “I would love to think that the reason that we're all here on a Friday night is that my speeches are so good,” Moglen had said. The audience tittered. In truth, the speeches of this man were widely known not just as good but as flat-out brilliant, seemingly unscripted skeins of history, philosophy, technology, and renaissance rabble-rousing, a voice preaching that one way lay a dystopic digital abyss, but that just over there, in striking distance, was a decent enough utopia.

“I actually have no idea why we're all here on a Friday night,” Moglen continued, “but I'm very grateful for the invitation. I am the person who had no date tonight—so it was particularly convenient that I was invited for now. Everybody knows that. My calendar's on the web.” No need for Moglen to check any other calendars to know that quite a few members of the audience did not have dates, either. His confession was an act of kinship, but it also had a serious edge.

“Our location is on the web,” Moglen said. Cell phones could pinpoint someone's whereabouts. Millions of times a year, the major mobile phone companies were asked for, and handed over, the precise location of people carrying telephones. There was no court order, no oversight, just people with law enforcement ID cards in their pockets.

“Just like that,” he said, getting warmed up.

He was making these points three years before Edward Snowden emerged from the shadows of the National Security Agency to fill in the shapes that Moglen was sketching.

“The deal that you get with the traditional service called ‘telephony' contains a thing you didn't know, like spying. That's not a service to you but it's a service and you get it for free with your service contract for telephony.”

For those who hacked and built in garages or equivalent spaces, Moglen was an unelected, unappointed attorney general, the enforcer of a legal regimen that protected the power of people to adjust the arithmetic that made their machines work.

As the volunteer general counsel to the Free Software Foundation, Moglen was the legal steward for GNU/Linux, an operating system that had been largely built by people who wrote their own code to run their machines. Why pay Bill Gates or Steve Jobs just so you could turn your computer on? For the low, low price of zero, free software could do the trick just as well, and in the view of many, much better. And GNU/Linux was the principal free system, built collaboratively by individuals beginning in the mid-1980s. It began as GNU, a code bank overseen by a driven ascetic, Richard A. Stallman, and found a path into modern civilization when a twenty-one-year-old Finnish computer science student, Linus Torvalds, adopted much of the Stallman code and added a key piece known as the kernel, to create a free operating system. (One of his collaborators called Torvalds's contribution Linux, and as GNU/Linux became the most widespread of the free systems, the whole was routinely shorthanded as Linux, to the dismay of Stallman.) They were joined by legions of big businesses and governments following the hackers down the free-software road. On average, more than nine thousand new lines of code were contributed to Linux every day in 2010, by hundreds of volunteers and by programmers working for businesses like Nokia, Intel, IBM, Novell, and Texas Instruments.

The successor to Bill Gates as CEO of Microsoft, Steve Ballmer, fumed that Linux had, “you know, the characteristics of communism that people love so very, very much about it. That is, it's free.”

It was indeed. As the Free Software Foundation saw things, in principle, the strings of 1s and 0s that make things happen on machines were no more the property of anyone than the sequence of nucleotides that provide the instructions for life in deoxyribonucleic acid, DNA.

Linux was the digital genome for billions of phones, printers, cameras, MP3 players, and televisions. It ran the computers that underpinned Google's empire, was essential to operations at the Pentagon and the New York Stock Exchange, and served as the dominant operating system for computers in Brazil, India, and China. It was in most of the world's supercomputers, and in a large share of the servers. In late 2010, Vladimir Putin ordered that all Russian government agencies stop using Microsoft products and convert their computers to Linux systems by 2015.

Linux had no human face, no alpha dog to bark at the wind; it had no profit-and-loss margins, no stock to track in the exchanges, and thus had no entries on the scorecards kept in the business news sections of the media. It was a phenomenon with few precedents in the modern market economy, a project on which fierce competitors worked together. In using GNU/Linux, they all had to agree to its licensing terms, whose core principles were devised primarily by Stallman, of the Free Software Foundation, in consultation with Moglen and the community of developers.

The word “free” in the term “free software” often threw people off. It referred not to the price but to the ability of users to shape the code, to remake, revise, and pass it along, without the customary copyright limitations of proprietary systems. Think of free software, Stallman often said, not as free as in free beer, but free as in free speech. So the principles of free software were spelled out under the license that people agreed to when they used it: anyone could see it, change it, even sell it, but they could not make it impossible for others to see what they had done, or to make their own subsequent changes. Every incarnation had to be available for anyone else to tinker with. Ballmer of Microsoft called it “a cancer that attaches itself in an intellectual property sense to everything it touches.”

As the chief legal engineer for the movement, who helped to enforce the license and then to revise it, Moglen was the governor of a territory that was meant to be distinctly ungovernable, or at least uncontrollable, by any individual or business.

Having started as a lawyer for the scruffy, Moglen often found himself, as the years went by, in alliances that included powerful corporations and governments that were very pleased to run machines with software that did not come from the laboratories of Microsoft in Redmond, Washington, or of Apple in Cupertino, California. It was not that Moglen or his original long-haired clients had changed or compromised their views: the world simply had moved in their direction, attracted not necessarily by the soaring principles of “free as in free speech,” or even because it was “free as in free beer.” They liked it because it worked. And, yes, also because it was free.

The hackers had led an unarmed, unfunded revolution: to reap its rewards, all that the businesses—and anyone else—had to do was promise to share it. The success of that movement had changed the modern world.

It also filled the lecture hall on a Friday night. Yet Moglen, as he stood in the auditorium that night in February 2010, would not declare victory. It turned out that not only did free software not mean free beer, it didn't necessarily mean freedom, either. In his work, Moglen had shifted his attention to what he saw as the burgeoning threats to the ability of individuals to communicate vigorously and, if they chose, privately.

“I can hardly begin by saying that we won,” Moglen said, “given that spying comes free with everything now. But we haven't lost. We've just really bamboozled ourselves and we're going to have to unbamboozle ourselves really quickly or we're going to bamboozle other innocent people who didn't know that we were throwing away their privacy for them forever.”

His subject was freedom not in computer software but in digital architecture. Taken one step at a time, his argument was not hard to follow.

In the early 1960s, far-flung computers at universities and government research facilities began communicating with one another, a network of peers. No central brain handled all the traffic. Every year, more universities, government agencies, and institutions with the heavy-duty hardware joined the network. A network of networks grew; it would be called the Internet.

The notion that these linked computers could form a vast, open library, pulsing with life from every place on earth, gripped some of the Internet's earliest engineers. That became possible in 1989, when Tim Berners-Lee developed a system of indexing and links, allowing computer users to discover what was available elsewhere on the network. He called it the World Wide Web. By the time the public discovered the web in the mid-1990s, the personal computers that ordinary people used were not full-fledged members of the network; instead, they were adjuncts, or clients, of more centralized computers called servers.

“The software that came to occupy the network was built around a very clear idea that had nothing to do with peers. It was called server-client architecture,” Moglen said.

So for entry to the promise and spoils of the Internet, individual users had to route their inquiries and communications through these central servers. As the servers became more powerful, the equipment on the desktop became less and less important. The servers could provide functions that once had been built into personal computers, like word processing and spreadsheets. With each passing day, the autonomy of the users shrank. They were fully dependent on central servers.
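
In code, the asymmetry is stark. A client in this world does little more than ask; a minimal sketch in PHP, with a hypothetical server name:

    <?php
    // client.php: a minimal sketch of the server-client pattern.
    // The URL is hypothetical; the point is that the client only asks.
    $document = file_get_contents('http://central.example.com/doc/42');
    echo $document;
    // Everything else (the storage, the search, the record of who
    // asked for what) happens at the server, beyond the user's reach.

The request itself becomes the server's property: one more line in the log.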
