Reclaiming Conversation
Sherry Turkle
Each of us who “feeds” the system ends up being shaped by it, but in a very different way than the person caught in the panopticon. We don't so much conform because we fear the consequences of being caught out in deviant behavior; rather, we conform because what is shown to us online is shaped by our past interests. The system presents us with what it believes we will buy or read or vote for. It places us in a particular world that constrains our sense of what is out there and what is possible.
For any query, search engines curate results based on what they know about you, including your location and what kind of computer you are using. So, if you do a search about Ukraine and opposition movements don't come up, this may be because an algorithm has decided that you don't want to see them. This means that you won't learn (at least not then) that they exist. Or, by the logic of the algorithm, you may be presented with only certain political advertisements. You may not learn that a candidate who seems “moderate” in national advertising sends anti-gun control advertising to other people, just not to you.
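The personalization described above can be made concrete with a toy sketch. This is purely illustrative, assuming an invented scoring scheme: real search engines are vastly more complex and their actual ranking signals are not public. The idea is only that a boost tied to a user's inferred interests can push some results out of view.

```python
# Illustrative sketch only: a toy ranker that personalizes search results
# by boosting items that match a user's inferred interests. All names,
# topics, and weights here are invented for the example.

def personalized_rank(results, interest_weights):
    """Order results by base relevance plus a boost for topics the
    system believes this user cares about."""
    def score(result):
        base = result["relevance"]
        boost = interest_weights.get(result["topic"], 0.0)
        return base + boost
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Opposition movements in Ukraine", "topic": "politics", "relevance": 0.9},
    {"title": "Travel guide to Kyiv",            "topic": "travel",   "relevance": 0.7},
]

# A user whose history suggests an interest in travel, not politics:
profile = {"travel": 0.5, "politics": 0.0}
ranked = personalized_rank(results, profile)
# The travel item (0.7 + 0.5 = 1.2) now outranks the political one (0.9),
# so the opposition story slips down the page for this user.
```

Nothing in the results themselves changed; only the profile did. Two people issuing the same query can see two different worlds.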
The web promises to make our world bigger. But as it works now, it also narrows our exposure to ideas. We can end up in a bubble in which we hear only the ideas we already know. Or already like. The philosopher Allan Bloom has suggested the cost: “Freedom of the mind requires not only, or not even especially, the absence of legal constraints but the presence of alternative thoughts. The most successful tyranny is not the one that uses force to assure uniformity, but the one that removes awareness of other possibilities.”
Once you have a glimmer, and you only need a glimmer, of how this works, you have reason to believe that what the web shows you is a reflection of what you have shown it. So, if anti-abortion advertisements appear on your social media newsfeed, you may well ask what you did to put them there. What did you search or write or read? Little by little, as new things show up on the screen, you watch passively while the web actively constructs its version of you.
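The passive construction of that “version of you” can also be sketched in a few lines. Again, this is a hypothetical toy, assuming invented category labels; real profiling systems combine many signals across many platforms. The point is only that a profile accumulates from ordinary activity, with no deliberate act of self-description.

```python
# Illustrative sketch only: a "digital double" accumulating passively
# from browsing activity. The categories and events are invented.

from collections import Counter

def update_double(double, events):
    """Each page view, search, or click nudges the profile toward the
    categories the system has assigned to that activity."""
    for category in events:
        double[category] += 1
    return double

double = Counter()
update_double(double, ["diet supplements", "diet supplements", "groceries"])

# The system now "believes" dieting dominates this user's interests,
# and might target advertisements (or study invitations) accordingly.
top_interest, _ = double.most_common(1)[0]
```

The user never said “I am dieting”; the double said it for her, which is exactly the situation Sara Watson describes below.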
Karl Marx described how a simple wooden table, once turned into a commodity, danced to its own ghostlike tune. Marx's table, transcendent, “not only stands with its feet on the ground . . . it stands on its head, and evolves out of its wooden brain, grotesque ideas far more wonderful than ‘table-turning’ ever was.” These days, it is our digital double that dances with a life of its own.
Advertising companies use it to build more targeted marketing campaigns. Insurance companies use it to apportion health benefits. From time to time, we are startled to get a view of who the algorithms that work over our data think we are. Technology writer Sara Watson describes such a moment. One day, Watson receives an invitation, a targeted advertisement, to participate in a study of anorexia in a Boston-area hospital. Watson says, “Ads seem trivial. But when they start to question whether I'm eating enough, a line has been crossed.”
Watson finds the request to participate in the anorexia study personal and assaultive, because she is stuck with the idea that she made the invitation appear. But how? Is the study targeting women with small grocery bills? Women who buy diet supplements? We are talking through machines to algorithms whose rules we don't understand.
For Watson, what is most disorienting is that she doesn't understand how the algorithm reached its conclusion about her.
And how can she challenge a black box? For the algorithms that build up your digital double are written across many different platforms. There is no place where you can “fix” your double. There is no place to make it conform more exactly to how you want to be represented. Watson ends up confused: “It's hard to tell whether the algorithm doesn't know us at all, or if it actually knows us better than we know ourselves.” Does the black box know something she doesn't?
In conversations with others over a lifetime, you get to see yourself as others see you. You get to “meet yourself” in new ways. You get to object on the spot if somebody doesn't “get you.” Now we are offered a new experience: We are asked to see ourselves as the collection of things we are told we should want, as the collection of things we are told should interest us.
Is this a tidier version of identity?
Building narratives about oneself takes time, and you never know if they are done or if they are correct. It is easier to see yourself in the mirror of the machine. You have mail.
Thoreau went to Walden to try to think his own thoughts, to remove himself from living “too thickly,” as he referred to the constant chatter around him in society. These days, we live more “thickly” than Thoreau could ever have imagined, bombarded by the opinions, preferences, and “likes” of others. With the new sensibility of “I share, therefore I am,” many are drawn to the premise that thinking together makes for better thinking.
Facebook's Zuckerberg thinks that thinking is a realm where together is always better. If you share what you are thinking and reading and watching, you will be richer for it. He says that he would always “rather go to a movie with [his] friends” because then they can share their experience and opinions. And if his friends can't be there physically, he can still have a richer experience of the movie through online sharing. Neil Richards, a lawyer, cross-examines this idea. Always sharing with friends has a cost.
It means we'll always choose the movie they'd choose and won't choose the movie we want to see if they'd make fun of it. . . . If we're always with our friends, we're never alone, and we never get to explore ideas for ourselves. Of course, the stakes go beyond movies and extend to reading, to web-surfing, and even thinking.
And even thinking. Especially thinking. One student, who was used to blogging as a regular part of her academic program for her master's degree, changed styles when she changed universities and began her doctoral studies. In her new academic program blogging was discouraged. She comments that, looking back, the pressure to continually publish led to her thinking of herself as a brand. She wanted everything she wrote to conform to her confirmed identity. And blogging encouraged her to write about what she could write about best. It discouraged risk taking. Now, writing privately, she feels more curious. Research shows that people who use social media are less willing to share their opinions if they think their followers and friends might disagree with them. People need private space to develop their ideas.
Generations of Americans took as self-evident the idea that private space was essential to democratic life. My grandmother had a civics lesson ready when she talked about the privacy of my library books. In order to be open to the widest range of ideas, I had to feel protected when making my reading choices. “Crowdsourcing” your reading preferences, says Richards, drives you “to conformity and the mainstream by social pressures.”
Cognitive science has taught us the several qualities that make it easy not to think about something you probably don't want to think about anyway. You don't know when it is going to “happen.” You don't know exactly what it means for it to “happen.” And there is no immediate cause and effect between actions you might take and consequences related to the problem.
So, for example, if you don't want to think about climate change, you are able to exploit the psychological distance between a family vacation in an SUV and danger to the planet. A similar sense of distance makes it easy to defer thinking about the hazards of “reading in public,” the risks of living with a digital double, and threats to privacy on the digital landscape.
Here is Lana, a recent college graduate, thinking aloud about how she doesn't think about online privacy:

Cookies? I think that companies make it hard to understand what they are really doing. Even calling them cookies seems pretty brilliant. It makes it sound cute, like it's nothing. Just helpful to you. Sweet. And it is helpful to get better ads or better services for the things you want. But how do they work and what are they going to do with all that they know about you? I don't know and I don't like where this is going. But I'm not going to think about this until something really bad happens concretely.
Lana is uneasy that data are being collected about her, but she's decided that right now she's not going to worry about it. She says that when she was younger she was “creeped out” by Facebook having so much information about her, but now she deals with her distrust of Facebook by keeping her posts light, mostly about parties and social logistics. She doesn't want what she puts on Facebook “coming back to haunt me.”
More than this, Lana says, she “is glad not to have anything controversial on my mind, because I can't think of any online place where it would be safe to have controversial conversations.” And any conversation she would want to have would be online, because that is where she is in touch with all her friends. Lana describes a circle that encourages silence: If she had controversial opinions she would express them online, so it's good that she has none, because what she would say would not be private in this medium. In fact, Lana's circle has one more reinforcing turn: She says it's good that she has nothing controversial to say because she would be saying it online and everything you say online is kept forever. And that is something she doesn't like at all.
I talk to Lana shortly after her graduation from college in June 2014. In the news are manifestations of disruptive climate change, escalating wars and terrorism, the limitations of the international response to the Ebola epidemic, and significant violence due to racial tensions. There is no lack of things to communicate about “controversially.” Yet this very brilliant young woman, beginning a job in finance, is relieved not to have strong opinions on any of this because her medium for expressing them would be online and there is no way to talk “safely” there.
But Lana does not say that she finds any of this a problem. It would be inconvenient to label it that way. If you say something is a problem, that suggests you should be thinking about changing it, and Lana is not sure that this is the direction she wants to take her feelings of discontent, at least not now. Right now, as for many others, her line is that “we all are willing to trade off privacy for convenience.” She treats this trade-off as arithmetic, as if, once it's calculated, it doesn't need to be revisited.
When I talk to young people, I learn that they are expert at keeping “local” privacy: privacy from each other when they want to keep things within their clique, privacy from parents or teachers who might be monitoring their online accounts; here they use code words, a blizzard of acronyms. But as for how to think about private mindspace on the net, most haven't thought much about it and don't seem to want to. They, like the larger society, are, for the most part, willing to defer thinking about this. We are all helped in this by staying vague on the details.
And the few details we know seem illogical or like half-truths. It is illegal to tap a phone, but it is not illegal to store a search. We are told that our searches are “anonymized,” but then experts tell us that this is not true. Large corporations take our data, which seems to be legal, and the government also wants our data: things such as what we search, whom we text, what we text, whom we call, what we buy.
And it's hard to even learn the rules. I am on the board of the Electronic Frontier Foundation, devoted to privacy rights in digital culture. But it was only in spring 2014 that an email circulated to board members that described how easy it is to provoke the government to put you on a list of those whose email and searches are “fully tracked.” For example, you will get on that list if, from outside the United States, you try to use Tor, a method of browsing anonymously online. The same article explained that from within the United States, you will also activate “full tracking” if you try to use alternatives to standard operating systems, for example, if you go to the Linux home page. It would appear that the Linux forum has been declared an “extremist” site.
One of my graduate research assistants has been on that forum because she needed to use annotation software that ran only on Linux. When she reads the communiqué about Linux and full tracking, she is taken aback, but what she says is, “Theoretically I'm angry but I'm not having an emotional response.” According to the source we both read, undisputed by the NSA, the content of her email and searches is surveilled. But still, she says, “Who knows what that means. Is it a person? Is it an algorithm? Is it tracking me by my name or my IP address?”
Confused by the details, she doesn't demand further details. Vague understandings support her sense that looking into this more closely can wait. So does the idea that she will be blocked or perhaps singled out for further surveillance if she tries to get more clarity.
One college senior tells me, with some satisfaction, that he has found a way around some of his concerns about online privacy. His strategy: He uses the “incognito” setting on his web browser. I decide that I'll do the same. I change the settings on my computer and go to bed thinking I have surely taken a step in the right direction. But what step have I taken? I learn that with an “incognito” setting I can protect my computer from recording my search history (so that family members, for example, can't check it), but I haven't slowed down Google or anyone else who might want access to it. And there is the irony that articles on how to protect your privacy online often recommend using Tor, but the NSA considers Tor users suspect and deserving of extra surveillance.