2. One study comparing data from 1985 and 2004 found that the mean number of people with whom Americans can discuss matters important to them dropped by nearly one-third, from 2.94 people in 1985 to 2.08 in 2004. Researchers also found that the number of people who said they had no one with whom to discuss such matters more than doubled, to nearly 25 percent. The survey found that both family and nonfamily confidants dropped, with the loss greatest in nonfamily connections. Miller McPherson, Lynn Smith-Lovin, and Matthew E. Brashears, “Social Isolation in America: Changes in Core Discussion Networks over Two Decades,” American Sociological Review 71 (June 2006): 353-375.
3. Barry Wellman and Bernie Hogan (with Kristen Berg et al.), “Connected Lives: The Project,” in Networked Neighborhoods, ed. Patrick Purcell (London: Springer-Verlag, 2006), 161-216.
4. Moving past the philosophical, there are contradictions on the ground: a “huggable” robot is a responsive teddy bear that makes it possible for a grandmother in Detroit to send a squeeze to her grandson in Cambridge, Massachusetts. The grandmother hears and sees her grandson through the eyes and ears of the bear, and the robot communicates her caress. All well and good. But when videoconferences and hugs mediated by teddy bears keep grandparents from making several-thousand-mile treks to see their grandchildren in person (and there is already evidence that they do), children will be denied something precious: the starchy feel of a grandmother’s apron, the smell of her perfume up close, and the taste of her cooking. Amy Harmon, “Grandma’s on the Computer Screen,” New York Times, November 26, 2008, www.nytimes.com/2008/11/27/us/27minicam.htm?pagewanted=all (accessed December 11, 2009). On the “Huggable” project, see http://robotic.media.mit.edu/projects/robots/huggable/overview/overview.html (accessed April 5, 2010).
5. On ELIZA, see Joseph Weizenbaum, Computer Power and Human Reason: From Judgment to Calculation (San Francisco: Freeman, 1976); Sherry Turkle, The Second Self: Computers and the Human Spirit (1984; Cambridge, MA: MIT Press, 2005); Sherry Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995).
6. People who feel that psychotherapists are dismissive or disrespectful may also prefer to have computers as counselors. An MIT administrative assistant says to me: “When you go to a psychoanalyst, well, you’re already going to a robot.”
7. In fact, we have two robotic dreams. In one, we imagine the robots as perfect companions. In another, we join with them to become new selves. This second scenario itself has two variants. In the first, we evolve. We assimilate robotic parts until there is no “us” and “them.” In the short term, we feel smarter and healthier. In the long term, we become immortal. In the second variant, there is a decisive turn, a moment of “singularity” in which computing power is so vast that people essentially become one with machines. For a critique of what he calls “cybernetic totalism,” see Jaron Lanier, “One Half a Manifesto,” www.edge.org/3rd-culture/lanier-pl.html (accessed August 3, 2010), and You Are Not a Gadget: A Manifesto (New York: Knopf, 2010).
8. Psychoanalysis sees truth in the symptom. But it is a truth that has not been given free expression. You don’t want to get rid of these truths, for they are “signs that something has disconnected a significant experience from the mass of other, non-symptomatic significant experiences. The aim of psychoanalysis is to restore the broken connection, thereby converting the distorted, disconnected experience (the symptom) into an ordinary, connected one.” See Robert Caper, Building Out into the Dark: Theory and Observation in Science and Psychoanalysis (New York: Routledge, 2009), 90.
9. Kevin Kelly, “Technophilia,” The Technium, June 8, 2009, www.kk.org/thetechnium/archives/2009/06/technophilia.php (accessed December 9, 2009).
10. Caper, Building Out into the Dark, 93.
11. Personal communication, October 2008.
12. Caper says, “We tolerate the plague of our neurotic symptoms because we fear that discovering the truths they simultaneously rest on and cover over will lead to our destruction.” And further, an interpretation, like a new technology, “always poses a danger.... The danger consists not in the analyst’s search for truth, and not even in the fact that his interpretations are inevitably flawed, but in his not recognizing that this is so.” See Caper, Building Out into the Dark, 91, 94.
13. Henry Adams, “The Dynamo and the Virgin,” in The Education of Henry Adams: An Autobiography (Boston: Massachusetts Historical Society, 1918), 380.
14. Kelly, “Technophilia.”
15. One roboticist who makes quite extravagant claims about our futures is David Hanson. For videos and progress reports, see Hanson Robotics at www.hansonrobotics.com (accessed December 11, 2009). And, of course, there is David Levy’s book on the future of robot affections, Love and Sex with Robots: The Evolution of Human-Robot Relationships (New York: HarperCollins, 2007).
16. This is a paraphrase. The exact citation is, “When you want to give something presence, you have to consult nature and that is where design comes in. If you think of brick, for instance, you say to brick, ‘What do you want, brick?’ And brick says to you, ‘I’d like an arch.’ And if you say to brick, ‘Look, arches are expensive and I can use a concrete lintel over you, what do you think of that, brick?’ And brick says to you, ‘I’d like an arch.’” See Nathaniel Kahn, My Architect: A Son’s Journey (New Yorker Films, 2003).
17. Rodney Brooks, cited in “MIT: ‘Creating a Robot So Alive You Feel Bad About Switching It Off’—a Galaxy Classic,” The Daily Galaxy, December 24, 2009, www.dailygalaxy.com/my_weblog/2009/12/there-is-ongoing-debate-about-what-constitutes-life-synthetic-bacteria-for-example-are-created-by-man-and-yet-also-alive.html (accessed June 4, 2010).
18. Cynthia Breazeal and Rodney Brooks both make the point that robot emotions do not have to be like human ones; they should be judged on their own merits. See Cynthia Breazeal and Rodney Brooks, “Robot Emotion: A Functional Perspective,” in Who Needs Emotions: The Brain Meets the Robot, ed. J.-M. Fellous and M. Arbib (MIT Press, 2005), 271-310. Breazeal insists that “the question for robots is not, ‘Will they ever have human emotions?’ Dogs don’t have human emotions, either, but we all agree they have genuine emotions. The question is, ‘What are the emotions that are genuine for the robot?’” Breazeal talks about Kismet as a synthetic being and expects that it will be “given the same respect and consideration that you would to any living thing.” WNPR, “Morning Edition,” April 9, 2001, www.npr.org/programs/morning/features/2001/apr/010409.kismet.html (accessed August 12, 2010). See also Susan K. Lewis, “Friendly Robots,” Nova, www.pbs.org/wgbh/nova/tech/friendly-robots.html, and Robin Marantz Henig, “The Real Transformers,” New York Times, July 29, 2007, www.nytimes.com/2007/07/29/magazine/29robots-t.html (accessed September 3, 2010).
19. There is much talk these days of a “robot bill of rights.” As robots become more complex, there is a movement to have formal rules for how we treat artificial sentience. Robot rights are the subject of parliamentary inquiry in the United Kingdom. In South Korea, where the government plans to put a sociable robot into every home by 2020, there are plans to draw up legal guidelines on how they must be treated. The focus of these efforts is on protecting the robots. But as early as the mid-1990s, people abused virtual creatures called “norns,” tormenting them until they became psychotic, beating their virtual heads against virtual walls. Popular Web videos show even as simple a robotic toy as Hasbro’s Elmo Live being doused with gas and set on fire, his red fur turning to charcoal as he writhes in what looks like pain. I have watched the abuse of Tamagotchis, Furbies, My Real Babies, and Paros. The “robot rights movement” is all about not hurting the robots. My concern is that when we torture sociable robots that we believe to have “states of mind,” we damage ourselves. Daniel Roth, “Do Humanlike Machines Deserve Human Rights,” Wired Magazine, January 19, 2009, www.wired.com/culture/culturereviews/magazine/17-02/st_essay (accessed June 4, 2010).
20. For drawing my attention to what he calls “formes frustes of feeling,” I thank my colleague, the Cambridge psychiatrist and psychoanalyst Dr. David Mann, who has reformulated an entire range of unpleasant affects (for example, envy, greed, resentment) in an as-yet-unpublished essay, “Failures of Feeling” (2009).
21. Anthony Storr, Solitude: A Return to the Self (New York: Random House, 1988).
22. Quandaries have become a classic way of thinking about moral dilemmas. See, for example, Marc Hauser, Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong (New York: Ecco, 2006). Some of the most common quandaries involve trolley cars and the certainty of death. A typical scenario has you driving a trolley car with five workers ahead of you on the track. Doing nothing will kill all five. You can swerve onto a track on which there is only one worker. Do you act to kill one person rather than five? Then the scenario may be shifted so that you are on a bridge, observing the trolley. There is a fat man standing beside you. Do you push him onto the track to stop the trolley, thus saving the five people? And so it goes.
23. Traditional psychology was built on experiments done only with men and on theories that took only male development into account. During the First and Second World Wars, psychological tests were standardized on the male soldiers with whom they were developed. End of story. Psychologists came to see male responses as “normal” ones. The behaviors, attitudes, and patterns of relationship exhibited by most men became the norm for “people.” Psychologist Carol Gilligan’s 1982 In a Different Voice is an example of work that broke this frame. Gilligan portrays the canonical (and stereotypically “male”) view of moral reasoning and then points out that it constitutes only one way in which people make moral decisions. The canonical pattern looks at moral choices in terms of abstract principles. Another, equally evolved moral voice relies on concrete situations and relationships. For example, see Gilligan’s treatment of “Amy and Heinz” in In a Different Voice: Psychological Theory and Women’s Development (Cambridge, MA: Harvard University Press, 1993), 26-28, 30. The “robots-or-nothing” thinking about elder care frames a dilemma that begs for a contextual approach; this is what the fifth graders in Miss Grant’s class brought to the table.
We hear another moment of reframing when seventeen-year-old Nick tries to find a way to get his father to put away his BlackBerry during family dinners. Recall that in Nick’s home, family dinners are long. His mother takes pride in her beautifully prepared meals with many courses. Nick suggests shorter meals. His parents argue principles: the priority of work versus that of a meal prepared with love. Nick focuses on relationship. The family needs family time. How can they provide that for each other? Nick suggests a shorter meal with no phones.
24. Anthony Appiah, Experiments in Ethics (Cambridge, MA: Harvard University Press, 2008), 196-197. Appiah is writing about “trolley car” quandaries, but he could be writing about the “robots-or-nothing” problem.
25. Here I note the work on using robots as a therapeutic tool with people on the autism spectrum. Robots do not overwhelm them as people may. The predictability of robots is comforting. The question remains whether these robots can serve as transitions to relationships with people. I cotaught a course at MIT on robotics and autism with Rosalind Picard and Cynthia Breazeal. Roboticists are of course gratified to feel that they can contribute to therapy in this area; the jury is still out on whether nonhuman faces get us ready for human ones. For a discussion that focuses on the work of roboticist Maja Matarić in this area, see Jerome Groopman, “Robots That Care: Advances in Technological Therapy,” The New Yorker, November 2, 2009, www.newyorker.com/reporting/2009/11/02/091102fa_fact_groopman (accessed November 11, 2009).
26. This phrase is drawn from Roger Shattuck’s book on the “Wild Child” of Aveyron, The Forbidden Experiment (New York: Farrar, Straus and Giroux, 1980).
27. “Basic trust” is Erik Erikson’s phrase; see Childhood and Society (New York: Norton, 1950) and Identity and the Life Cycle (1952; New York: Norton, 1980).
28. At MIT, the question of risk strikes most of my students as odd. They assume, along with roboticist David Hanson, that eventually robots “will evolve into socially intelligent beings, capable of love and earning a place in the extended human family.” See Groopman, “Robots That Care.”
29. A University of Michigan study found that today’s college students have less empathy than those of the 1980s or 1990s; they scored about 40 percent lower in empathy than their counterparts did twenty or thirty years ago. Sara Konrath, a researcher at the University of Michigan’s Institute for Social Research, conducted, with graduate student Edward O’Brien and undergraduate Courtney Hsing, a meta-analysis that combined the results of seventy-two studies of American college students conducted between 1979 and 2009. Compared to college students of the late 1970s, the study found, college students today are less likely to agree with statements such as “I sometimes try to understand my friends better by imagining how things look from their perspective” and “I often have tender, concerned feelings for people less fortunate than me.” See “Empathy: College Students Don’t Have As Much As They Used To,” EurekAlert!, May 28, 2010, www.eurekalert.org/pub_releases/2010-05/uom-ecs052610.php (accessed June 4, 2010).