Reclaiming Conversation
Author: Sherry Turkle
uncontrolled access to data:
Here, on technology and its desires, I have been influenced throughout my career by the work of Bruno Latour. See, for example, Science in Action: How to Follow Scientists and Engineers Through Society (Cambridge, MA: Harvard University Press, 1999 [1987]); Aramis, or the Love of Technology, Catherine Porter, trans. (Cambridge, MA: Harvard University Press, 2002 [1996]).
the rise of a “public sphere”:
See Jürgen Habermas, The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society (Cambridge, MA: The MIT Press, 1991 [1962]). Cited in Stephen Miller, Conversation: A History of a Declining Art (New Haven, CT: Yale University Press, 2007), 91.
“seats of English liberty”:
Cited in ibid., 90.
“the Man I conversed with”:
Cited in ibid., 91.
“to improve the nick of time”:
Henry David Thoreau, Walden (Princeton, NJ: Princeton University Press, 2004 [1854]), 17.
THE END OF FORGETTING
he calls it his withdrawing room:
Henry David Thoreau, Walden (Princeton, NJ: Princeton University Press, 2004 [1854]), 141.
substitute for humans in conversation:
For practice with job interviews, see Mohammed (Ehsan) Hoque, Matthieu Courgeon, Jean-Claude Martin, et al., “MACH: My Automated Conversation coacH,” http://web.media.mit.edu/~mehoque/Publications/13.Hoque-etal-MACH-UbiComp.pdf. For an automated psychotherapist, albeit one that still uses human input but hopes to do away with that system limitation as soon as possible, see Rob Morris and Rosalind Picard, “Crowdsourcing Collective Emotional Intelligence,” Proceedings of CI 2012, http://www.robertrmorris.org/pdfs/Morris_Picard_CI2012.pdf. For a first look at a robot that aspires to be a socially and emotionally competent constant companion, note Jibo, developed by Cynthia Breazeal, one of the world's leading researchers in sociable robotics, http://www.myjibo.com.
a “robotic moment”:
For a fuller discussion of “the robotic moment,” see Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other (New York: Basic Books, 2011), 23–147.
a computer program called ELIZA:
Joseph Weizenbaum's original paper on ELIZA was written in 1966: Joseph Weizenbaum, “ELIZA: A Computer Program for the Study of Natural Language Communication Between Man and Machine,” Communications of the ACM 9, no. 1 (January 1966): 36–45. Ten years later his book, Computer Power and Human Reason: From Judgment to Calculation (San Francisco: W. H. Freeman, 1976), was deeply critical of the AI enterprise. The ELIZA experience had chastened him.
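ELIZA's apparent fluency rested on nothing more than keyword spotting and pronoun reflection. A minimal sketch of that technique follows; the specific rules and phrasings here are illustrative stand-ins, not Weizenbaum's actual DOCTOR script:

```python
import re

# Swap first- and second-person words so an echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

# A few keyword rules in the spirit of (but not copied from) the DOCTOR script.
# Rules are tried in order; the first match wins.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment):
    """Replace pronouns in the user's fragment before echoing it back."""
    return " ".join(REFLECTIONS.get(word.lower(), word)
                    for word in fragment.split())

def respond(utterance):
    """Transform the input via the first matching rule, else deflect."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."
```

Typing "I feel alone" yields "Why do you feel alone?" — the program understands nothing, yet the reflected echo is enough to sustain the illusion of a listener, which is exactly what unsettled Weizenbaum.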
remember your name:
For overviews of sociable robotics by two leaders in the field, see Rodney Brooks, Flesh and Machines: How Robots Will Change Us (New York: Pantheon, 2002), and Cynthia Breazeal, Designing Sociable Robots (Cambridge, MA: The MIT Press, 2002).
The man finds Kismet so supportive:
See, for example, this interaction with the robot Kismet: MIT CSAIL, “Kismet and Rich,” MIT AI video, http://www.ai.mit.edu/projects/sociable/movies/kismet-and-rich.mov.
Machines with voices have particular power:
Anthony DeCasper and William Fifer, “Of Human Bonding: Newborns Prefer Their Mothers' Voices,” Science 208, no. 4448 (1980): 1174–76, doi:10.1126/science.7375928.
distinguish human speech from the machine-generated kind:
Judith Shulevitz brought together an array of facts about our vulnerability to machine talk, including this one. See Judith Shulevitz, “Siri, You're Messing Up a Generation of Children,” New Republic, April 2, 2014, http://www.newrepublic.com/article/117242/siris-psychological-effects-children.
when we observe others acting:
For an overview, see Giacomo Rizzolatti and Laila Craighero, “The Mirror-Neuron System,” Annual Review of Neuroscience 27 (2004): 169–92, doi:10.1146/annurev.neuro.27.070203.144230.
has no meaning when we feel it:
Emmanuel Levinas, “Ethics and the Face,” in Totality and Infinity: An Essay on Exteriority, Alphonso Lingis, trans. (Pittsburgh, PA: Duquesne University Press, 1969).
I worked at the MIT Artificial Intelligence Laboratory:
I worked with Cynthia Breazeal and Brian Scassellati, the chief designers of Kismet and Cog, to study children's responses to these sociable robots. The stories about children and robots that follow are drawn from my report of that work. See Turkle, Alone Together, 84–101.
Sharing a few words at the checkout:
And research suggests this kind of small talk is not just an act of kindness to other people; it makes people happier. Nicholas Epley and Juliana Schroeder, “Mistakenly Seeking Solitude,” Journal of Experimental Psychology: General, advance online publication (2014), http://dx.doi.org/10.1037/a0037323.
trying to build an automated psychotherapist:
For a description of the system, see Morris and Picard, “Crowdsourcing Collective Emotional Intelligence.”
description of a stressful situation:
The three-sentence limit is an example of how we reduce conversation because of the limitations of technology and then reframe the reduced conversation as a feature rather than a bug. The authors of the automated psychotherapy program say, “By limiting the text entry to three sentences, we help users compartmentalize their stressors. Also, shorter text entries are easier to read and are therefore more manageable for the online workers.” Ibid.
increasingly willing to discuss these things with machines:
This is unfolding in a context where fewer people ask for talk therapy and fewer professionals suggest it. And in a context where cultural expectations of what a therapist might provide have shifted. We used to expect that therapists would want to learn about our families, where we were from, the details of our life situations. Now we are likely to content ourselves with medication if it will make us feel better, and often our professional consultations are by telephone or Skype. Of course, all of these have a place. They are sometimes useful, often necessary. See Gardiner Harris, “Talk Doesn't Pay, So Psychiatry Turns Instead to Drug Therapy,” New York Times, March 5, 2011, http://www.nytimes.com/2011/03/06/health/policy/06doctors.html?ref=health. University of Pennsylvania researcher Steven C. Marcus has documented the decline of psychotherapy in recent years. See, for example, Steven C. Marcus and Mark Olfson, “National Trends in the Treatment for Depression from 1998 to 2007,” Archives of General Psychiatry 67, no. 12 (2010): 1265–73, doi:10.1001/archgenpsychiatry.2010.151. See also Mark Olfson and Steven C. Marcus, “National Trends in Outpatient Psychotherapy,” American Journal of Psychiatry 167, no. 12 (2010): 1456–63, doi:10.1176/appi.ajp.2010.10040570.
But in our enthusiasm for the new, convenience becomes custom. We are too ready to forget the power of face-to-face presence. Gillian Isaacs Russell, a British-trained psychoanalyst now working in the United States, embraced computer-mediated psychoanalytic treatment and had a long-distance practice with patients in China, the United Kingdom, and Central America. She writes this about her experience: “I met for over three years with a small peer group of practitioners who were doing treatments in China. Initially we met to navigate the cross-cultural territory, but found increasingly that we were concerned with the limitations of the medium itself.” Her work is a powerful argument for presence and against those who believe that in psychoanalytic practice there is an equivalence between what can be accomplished face-to-face and over the Internet. Gillian Isaacs Russell, Screen Relations: The Limits of Computer-Mediated Psychoanalysis (London: Karnac Books, 2015).
Among the things that translate poorly online are the bodily experiences that are part of the therapeutic experience. We come to treatment with our whole selves. As do therapists. So therapists explain that when they are listening to a patient, they may have a bodily experience of the patient's words. They may feel sleepy, get a headache, a backache, experience nausea. That bodily experience is part of the countertransference, a reaction that demonstrates again and again the connection between body and mind. In an analytically oriented therapy, the therapist sees his or her job as putting these bodily sensations back into words: an interpretation, an intervention that hopefully reframes what is happening in a way that will be useful to the patient. On this point, see Patrick Miller, Driving Soma: A Transformational Process in the Analytic Encounter (London: Karnac Books, 2014).
“Better than Human”:
Kevin Kelly, “Better than Human: Why Robots Will—and Must—Take Our Jobs,” Wired, December 24, 2012, http://www.wired.com/2012/12/ff-robots-will-take-our-jobs/all.
include the roles of conversation:
Computer scientist David Levy argues that robots should even be given their chance to become our spouses. See David Levy, Love and Sex with Robots: The Evolution of Human-Robot Relationships (New York: HarperCollins, 2007). Love and Sex is a personal favorite of mine in the escalation of the “simple salvations” literature because it is dedicated to Anthony, a hacker I used as a case study in my 1984 book The Second Self: Computers and the Human Spirit (Cambridge, MA: The MIT Press, 2005 [1984]). Levy thought that Anthony, lonely and somewhat isolated, might appreciate a robot lover since he had trouble in the human romance department. My reading of Anthony's story shows him yearning for relationship. It's a world he has trouble with, but he wanted in. To me, Levy had missed the point in his haste to solve Anthony's “problem” with a robot. Levy was suggesting replacing the person with a machine instead of increasing the potential of the person.
To me, David Levy's Love and Sex is a companion piece to Kevin Kelly's Wired cover story. For Levy, having a robot as a lover would not diminish Anthony. In Kelly's piece, if Anthony will accept a robot in the job, then by definition it was a job that people were not meant to do.
computer conversation is “an imitation game”:
Alan Turing, “Computing Machinery and Intelligence,” Mind 59 (1950): 433–60.
robot conversation and companionship:
For so generously sharing their work, robots, and ideas, special thanks to my colleagues Lijin Aryananda, Rodney Brooks, Cynthia Breazeal, Aaron Edsinger, Cory Kidd, and Brian Scassellati.
But particularly to the old:
There is controversy about the economic benefit of substituting robots for humans in service jobs. See Zeynep Tufekci, “Failing the Third Machine Age,” The Message, Medium, 2014, https://medium.com/message/failing-the-third-machine-age-1883e647ba74.
produce “caretaker machines”:
See, for example, Timothy W. Bickmore and Rosalind W. Picard, “Towards Caring Machines,” in CHI '04 Extended Abstracts on Human Factors in Computing Systems (New York: ACM Press, 2004).
I have studied Paro, a robot in the shape of a baby seal, designed as a companion to the elderly. Publicity films for Paro show older men and women who live with Paro having breakfast with it, watching television with it, taking it to the supermarket and out to dinner. In interviews about life with Paro, people say they are happy for its company, that it is easier to take care of than a real pet, and they are reassured to have a pet that will not die. See the Paro website at www.parorobots.com. On Paro, see Sherry Turkle, William Taggart, Cory D. Kidd, et al., “Relational Artifacts with Children and Elders: The Complexities of Cybercompanionship,” Connection Science 18, no. 4 (2006): 347–61, doi:10.1080/09540090600868912. See also Cory D. Kidd, William Taggart, and Sherry Turkle, “A Sociable Robot to Encourage Social Interaction Among the Elderly,” Proceedings of the 2006 IEEE International Conference on Robotics and Automation (2006): 3972–76.
engaged in “as-if” conversations:
In this context, I use the term “as-if” in the spirit of Helene Deutsch's work on the as-if personality: Helene Deutsch, “Some Forms of Emotional Disturbance and Their Relationship to Schizophrenia,” Psychoanalytic Quarterly 11 (1942): 301–21.
People were special:
For my early work on computational objects, the question of aliveness, and what made people “special” in this context, see Turkle, The Second Self. My work on aliveness continued with a second generation of computational objects and was reported in Turkle, Life on the Screen: Identity in the Age of the Internet (New York: Simon and Schuster, 1995). My inquiry, with an emphasis on children's reasoning rather than their answers, is inspired by Jean Piaget, The Child's Conception of the World, Joan Tomlinson and Andrew Tomlinson, trans. (Totowa, NJ: Littlefield, Adams, 1960).
better to talk to computer programs:
See Turkle, Alone Together, 50–52.
one for the child:
From the earliest ages, children thrive on getting emotional feedback from the faces of their caretakers. In infant studies, when infants encounter “still-faced” and silent mothers, they become agitated, do everything possible to reengage the mother, and then, if not successful, become withdrawn and despondent, shriek, and lose control of their body posture. A silent mother is a pathological mother and a pathology-inducing mother. See Edward Tronick, Heidelise Als, Lauren Adamson, et al., “The Infant's Response to Entrapment Between Contradictory Messages in Face-to-Face Interaction,” Journal of the American Academy of Child Psychiatry 17, no. 1 (1978): 1–13, doi:10.1016/S0002-7138(09)62273-1. See also Lauren B. Adamson and Janet E. Frick, “The Still Face: A History of a Shared Experimental Paradigm,” Infancy 4, no. 4 (October 1, 2003): 451–73, doi:10.1207/S15327078IN0404_01. For a video of the phenomenon, see “Still Face Experiment: Dr. Edward Tronick,” YouTube video, posted by “UMass Boston,” November 30, 2009, https://www.youtube.com/watch?v=apzXGEbZht0.