Technopoly

Neil Postman

What is significant about this response is that it has redefined
the meaning of the word “belief.” The remark rejects the view that humans have internal states of mind that are the foundation of belief and argues instead that “belief” means only what someone or something does. The remark also implies that simulating an idea is synonymous with duplicating the idea. And, most important, the remark rejects the idea that mind is a biological phenomenon.

In other words, what we have here is a case of metaphor gone mad. From the proposition that humans are in some respects like machines, we move to the proposition that humans are little else but machines and, finally, that human beings are machines. And then, inevitably, as McCarthy’s remark suggests, to the proposition that machines are human beings. It follows that machines can be made that duplicate human intelligence, and thus research in the field known as artificial intelligence was inevitable. What is most significant about this line of thinking is the dangerous reductionism it represents. Human intelligence, as Weizenbaum has tried energetically to remind everyone, is not transferable. The plain fact is that humans have a unique, biologically rooted, intangible mental life which in some limited respects can be simulated by a machine but can never be duplicated. Machines cannot feel and, just as important, cannot understand. ELIZA can ask, “Why are you worried about your mother?,” which might be exactly the question a therapist would ask. But the machine does not know what the question means or even that the question means. (Of course, there may be some therapists who do not know what the question means either, who ask it routinely, ritualistically, inattentively. In that case we may say they are acting like a machine.) It is meaning, not utterance, that makes mind unique. I use “meaning” here to refer to something more than the result of putting together symbols the denotations of which are commonly shared by at least two people. As I understand it, meaning also includes those things we call feelings, experiences, and sensations that do not have to be, and sometimes cannot be, put into symbols. They “mean” nonetheless. Without concrete symbols, a computer is merely a pile of junk. Although the quest for a machine that duplicates mind has ancient roots, and although digital logic circuitry has given that quest a scientific structure, artificial intelligence does not and cannot lead to a meaning-making, understanding, and feeling creature, which is what a human being is.
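The point about ELIZA is easy to see in code. What follows is a minimal sketch of the kind of keyword-reflection rule Weizenbaum’s program employed, not his actual implementation: the pattern, the reflection table, and the fallback reply here are all illustrative inventions. The program rearranges symbols by rote; at no step does anything resembling understanding occur.

```python
import re

# Hypothetical reflection table: swap first-person tokens for second-person ones.
REFLECTIONS = {"my": "your", "i": "you", "am": "are"}

def reflect(fragment: str) -> str:
    """Transform a fragment token by token, with no model of what tokens mean."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(sentence: str) -> str:
    """Apply a single illustrative pattern rule, or fall back to a stock phrase."""
    match = re.search(r"i am worried about (.+)", sentence, re.IGNORECASE)
    if match:
        return f"Why are you worried about {reflect(match.group(1))}?"
    return "Please go on."

print(respond("I am worried about my mother"))
# -> Why are you worried about your mother?
```

The machine produces exactly the therapist’s question, yet the whole exchange is string substitution: utterance without meaning.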

All of this may seem obvious enough, but the metaphor of the machine as human (or the human as machine) is sufficiently powerful to have made serious inroads in everyday language. People now commonly speak of “programming” or “deprogramming” themselves. They speak of their brains as a piece of “hard wiring,” capable of “retrieving data,” and it has become common to think about thinking as a mere matter of processing and decoding.

Perhaps the most chilling case of how deeply our language is absorbing the “machine as human” metaphor began on November 4, 1988, when the computers around the ARPANET network became sluggish, filled with extraneous data, and then clogged completely. The problem spread fairly quickly to six thousand computers across the United States and overseas. The early hypothesis was that a software program had attached itself to other programs, a situation which is called (in another human-machine metaphor) a “virus.” As it happened, the intruder was a self-contained program explicitly designed to disable computers, which is called a “worm.” But the technically incorrect term “virus” stuck, no doubt because of its familiarity and its human connections. As Raymond Gozzi, Jr., discovered in his analysis of how the mass media described the event, newspapers noted that the computers were “infected,” that the virus was “virulent” and “contagious,” that attempts were made to “quarantine” the infected computers, that attempts were also being made to “sterilize” the network, and that programmers hoped to develop a “vaccine” so that computers could be “inoculated” against new attacks.9

This kind of language is not merely picturesque anthropomorphism. It reflects a profound shift in perception about the relationship of computers to humans. If computers can become ill, then they can become healthy. Once healthy, they can think clearly and make decisions. The computer, it is implied, has a will, has intentions, has reasons—which means that humans are relieved of responsibility for the computer’s decisions. Through a curious form of grammatical alchemy, the sentence “We use the computer to calculate” comes to mean “The computer calculates.” If a computer calculates, then it may decide to miscalculate or not calculate at all. That is what bank tellers mean when they tell you that they cannot say how much money is in your checking account because “the computers are down.” The implication, of course, is that no person at the bank is responsible. Computers make mistakes or get tired or become ill. Why blame people? We may call this line of thinking an “agentic shift,” a term I borrow from Stanley Milgram to name the process whereby humans transfer responsibility for an outcome from themselves to a more abstract agent.10 When this happens, we have relinquished control, which in the case of the computer means that we may, without excessive remorse, pursue ill-advised or even inhuman goals because the computer can accomplish them or be imagined to accomplish them.

Machines of various kinds will sometimes assume a human or, more likely, a superhuman aspect. Perhaps the most absurd case I know of is in a remark a student of mine once made on a sultry summer day in a room without air conditioning. On being told the thermometer read ninety-eight degrees Fahrenheit, he replied, “No wonder it’s so hot!” Nature was off the hook. If only the thermometers would behave themselves, we could be comfortable. But computers are far more “human” than thermometers or almost any other kind of technology. Unlike most machines, computers do no work; they direct work. They are, as Norbert Wiener said, the technology of “command and control” and have little value without something to control. This is why they are of such importance to bureaucracies.

Naturally, bureaucrats can be expected to embrace a technology that helps to create the illusion that decisions are not under their control. Because of its seeming intelligence and impartiality, a computer has an almost magical tendency to direct attention away from the people in charge of bureaucratic functions and toward itself, as if the computer were the true source of authority. A bureaucrat armed with a computer is the unacknowledged legislator of our age, and a terrible burden to bear. We cannot dismiss the possibility that, if Adolf Eichmann had been able to say that it was not he but a battery of computers that directed the Jews to the appropriate crematoria, he might never have been asked to answer for his actions.

Although (or perhaps because) I came to “administration” late in my academic career, I am constantly amazed at how obediently people accept explanations that begin with the words “The computer shows …” or “The computer has determined …” It is Technopoly’s equivalent of the sentence “It is God’s will,” and the effect is roughly the same. You will not be surprised to know that I rarely resort to such humbug. But on occasion, when pressed to the wall, I have yielded. No one has as yet replied, “Garbage in, garbage out.” Their defenselessness has something Kafkaesque about it. In The Trial, Josef K. is charged with a crime—of what nature, and by whom the charge is made, he does not know. The computer turns too many of us into Josef Ks. It often functions as a kind of impersonal accuser which does not reveal, and is not required to reveal, the sources of the judgments made against us. It is apparently sufficient that the computer has pronounced. Who has put the data in, for what purpose, for whose convenience, based on what assumptions are questions left unasked.

This is the case not only in personal matters but in public decisions as well. Large institutions such as the Pentagon, the Internal Revenue Service, and multinational corporations tell us that their decisions are made on the basis of solutions generated by computers, and this is usually good enough to put our minds at ease or, rather, to sleep. In any case, it constrains us from making complaints or accusations. In part for this reason, the computer has strengthened bureaucratic institutions and suppressed the impulse toward significant social change. “The arrival of the Computer Revolution and the founding of the Computer Age have been announced many times,” Weizenbaum has written. “But if the triumph of a revolution is to be measured in terms of the social revision it entrained, then there has been no computer revolution.”11

In automating the operation of political, social, and commercial enterprises, computers may or may not have made them more efficient but they have certainly diverted attention from the question whether or not such enterprises are necessary or how they might be improved. A university, a political party, a religious denomination, a judicial proceeding, even corporate board meetings are not improved by automating their operations. They are made more imposing, more technical, perhaps more authoritative, but defects in their assumptions, ideas, and theories will remain untouched. Computer technology, in other words, has not yet come close to the printing press in its power to generate radical and substantive social, political, and religious thought. If the press was, as David Riesman called it, “the gunpowder of the mind,” the computer, in its capacity to smooth over unsatisfactory institutions and ideas, is the talcum powder of the mind.

I do not wish to go as far as Weizenbaum in saying that computers are merely ingenious devices to fulfill unimportant functions and that the computer revolution is an explosion of nonsense. Perhaps that judgment will be in need of amendment in the future, for the computer is a technology of a thousand uses—the Proteus of machines, to use Seymour Papert’s phrase. One must note, for example, the use of computer-generated images in the phenomenon known as Virtual Reality. Putting on a set of miniature goggle-mounted screens, one may block out the real world and move through a simulated three-dimensional world which changes its components with every movement of one’s head. That Timothy Leary is an enthusiastic proponent of Virtual Reality does not suggest that there is a constructive future for this device. But who knows? Perhaps, for those who can no longer cope with the real world, Virtual Reality will provide better therapy than ELIZA.

What is clear is that, to date, computer technology has served to strengthen Technopoly’s hold, to make people believe that technological innovation is synonymous with human progress. And it has done so by advancing several interconnected ideas.

It has, as already noted, amplified beyond all reason the metaphor of machines as humans and humans as machines. I do not claim, by the way, that computer technology originated this metaphor. One can detect it in medicine, too: doctors and patients have come to believe that, like a machine, a human being is made up of parts which when defective can be replaced by mechanical parts that function as the original did without impairing or even affecting any other part of the machine. Of course, to some degree that assumption works, but since a human being is in fact not a machine but a biological organism all of whose organs are interrelated and profoundly affected by mental states, the human-as-machine metaphor has serious medical limitations and can have devastating effects. Something similar may be said of the mechanistic metaphor when applied to workers. Modern industrial techniques are made possible by the idea that a machine is made up of isolatable and interchangeable parts. But in organizing factories so that workers are also conceived of as isolatable and interchangeable parts, industry has engendered deep alienation and bitterness. This was the point of Charlie Chaplin’s Modern Times, in which he tried to show the psychic damage of the metaphor carried too far. But because the computer “thinks” rather than works, its power to energize mechanistic metaphors is unparalleled and of enormous value to Technopoly, which depends on our believing that we are at our best when acting like machines, and that in significant ways machines may be trusted to act as our surrogates. Among the implications of these beliefs is a loss of confidence in human judgment and subjectivity. We have devalued the singular human capacity to see things whole in all their psychic, emotional and moral dimensions, and we have replaced this with faith in the powers of technical calculation.

Because of what computers commonly do, they place an inordinate emphasis on the technical processes of communication and offer very little in the way of substance. With the exception of the electric light, there never has been a technology that better exemplifies Marshall McLuhan’s aphorism “The medium is the message.” The computer is almost all process. There are, for example, no “great computerers,” as there are great writers, painters, or musicians. There are “great programs” and “great programmers,” but their greatness lies in their ingenuity either in simulating a human function or in creating new possibilities of calculation, speed, and volume.12 Of course, if J. David Bolter is right, it is possible that in the future computers will emerge as a new kind of book, expanding and enriching the tradition of writing technologies.13 Since printing created new forms of literature when it replaced the handwritten manuscript, it is possible that electronic writing will do the same. But for the moment, computer technology functions more as a new mode of transportation than as a new means of substantive communication. It moves information—lots of it, fast, and mostly in a calculating mode. The computer, in fact, makes possible the fulfillment of Descartes’ dream of the mathematization of the world. Computers make it easy to convert facts into statistics and to translate problems into equations. And whereas this can be useful (as when the process reveals a pattern that would otherwise go unnoticed), it is diversionary and dangerous when applied indiscriminately to human affairs. So is the computer’s emphasis on speed and especially its capacity to generate and store unprecedented quantities of information. In specialized contexts, the value of calculation, speed, and voluminous information may go uncontested. But the “message” of computer technology is comprehensive and domineering. The computer argues, to put it baldly, that the most serious problems confronting us at both personal and public levels require technical solutions through fast access to information otherwise unavailable. I would argue that this is, on the face of it, nonsense. Our most serious problems are not technical, nor do they arise from inadequate information. If a nuclear catastrophe occurs, it shall not be because of inadequate information. Where people are dying of starvation, it does not occur because of inadequate information. If families break up, children are mistreated, crime terrorizes a city, education is impotent, it does not happen because of inadequate information. Mathematical equations, instantaneous communication, and vast quantities of information have nothing whatever to do with any of these problems. And the computer is useless in addressing them.
