
The transformation that aviation has gone through over the last few decades—the shift from mechanical to digital systems, the proliferation of software and screens, the automation of mental as well as manual work, the blurring of what it means to be a pilot—offers a roadmap for the much broader transformation that society is going through now. The glass cockpit, Don Harris has pointed out, can be thought of as a prototype of a world where “there is computer functionality everywhere.”[43]
The experience of pilots also reveals the subtle but often strong connection between the way automated systems are designed and the way the minds and bodies of the people using the systems work. The mounting evidence of an erosion of skills, a dulling of perceptions, and a slowing of reactions should give us all pause. As we begin to live our lives inside glass cockpits, we seem fated to discover what pilots already know: a glass cockpit can also be a glass cage.

 

* A note on terminology: When people talk about a stall, they’re usually referring to a loss of power in an engine. In aviation, a stall refers to a loss of lift in a wing.

THE DEGENERATION EFFECT

A hundred years ago, in his book An Introduction to Mathematics, the British philosopher Alfred North Whitehead wrote, “Civilization advances by extending the number of important operations which we can perform without thinking about them.” Whitehead wasn’t writing about machinery. He was writing about the use of mathematical symbols to represent ideas or logical processes—an early example of how intellectual work can be encapsulated in code. But he intended his observation to be taken generally. The common notion that “we should cultivate the habit of thinking of what we are doing,” he wrote, is “profoundly erroneous.” The more we can relieve our minds of routine chores, offloading the tasks to technological aids, the more mental power we’ll be able to store up for the deepest, most creative kinds of reasoning and conjecture. “Operations of thought are like cavalry charges in battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.”[1]

It’s hard to imagine a more succinct or confident expression of faith in automation as a cornerstone of progress. Implicit in Whitehead’s words is a belief in a hierarchy of human action. Every time we offload a job to a tool or a machine, or to a symbol or a software algorithm, we free ourselves to climb to a higher pursuit, one requiring greater dexterity, richer intelligence, or a broader perspective. We may lose something with each upward step, but what we gain is, in the end, far greater. Taken to an extreme, Whitehead’s sense of automation as liberation turns into the techno-utopianism of Wilde and Keynes, or Marx at his sunniest—the dream that machines will free us from our earthly labors and deliver us back to an Eden of leisurely delights. But Whitehead didn’t have his head in the clouds. He was making a pragmatic point about how to spend our time and exert our effort. In a publication from the 1970s, the U.S. Department of Labor summed up the job of secretaries by saying that they “relieve their employers of routine duties so they can work on more important matters.”[2] Software and other automation technologies, in the Whitehead view, play an analogous role.

History provides plenty of evidence to support Whitehead. People have been handing off chores, both physical and mental, to tools since the invention of the lever, the wheel, and the counting bead. The transfer of work has allowed us to tackle thornier challenges and rise to greater achievements. That’s been true on the farm, in the factory, in the laboratory, in the home. But we shouldn’t take Whitehead’s observation for a universal truth. He was writing when automation was limited to distinct, well-defined, and repetitive tasks—weaving fabric with a steam loom, harvesting grain with a combine, multiplying numbers with a slide rule. Automation is different now. Computers, as we’ve seen, can be programmed to perform or support complex activities in which a succession of tightly coordinated tasks is carried out through an evaluation of many variables. In automated systems today, the computer often takes on intellectual work—observing and sensing, analyzing and judging, even making decisions—that until recently was considered the preserve of humans. The person operating the computer is left to play the role of a high-tech clerk, entering data, monitoring outputs, and watching for failures. Rather than opening new frontiers of thought and action to its human collaborators, software narrows our focus. We trade subtle, specialized talents for more routine, less distinctive ones.

Most of us assume, as Whitehead did, that automation is benign, that it raises us to higher callings but doesn’t otherwise alter the way we behave or think. That’s a fallacy. It’s an expression of what scholars of automation have come to call the “substitution myth.” A labor-saving device doesn’t just provide a substitute for some isolated component of a job. It alters the character of the entire task, including the roles, attitudes, and skills of the people who take part in it. As Raja Parasuraman explained in a 2000 journal article, “Automation does not simply supplant human activity but rather changes it, often in ways unintended and unanticipated by the designers.”[3] Automation remakes both work and worker.

When people tackle a task with the aid of computers, they often fall victim to a pair of cognitive ailments, automation complacency and automation bias. Both reveal the traps that lie in store when we take the Whitehead route of performing important operations without thinking about them.

Automation complacency takes hold when a computer lulls us into a false sense of security. We become so confident that the machine will work flawlessly, handling any challenge that may arise, that we allow our attention to drift. We disengage from our work, or at least from the part of it that the software is handling, and as a result may miss signals that something is amiss. Most of us have experienced complacency when at a computer. In using email or word-processing software, we become less vigilant proofreaders when the spell checker is on.[4] That’s a simple example, which at worst can lead to a moment of embarrassment. But as the sometimes tragic experience of aviators shows, automation complacency can have deadly consequences. In the worst cases, people become so trusting of the technology that their awareness of what’s going on around them fades completely. They tune out. If a problem suddenly crops up, they may act bewildered and waste precious moments trying to reorient themselves.

Automation complacency has been documented in many high-risk situations, from battlefields to industrial control rooms to the bridges of ships and submarines. One classic case involved a 1,500-passenger ocean liner named the Royal Majesty, which in the spring of 1995 was sailing from Bermuda to Boston on the last leg of a week-long cruise. The ship was outfitted with a state-of-the-art automated navigation system that used GPS signals to keep it on course. An hour into the voyage, the cable for the GPS antenna came loose and the navigation system lost its bearings. It continued to give readings, but they were no longer accurate. For more than thirty hours, as the ship slowly drifted off its appointed route, the captain and crew remained oblivious to the problem, despite clear signs that the system had failed. At one point, a mate on watch was unable to spot an important locational buoy that the ship was due to pass. He failed to report the fact. His trust in the navigation system was so complete that he assumed the buoy was there and he simply didn’t see it. Nearly twenty miles off course, the ship finally ran aground on a sandbar near Nantucket Island. No one was hurt, fortunately, though the cruise company suffered millions in damages. Government safety investigators concluded that automation complacency caused the mishap. The ship’s officers were “overly reliant” on the automated system, to the point that they ignored other “navigation aids [and] lookout information” that would have told them they were dangerously off course. Automation, the investigators reported, had “the effect of leaving the mariner out of meaningful control or active participation in the operation of the ship.”[5]
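The failure mode at the heart of the Royal Majesty grounding is worth making concrete: a system that keeps producing plausible-looking position estimates after its sensor input has failed. The sketch below is purely illustrative; the ship’s actual navigation unit is not public code, every name and threshold here is invented, and the dead-reckoning fallback is an assumed behavior for a system of this kind. What it shows is how a silent fallback makes a measurement and a guess indistinguishable on the bridge display.

    from dataclasses import dataclass
    import math

    @dataclass
    class Fix:
        lat: float    # degrees north
        lon: float    # degrees east
        age_s: float  # seconds since the receiver last heard the antenna

    def position(fix: Fix, heading_deg: float, speed_kn: float,
                 elapsed_s: float) -> tuple[float, float]:
        """Return an estimated position (all names/thresholds invented).

        If the GPS fix is fresh, report it. If it is stale, fall back to
        dead reckoning from the last known point -- and, crucially, return
        the estimate in exactly the same form, with no alarm raised. This
        mirrors how a system can keep "giving readings" whose error quietly
        grows for hours.
        """
        if fix.age_s < 10.0:                      # fresh signal: trust it
            return fix.lat, fix.lon
        # Stale signal: extrapolate along the last known heading and speed.
        dist_nm = speed_kn * elapsed_s / 3600.0   # nautical miles run
        dlat = dist_nm * math.cos(math.radians(heading_deg)) / 60.0
        dlon = (dist_nm * math.sin(math.radians(heading_deg))
                / (60.0 * math.cos(math.radians(fix.lat))))
        return fix.lat + dlat, fix.lon + dlon     # looks just like a fix

    # A fix that went stale 30 hours ago still yields a plausible answer.
    stale = Fix(lat=32.3, lon=-64.8, age_s=108_000)
    print(position(stale, heading_deg=336.0, speed_kn=14.0, elapsed_s=108_000))

A status flag or audible alarm on the fallback branch would have told the crew that the “position” on their screens was an extrapolation rather than a measurement. Leaving the two outputs identical is precisely the design choice that gives complacency room to operate.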

Complacency can plague people who work in offices as well as those who ply airways and seaways. In an investigation of how design software has influenced the building trades, MIT sociologist Sherry Turkle documented a change in architects’ attention to detail. When plans were hand-drawn, architects would painstakingly double-check all the dimensions before handing blueprints over to construction crews. The architects knew that they were fallible, that they could make the occasional goof, and so they followed an old carpentry dictum: measure twice, cut once. With software-generated plans, they’re less careful about verifying measurements. The apparent precision of computer renderings and printouts leads them to assume that the figures are accurate. “It seems presumptuous to check,” one architect told Turkle; “I mean, how could I do a better job than the computer? It can do things down to hundredths of an inch.” Such complacency, which can be shared by engineers and builders, has led to costly mistakes in planning and construction. Computers don’t make goofs, we tell ourselves, even though we know that their outputs are only as good as our inputs. “The fancier the computer system,” one of Turkle’s students observed, “the more you start to assume that it is correcting your errors, the more you start to believe that what comes out of the machine is just how it should be. It is just a visceral thing.”[6]

Automation bias is closely related to automation complacency. It creeps in when people give undue weight to the information coming through their monitors. Even when the information is wrong or misleading, they believe it. Their trust in the software becomes so strong that they ignore or discount other sources of information, including their own senses. If you’ve ever found yourself lost or going around in circles after slavishly following flawed or outdated directions from a GPS device or other digital mapping tool, you’ve felt the effects of automation bias. Even people who drive for a living can display a startling lack of common sense when relying on satellite navigation. Ignoring road signs and other environmental cues, they’ll proceed down hazardous routes and sometimes end up crashing into low overpasses or getting stuck in the narrow streets of small towns. In Seattle in 2008, the driver of a twelve-foot-high bus carrying a high-school sports team ran into a concrete bridge with a nine-foot clearance. The top of the bus was sheared off, and twenty-one injured students had to be taken to the hospital. The driver told police that he had been following GPS instructions and “did not see” signs and flashing lights warning of the low bridge ahead.[7]

Automation bias is a particular risk for people who use decision-support software to guide them through analyses or diagnoses. Since the late 1990s, radiologists have been using computer-aided detection systems that highlight suspicious areas on mammograms and other x-rays. A digital version of an image is scanned into a computer, and pattern-matching software reviews it and adds arrows or other “prompts” to suggest areas for the doctor to inspect more closely. In some cases, the highlights aid in the discovery of disease, helping radiologists identify potential cancers they might otherwise have missed. But studies reveal that the highlights can also have the opposite effect. Biased by the software’s suggestions, doctors can end up giving cursory attention to the areas of an image that haven’t been highlighted, sometimes overlooking an early-stage tumor or other abnormality. The prompts can also increase the likelihood of false positives, in which a radiologist calls a patient back for an unnecessary biopsy.
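The mechanics of those prompts are simple enough to sketch. What follows is a hypothetical miniature, not any vendor’s actual system: the pattern-matching model itself is stubbed out as a grid of ready-made suspicion scores, and the function name, threshold, and layout are all invented. It shows only the thresholding step, which is where the bias enters.

    import numpy as np

    def cad_prompts(scores: np.ndarray, threshold: float = 0.7):
        """Return (row, col) coordinates to mark on the displayed image.

        `scores` is a 2D array of per-region suspicion values in [0, 1],
        as some pattern-matching model might produce (stubbed out here).
        Everything at or above `threshold` gets an arrow; everything
        below it gets nothing -- which is why unprompted regions can end
        up receiving only cursory attention from a biased reader.
        """
        rows, cols = np.nonzero(scores >= threshold)
        return list(zip(rows.tolist(), cols.tolist()))

    # Example: a 4x4 grid of region scores; two regions clear the bar.
    scores = np.array([[0.1, 0.2, 0.1,  0.0],
                       [0.3, 0.9, 0.2,  0.1],    # 0.9: prompted
                       [0.1, 0.2, 0.75, 0.1],    # 0.75: prompted
                       [0.0, 0.1, 0.2,  0.6]])   # 0.6: suspicious, but silent
    print(cad_prompts(scores))  # [(1, 1), (2, 2)]

A region scoring just under the bar, like the 0.6 in the corner of this toy grid, produces no mark at all, and nothing on the screen distinguishes “examined and cleared” from “never flagged.”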

A recent review of mammography data, conducted by a team of researchers at City University London, indicates that automation bias has had a greater effect on radiologists and other image readers than was previously thought. The researchers found that while computer-aided detection tends to improve the reliability of “less discriminating readers” in assessing “comparatively easy cases,” it can actually degrade the performance of expert readers in evaluating tricky cases. When relying on the software, the experts are more likely to overlook certain cancers.[8] The subtle biases inspired by computerized decision aids may, moreover, be “an inherent part of the human cognitive apparatus for reacting to cues and alarms.”[9] By directing the focus of our eyes, the aids distort our vision.

Both complacency and bias seem to stem from limitations in our ability to pay attention. Our tendency toward complacency reveals how easily our concentration and awareness can fade when we’re not routinely called on to interact with our surroundings. Our propensity to be biased in evaluating and weighing information shows that our mind’s focus is selective and can easily be skewed by misplaced trust or even the appearance of seemingly helpful prompts. Both complacency and bias tend to become more severe as the quality and reliability of an automated system improve.[10] Experiments show that when a system produces errors fairly frequently, we stay on high alert. We maintain awareness of our surroundings and carefully monitor information from a variety of sources. But when a system is more reliable, breaking down or making mistakes only occasionally, we get lazy. We start to assume the system is infallible.
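That trade-off can be made vivid with a toy simulation. Nothing below comes from the experiments cited here; it is a deliberately crude model, built on an invented rule that the operator’s probability of cross-checking the automation simply tracks the error rate they have come to expect.

    import random

    def missed_failures(error_rate: float, trials: int = 100_000) -> float:
        """Fraction of automation failures the operator never catches.

        Toy assumption: the operator cross-checks any given trial with a
        probability proportional to the failure rate they expect (capped
        at 1.0). Frequent errors keep them alert; rare errors let their
        attention drift. An unchecked failure goes unnoticed.
        """
        random.seed(42)
        check_prob = min(1.0, 10 * error_rate)  # vigilance tracks expectation
        failures = caught = 0
        for _ in range(trials):
            if random.random() < error_rate:        # the automation fails
                failures += 1
                if random.random() < check_prob:    # operator was watching
                    caught += 1
        return 1 - caught / failures if failures else 0.0

    for rate in (0.2, 0.05, 0.01, 0.001):
        print(f"error rate {rate:>6}: {missed_failures(rate):.0%} of failures missed")

As the simulated system gets more reliable, it produces fewer failures in absolute terms, but a far larger share of them slips past the inattentive operator, which is the perverse bargain described above.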
