Premillennial tensions probably do help explain some of our collective irrationality. Living in a scientific era, most of us grant the arbitrariness of reckoning time in base-ten rather than, say, base-twelve, and from the birth of Christ rather than from the day Muhammad moved from Mecca to Medina. Yet even the least superstitious among us cannot quite manage to think of the year 2000 as ordinary. Social psychologists have long recognized a human urge to convert vague uneasiness into definable concerns, real or imagined. In a classic study thirty years ago Alan Kerckhoff and Kurt Back pointed out that “the belief in a tangible threat makes it possible to explain and justify one’s sense of discomfort.”
Some historical evidence also supports the hypothesis that people panic at the brink of centuries and millennia. Witness the “panic terror” in Europe around the year 1000 and the witch hunts in Salem in the 1690s. As a complete or dependable explanation, though, the millennium hypothesis fails. Historians emphasize that panics of equal or greater intensity occur in odd years, as demonstrated by anti-Indian hysteria in the mid-1700s and McCarthyism in the 1950s. Scholars point out too that calendars cannot account for why certain fears occupy people at certain times (witches then, killer kids now).
Another popular explanation blames the news media. We have so many fears, many of them off-base, the argument goes, because the media bombard us with sensationalistic stories designed to increase ratings. This explanation, sometimes called the media-effects theory, is less simplistic than the millennium hypothesis and contains sizable kernels of truth. When researchers from Emory University computed the levels of coverage of various health dangers in popular magazines and newspapers they discovered an inverse relationship: much less space was devoted to several of the major causes of death than to some uncommon causes. The leading cause of death, heart disease, received approximately the same amount of coverage as the eleventh-ranked cause of death, homicide. They found a similar inverse relationship in coverage of risk factors associated with serious illness and death. The lowest-ranking risk factor, drug use, received nearly as much attention as the second-ranked risk factor, diet and exercise.
Disproportionate coverage in the news media plainly has effects on readers and viewers. When Esther Madriz, a professor at Hunter College, interviewed women in New York City about their fears of crime they frequently responded with the phrase “I saw it in the news.” The interviewees identified the news media as both the source of their fears and the reason they believed those fears were valid. Asked in a national poll why they believe the country has a serious crime problem, 76 percent of people cited stories they had seen in the media. Only 22 percent cited personal experience.
When professors Robert Blendon and John Young of Harvard analyzed forty-seven surveys about drug abuse conducted between 1978 and 1997, they too discovered that the news media, rather than personal experience, provide Americans with their predominant fears. Eight out of ten adults say that drug abuse has never caused problems in their family, and the vast majority report relatively little direct experience with problems related to drug abuse. Widespread concern about drug problems emanates, Blendon and Young determined, from scares in the news media, television in particular.
Television news programs survive on scares. On local newscasts, where producers live by the dictum “if it bleeds, it leads,” drug, crime, and disaster stories make up most of the news portion of the broadcasts. Evening newscasts on the major networks are somewhat less bloody, but between 1990 and 1998, when the nation’s murder rate declined by 20 percent, the number of murder stories on network newscasts increased 600 percent (not counting stories about O.J. Simpson).
After the dinnertime newscasts the networks broadcast newsmagazines, whose guiding principle seems to be that no danger is too small to magnify into a national nightmare. Some of the risks reported by such programs would be merely laughable were they not hyped with so much fanfare: “Don’t miss Dateline tonight or YOU could be the next victim!” Competing for ratings with drama programs and movies during prime-time evening hours, newsmagazines feature story lines that would make a writer for “Homicide” or “ER” wince.
“It can happen in a flash. Fire breaks out on the operating table. The patient is surrounded by flames,” Barbara Walters exclaimed on ABC’s “20/20” in 1998. The problem—oxygen from a face mask ignited by a surgical instrument—occurs “more often than you might think,” she cautioned in her introduction, even though reporter Arnold Diaz would note later, during the actual report, that out of 27 million surgeries each year the situation arises only about a hundred times. No matter, Diaz effectively nullified the reassuring numbers as soon as they left his mouth. To those who “may say it’s too small a risk to worry about” he presented distraught victims: a woman with permanent scars on her face and a man whose son had died.
The gambit is common. Producers of TV newsmagazines routinely let emotional accounts trump objective information. In 1994 medical authorities attempted to cut short the brouhaha over flesh-eating bacteria by publicizing the fact that an American is fifty-five times more likely to be struck by lightning than die of the suddenly celebrated microbe. Yet TV journalists brushed this fact aside with remarks like, “whatever the statistics, it’s devastating to the victims” (Catherine Crier on “20/20”), accompanied by stomach-turning videos of disfigured patients.
Sheryl Stolberg, then a medical writer for the Los Angeles Times, put her finger on what makes the TV newsmagazines so cavalier: “Killer germs are perfect for prime time,” she wrote. “They are invisible, uncontrollable, and, in the case of Group A strep, can invade the body in an unnervingly simple manner, through a cut or scrape.” Whereas print journalists only described in words the actions of “billions of bacteria” spreading “like underground fires” throughout a person’s body, TV newsmagazines made use of special effects to depict graphically how these “merciless killers” do their damage.
In Praise of Journalists
Any analysis of the culture of fear that ignored the news media would be patently incomplete, and of the several institutions most culpable for creating and sustaining scares the news media are arguably first among equals. They are also the most promising candidates for positive change. Yet by the same token critiques such as Stolberg’s presage a crucial shortcoming in arguments that blame the media. Reporters not only spread fears, they also debunk them and criticize one another for spooking the public. A wide array of groups, including businesses, advocacy organizations, religious sects, and political parties, promote and profit from scares. News organizations are distinguished from other fear-mongering groups because they sometimes bite the scare that feeds them.
A group that raises money for research into a particular disease is not likely to negate concerns about that disease. A company that sells alarm systems is not about to call attention to the fact that crime is down. News organizations, on the other hand, periodically allay the very fears they arouse to lure audiences. Some newspapers that ran stories about child murderers, rather than treat every incident as evidence of a shocking trend, affirmed the opposite. After the schoolyard shooting in Kentucky the New York Times ran a sidebar alongside its feature story with the headline “Despite Recent Carnage, School Violence Is Not on Rise.” Following the Jonesboro killings it ran a similar piece, this time on a recently released study showing the rarity of violent crimes in schools.
Several major newspapers parted from the pack in other ways. USA Today and the Washington Post, for instance, made sure their readers knew that what should worry them is the availability of guns. USA Today ran news stories explaining that easy access to guns in homes accounted for increases in the number of juvenile arrests for homicide in rural areas during the 1990s. While other news outlets were respectfully quoting the mother of the thirteen-year-old Jonesboro shooter, who said she did not regret having encouraged her son to learn to fire a gun (“it’s like anything else, there’s some people that can drink a beer and not become an alcoholic”), USA Today ran an op-ed piece proposing legal parameters for gun ownership akin to those for the use of alcohol and motor vehicles. And the paper published its own editorial in support of laws that require gun owners to lock their guns or keep them in locked containers. Adopted at that time by only fifteen states, the laws had reduced the number of deaths among children in those states by 23 percent.
The Washington Post, meanwhile, published an excellent investigative piece by reporter Sharon Walsh showing that guns increasingly were being marketed to teenagers and children. Quoting advertisements and statistics from gun manufacturers and the National Rifle Association, Walsh revealed that by 1998 the primary market for guns—white males—had been saturated and an effort to market to women had failed. Having come to see children as its future, the gun industry has taken to running ads like the one Walsh found in a Smith & Wesson catalog: “Seems like only yesterday that your father brought you here for the first time,” reads the copy beside a photo of a child aiming a handgun, his father by his side. “Those sure were the good times—just you, dad and his Smith & Wesson.”
As a social scientist I am impressed and somewhat embarrassed to find that journalists, more often than media scholars, identify the jugglery involved in making small hazards appear huge and huge hazards disappear from sight. Take, for example, the scare several years ago over the Ebola virus. Another Washington Post reporter, John Schwartz, identified a key bit of hocus-pocus used to sell that scare. Schwartz called it “the Cuisinart Effect,” because it involves the mashing together of images and story lines from fiction and reality. A report by Dateline NBC on deaths in Zaire, for instance, interspersed clips from Outbreak, a movie whose plot involves a lethal virus that threatens to kill the entire U.S. population. Alternating between Dustin Hoffman’s character exclaiming, “We can’t stop it!” and real-life science writer Laurie Garrett, author of The Coming Plague, proclaiming that “HIV is not an aberration ... it’s part of a trend,” Dateline’s report gave the impression that swarms of epidemics were on their way.
Another great journalist-debunker, Malcolm Gladwell, noted that the book that had inspired Outbreak, Richard Preston’s The Hot Zone, itself was written “in self-conscious imitation of a sci-fi thriller.” In the real-world incident that occasioned The Hot Zone, monkeys infected in Zaire with a strain of Ebola virus were quarantined at a government facility in Reston, Virginia. The strain turned out not to be lethal in humans, but neither Preston in his book nor the screenwriters for Outbreak nor TV producers who sampled from the movie let that anticlimax interfere with the scare value of their stories. Preston speculates about an airborne strain of Ebola being carried by travelers from African airports to European, Asian, and American cities. In Outbreak hundreds of people die from such an airborne strain before a cure is miraculously discovered in the nick of time to save humanity. In truth, Gladwell points out in a piece in The New Republic, an Ebola strain that is both virulent to humans and airborne is unlikely to emerge and would mutate rapidly if it did, becoming far less potent before it had a chance to infect large numbers of people on a single continent, much less throughout the globe. “It is one of the ironies of the analysis of alarmists such as Preston that they are all too willing to point out the limitations of human beings, but they neglect to point out the limitations of microscopic life forms,” Gladwell notes.
Such disproofs of disease scares appear rather frequently in general-interest magazines and newspapers, including in publications where one might not expect to find them. The Wall Street Journal, for instance, while primarily a business publication and itself a retailer of fears about governmental regulators, labor unions, and other corporate-preferred hobgoblins, has done much to demolish medical myths. Among my personal favorites is an article published in 1996 titled “Fright by the Numbers,” in which reporter Cynthia Crossen rebuts a cover story in Time magazine on prostate cancer. One in five men will get the disease, Time thundered. “That’s scary. But it’s also a lifetime risk—the accumulated risk over some 80 years of life,” Crossen responds. A forty-year-old’s chance of coming down with (not dying of) prostate cancer in the next ten years is 1 in 1,000, she goes on to report. His odds rise to 1 in 100 over twenty years. Even by the time he’s seventy, he has only a 1 in 20 chance of any kind of cancer, including prostate.
In the same article Crossen counters other alarmist claims as well, such as the much-repeated pronouncement that one in three Americans is obese. The number actually refers to how many are overweight, a less serious condition. Fewer are obese (a term that is less than objective itself), variously defined as 20 to 40 percent above ideal body weight as determined by current standards.
Morality and Marketing
To blame the media is to oversimplify the complex role that journalists play as both proponents and doubters of popular fears. It is also to beg the same key issue that the millennium hypothesis evades: why particular anxieties take hold when they do. Why do news organizations and their audiences find themselves drawn to one hazard rather than another?