Restless Giant: The United States From Watergate to Bush v. Gore
By James T. Patterson
Instead, Clinton campaigned as a cautiously progressive “New Democrat” who championed centrist policies directed at bringing middle-class voters back to the party. Democrats, he said, would never win unless they avoided overidentification with interest groups such as labor unions. Though they must protect entitlements such as Social Security and Medicare, they had to cast aside their “old tax-and-spend policies.” Again and again, Clinton insisted that his party must stand for “opportunity,” “responsibility,” and “community.” Like Jimmy Carter, he said that he was a born-again man of faith. During the campaign he stood behind the death penalty. He promised to reform welfare, primarily by limiting the length of time that recipients might receive cash assistance, and to push for enactment of a middle-class tax cut. Without explaining how he would do it, he said he would halve the federal deficit within four years.
68
Clinton, however, had to overcome several political liabilities. Like many well-educated young men during the Vietnam War era, he had manipulated the Selective Service system so as to avoid military service. How could he expect to beat a war hero like Bush? Rumors circulated that he had smoked marijuana. Worst of all, persuasive evidence indicated that as a married man he had been a womanizer. During the early New Hampshire primary, the first major test, the media had a field day tracking down stories about his long-standing sexual relationship with Gennifer Flowers, a former Arkansas state employee. Clinton grumbled, “All I’ve been asked about by the press are a woman I didn’t sleep with and a draft I didn’t dodge.”
But Clinton could not explain all these stories away. It was true that he had outfoxed his draft board and that—like the huge majority of young boomers in the late 1960s—he had smoked marijuana. (With a straight face Clinton said that he had tried it once, but that he didn’t inhale, and never tried it again.) Appearing with his wife, Hillary, on a widely watched showing of 60 Minutes, he sought to downplay stories about his sexual escapades, admitting only that he had “caused pain in his marriage.” In the end, he lost the primary to Tsongas. In the process he also acquired an enduring reputation as an adroit but glib and slippery politician—a “Slick Willie” who promised all manner of things and talked out of all sides of his mouth.
69
One Republican branded him a “pot-smoking, philandering, draft-dodger.”
70
It soon became obvious, however, that Clinton’s sexual exploits were far less damaging than political pundits had been predicting during the primary. Unlike Gary Hart, whose womanizing had knocked him out of the contest for the Democratic presidential nomination of 1988, Clinton came out of New Hampshire in politically good shape. What was happening? Some observers noted that Hart had been hypocritical, posturing as Mr. Clean before being exposed, whereas Clinton made no such claims. Others credited the artful performance that Clinton, a winsome character, had managed on television, and marveled at Hillary, who was standing by him. If Clinton’s behavior was all right with her, voters seemed to say, why should they worry?
In retrospect, it also seems that two long-range developments helped to enable Clinton to survive his deserved reputation as a womanizer. The first was that the ever more inquisitive media had by then helped to accustom Americans to hearing about the sexual transgressions of their political leaders—not only Gary Hart but also FDR, Ike (maybe), JFK, and LBJ. Clinton, though a transgressor, thereby developed a sort of immunity. (A comparable immunity helped to protect later presidential candidates, such as Al Gore in 2000, from attacks concerning youthful use of marijuana.) The second development, related to the first, was one of the many signs that cultural norms had steadily been changing in the United States. Many Americans of Clinton’s huge and politically influential baby boom generation, having matured in the more permissive 1960s and ’70s, understood from personal experience—or from the experiences of friends and relations—that Clinton’s behavior was hardly unique. They were considerably more tolerant of wayward personal behavior than were their elders. What public officials had done in the bedroom, they believed, need not affect their candidacies.
Tsongas, moreover, was not a strong opponent. A self-styled “pro-business liberal,” he was stiff on the stump. Even in New Hampshire, his neighboring state, he had won only 33 percent of the votes compared to Clinton’s 25 percent. Clinton, ever resilient and energetic, emerged confident after New Hampshire, calling himself the “Comeback Kid.” He was comfortable on television and appeared often on talk shows. These were becoming such a key feature of American politics that Russell Baker of the New York Times wrote that presidential campaigns had entered the “Larry King era.”
71
For all these reasons, Clinton attracted ample funding and moved aggressively ahead. On Super Tuesday in March, he won eight of eleven primaries. Tsongas quit the race soon thereafter, and though Brown stayed in the running, Clinton defeated him in key primaries that followed. Well before the Democratic convention, he had the nomination in his pocket. In a move calculated to attract southerners and young people to the Democratic fold, he then named Gore of Tennessee, a fellow baby boomer, as his running mate.
Bush, meanwhile, encountered challenges from conservatives who rallied to Patrick “Pat” Buchanan, a former speechwriter for Nixon, director of communications for Reagan, columnist, and TV talk-show host. Buchanan was a forceful, often strident speaker who enjoyed being called a “right-wing populist” and the “pit bull of the Right.” A strong Catholic, he vehemently opposed abortion and directed his appeals to the Religious Right. An isolationist in his views about foreign policy, he had argued against American military engagement in Iraq, and he called for sharp cutbacks in immigration. Though no one expected him to wrest the GOP nomination from Bush, he ran in the New Hampshire Republican primary, where he received 36 percent of the vote to the president’s 53 percent. His showing fired a flare that illuminated the rift in the party.
72
In April another critic, H. Ross Perot, entered the campaign as an independent candidate. Perot was a billionaire businessman from Texas who paid for his own TV ads and thereby did not have to worry about campaign finance regulations. Announcing his candidacy on Larry King Live, he waged his campaign almost entirely on television, appearing before voters only late in the race. In the course of blasting Bush’s economic policies, the folksy Perot proclaimed that the “engine is broke,” and “we got to get under the hood and fix it.” Concerning Bush’s contributions to the rising national debt, he wisecracked, “Maybe it was voodoo economics. Whatever it is, we are in deep voodoo.” The deficit, he added, was “like a crazy aunt we keep down in the basement. All the neighbors know she’s there but nobody wants to talk about her.”
73
In the summer, when Bush worked for the NAFTA agreement with Canada and Mexico that would eliminate tariffs among the three nations, Perot predicted that it would produce a “giant sucking sound” of American jobs whooshing across the border to the cheap labor markets of Mexico.
While these threats were mounting, Bush was slow to develop an organized team, and he remained strangely distracted and uninvolved.
74
Reporters wrote that he was “out of touch” with ordinary Americans. He did not seem to grasp an important fact about late twentieth-century American politics: Winning a presidential election had come to require full-time, all-absorbing attention. His disengagement proved disastrous at the GOP convention in August, which he carelessly allowed religious conservatives to dominate. They approved a strongly conservative platform that focused on preserving “family values,” called for the restoration of prayer in the public schools, and denounced abortion. Buchanan was given a prime-time speaking slot on the first night of the convention, where he preached a stern and religious message to a nationwide television audience. Slamming “radical feminism,” “abortion on demand,” and “homosexual rights,” he proclaimed, “There is a religious war going on in this country. It is a cultural war as critical to the kind of nation we shall be as the Cold War itself. This war is for the soul of America.”
75
With Bush apparently unable to control his own party, Clinton had a fairly easy time of it. He promised not only a middle-class tax cut but also some form of national health insurance coverage for all Americans. Though he kept his distance from Jesse Jackson and other African Americans on the left, he was highly popular among black voters. Perhaps because of his identity as a southerner, he fared better in the South (later carrying four of eleven southern states) than any Democratic nominee since Carter in 1976.
76
For the most part he stuck to a centrist position that he had earlier held on the issue of crime, notably by leaving the campaign trail and returning to Arkansas in order to approve the execution of a mentally retarded black prisoner.
Clinton paid little attention to foreign affairs. After all, the Cold War had ended, and the nation was at peace. Bush, moreover, was a war hero and therefore a more attractive candidate to many Americans who had fought in World War II or Vietnam. Instead, Clinton led with his strength, domestic policies. Speaking as if the recession had not abated, he concentrated on hammering home a central message: The economic policies of the Reagan and Bush administrations had badly damaged the nation. Republicans, Clinton charged, had compiled “the worst economic record since the Great Depression.” A prominent sign at his “war room” in Little Rock made his strategy clear: The main issue was THE ECONOMY, STUPID.
For all these reasons, Clinton triumphed easily in November. He won 44.9 million votes, 43 percent of the total, to Bush’s 39.1 million, 37.4 percent. Women, ignoring Clinton’s reputation as a philanderer, were stronger for him than men were.
77
In an election that brought almost 13 million more voters to the polls than in 1988, Bush received 9.7 million fewer votes than he had attracted four years earlier. He was strong only in the Sunbelt, the Plains states, and the Mountain West. These areas, which Reagan had also carried and where religious conservatives had been especially active, had become fairly solid Republican country. But Bush was badly beaten in the key urban states of the Northeast, Middle West, and the West Coast. Clinton won a big victory in California, which led the nation with 54 votes in the electoral college. He took the electoral college, 370 to 168.
Most analyses of the voting suggested that Perot’s candidacy had hurt Bush. Perot won 19.7 million votes, or 19 percent of the total. This was the most impressive performance by a third party candidate since Theodore Roosevelt had won 27 percent as the Progressive Party nominee in 1912. Studies estimated that as much as 70 percent of Perot’s support may have come from people who said they had voted for Bush in 1988.
78
The most likely explanation for Perot’s strong showing, which astonished many observers, is that a great many Americans chose to register their dissatisfaction with both parties.
Though Clinton had won handily, he knew he had little to crow about. His percentage of the vote, at 43, was the lowest for any winner since Woodrow Wilson had triumphed with 42 percent in 1912. Even Dukakis, with 45 percent in 1988, had fared a little better. Democrats gained one seat in the Senate, for a margin that would be 57 to 43 in 1993, but lost nine in the House, where they would have an advantage of 258 to 176.
Still, it was a heartening victory for the Democrats, who had been out of the White House for twelve years. Moreover, Clinton, a skilled, often charismatic campaigner, seemed to have papered over at least a few of the cracks that had weakened the Democratic Party since the 1960s. Though liberals worried that he might steer the party too far toward the right, they were delighted to have driven the GOP from power. Because Democrats had gained control of the presidency while maintaining majorities in both houses of Congress, they had good reason to hope that the plague of divided government would not afflict the new administration. With the recession becoming a thing of the past, supporters of Clinton anticipated that his “progressive centrism,” as some people called it, would reinvigorate the nation.
8
“Culture Wars” and “Decline” in the 1990s
Robert Bork, having been denied a seat on the Supreme Court by the Senate in 1987, emerged in the 1990s as a belligerent conservative in the “culture wars,” as contemporary writers saw them, of that contentious decade. He opened his angry, widely noticed Slouching Towards Gomorrah (1996) by citing William Butler Yeats’s “The Second Coming,” written in 1919 in the aftermath of World War I. Among the poem’s despairing lines were these: “Things fall apart; the center cannot hold; / Mere anarchy is loosed upon the world, / The blood-dimmed tide is loosed, and everywhere / The ceremony of innocence is drowned; / The best lack all conviction, while the worst / Are full of passionate intensity.”
1
The subtitle of Bork’s polemical book, Modern Liberalism and American Decline, highlighted two main themes that many conservatives bemoaned in the 1990s: America was in “decline,” and liberals were to blame for culture wars that were splintering the nation. Bork wrote, “There are aspects of almost every branch of our culture that are worse than ever before and the rot is spreading.” He fired at an array of targets: America’s “enfeebled, hedonistic culture,” its “uninhibited display of sexuality,” its “popularization of violence in . . . entertainment,” and “its angry activists of feminism, homosexuality, environmentalism, animal rights—the list could be extended almost indefinitely.” He closed by complaining that the United States was “now well along the road to the moral chaos that is the end of radical individualism and the tyranny that is the goal of radical egalitarianism. Modern liberalism has corrupted our culture across the board.”
2
Bork was by no means the only writer to lament the “decline” of America in the 1990s. Carl Rowan, an African American journalist, weighed in, also in 1996, with an irate book titled The Coming Race War in America: A Wake-up Call. Though his major target was the totally different one of white racism, Rowan agreed that the United States was “in decline . . . on the rocks spiritually, morally, racially, and economically.” He added, “Everywhere I see signs of decadence, decay, and self-destruction.” America, he said, was “sinking in greed” and in “sexual rot and gratuitous violence.” Inviting readers’ attention to the fates of ancient Rome and Greece, and of the British Empire, Rowan despaired, “this country . . . is in precipitous decline.”
3
Perceptive readers might have noted that Rowan’s allusion to Rome, Greece, and the British Empire echoed warnings that Paul Kennedy, a liberal, had issued in his widely cited Rise and Fall of the Great Powers, published in 1987. They would also have known that the ideological and cultural warfare that seemed to wrack the early and mid-1990s had its origins in battles that had escalated as far back as the 1960s. These had heated up in the late 1980s, when the literary critic Allan Bloom, in his aggressively titled The Closing of the American Mind, had lashed out against the trivialization of American intellectual life. In the same year, E. D. Hirsch Jr., in Cultural Literacy: What Every American Needs to Know, more temperately complained of what he and fellow authors considered to be the bewildering and culturally divisive nature of curricula in the schools.
4
Jeremiads about “American decline,” however, seemed to hit a larger cultural nerve in the early and mid-1990s. Many of these, like Bork’s, emanated from conservatives who were feeling marginalized by liberalizing cultural changes and who were outraged by what they perceived as ever expanding evils: sexual immorality, violent crime, vulgarity and sensationalism in the media, schools without standards, trash that passed as “art,” and just plain bad taste.
5
As Zbigniew Brzezinski wrote in 1993, a “massive collapse . . . of almost all established values” threatened to destroy American civilization.
6
What was one supposed to think, other critics demanded, about a jury that awarded a woman $2.9 million because she had spilled hot coffee from McDonald’s that had badly scalded her? Or about Bill Clinton, president of the United States, who was asked on MTV whether he wore boxers or briefs? Perhaps thinking of the youth vote, Clinton replied, “Usually boxers.”
As earlier, many conservative writers located the source of cultural decline in the way Americans—especially boomers—had raised their children. For culture warriors such as these, old-fashioned “family values” were among the highest of virtues. Alarmed by what they perceived as the catch-as-catch-can quality of family life, they highlighted articles that reported only 30 percent of American families sitting down to eat supper together—as opposed to 50 percent that were supposed to have done so in the 1970s. As David Blankenhorn, director of the revealingly named Institute for American Values, exclaimed in 1993, America’s central problem was “family decline.” He added, “It’s not ‘the economy, stupid.’ It’s the culture.”
7
Religious conservatives, enlarging organizations such as the Family Research Council, swelled this chorus of laments, evoking outcries from liberals who warned that the Religious Right was becoming ever more aggressive in waging wars against abortion, gay rights, and other matters.
8
Though Falwell, having weakened the Moral Majority by giving highly controversial speeches (in one, he defended the apartheid policies of South Africa), disbanded the organization in early 1989, a new force, the Christian Coalition, grew out of Pat Robertson’s presidential campaign of 1988. Appearing on the scene in 1989, it rapidly gained visibility under the leadership of Ralph Reed, a young, boyish-faced Georgian who had earlier headed the College Republican National Committee. Reed displayed extraordinary political, organizational, and fund-raising skills and managed at the same time to earn a PhD in history from Emory University in 1991. By mid-1992, the Christian Coalition claimed to have more than 150,000 members and to control Republican parties in several southern states.
9
In the early 1990s, another religious group, the Promise Keepers, also came into being. Founded by Bill McCartney, football coach at the University of Colorado, this was an all-male organization of evangelical Christians who vowed to cherish their wives and children and thereby strengthen family life in America. Growing slowly at first, Promise Keepers surged forward by mid-decade. At its peak in 1997, it staged a massive meeting and rally on the mall in Washington, where an estimated 480,000 men promised to be loving and supportive husbands and fathers.
Many Americans who joined groups such as these were still contesting the divisive cultural and political legacy of the 1960s—a secular legacy, as they saw it, of pot smoking, bra burning, love beads, radical feminism, black power, crime in the streets, pornography and sexual license, abortion, family decline, Darwinian ideas of evolution, and gross-out popular culture. Stung by what they considered to be the hauteur of upper-middle-class liberals, they complained that an elitist left-wing/liberal culture had captured universities, foundations, Hollywood, and the media. A “great disruption” was ravaging late twentieth-century America.
10
Liberals rejected these laments, perceiving them as rants by the Christian Right, political Neanderthals, and hyper-patriots who were endangering civil rights and civil liberties and threatening the tolerant values of the nation. But some on the left, too, worried about social and cultural decline. In 1993, Senator Moynihan of New York, a prominent liberal, wrote an article that attracted considerable attention. It argued that America was “defining deviancy down”—that is, too calmly accepting as normal all sorts of once stigmatized and often dysfunctional behaviors, such as out-of-wedlock pregnancy.
11
Other liberals identified different signs of decline: conspicuous consumption, rising inequality, and misallocation of resources that—in the phrase of one frightened writer—threatened to turn the United States into a “third world country.”
12
Still other observers of cultural developments in the early and mid-1990s, many but not all of them centrist or slightly left of center in their politics, advanced a communitarian movement. As Robert Bellah and his co-authors had argued in Habits of the Heart (1985), they asserted, unbridled individualism was undermining America’s long-admired capacity for cooperative community involvement.
13
Popularizing a similar view in 1995, Harvard professor Robert Putnam published a widely discussed article titled “Bowling Alone.” Putnam observed that while bowling remained a popular sport, people were less likely to bowl together as teammates in leagues. Instead, they bowled alone. Americans, he said, were turning inward and looking out for themselves. Rising numbers of rich people were moving into gated enclaves. The American people, renowned for their voluntarism, were becoming more fragmented, isolated, and detached from community concerns.
To Putnam and others, the fate of league bowling was symptomatic of a larger, generational abdication from participation in the group activities that America’s historically community-minded people had earlier been famous for: voting, churchgoing, home entertaining, and volunteering in beneficent organizations such as service clubs, the YMCA, the PTA, and the League of Women Voters. Critics such as these had a point: Many of the groups that in the 1980s and 1990s did report large and growing memberships—such as the AARP and the Sierra Club—were largely top-down, professionally managed organizations that relied on foundations, mass mailings, and manipulation of the media to amass financial resources. Endangered, it seemed, were grass-roots, face-to-face meetings of concerned and unpaid local people who devoted time and effort in order to promote better communities.
14
Putnam considered a number of possible reasons for these alleged declines in community-minded activity—people were ever more busy commuting; two-parent working schedules were cutting into time once available for volunteering (a great deal of which in the past had been carried out by women); the ever wider reach of big business and globalization were eclipsing local attachments; the extension of big government was centralizing things—before concluding that the loss of community stemmed mainly from generational trends: The public-spirited generation that had grown up during the Depression and World War II was slowly dying out.
Like many Americans at the time, Putnam deplored developments in the media, especially the negative impact of television. Thanks in part to the proliferation of cable channels, many of them relying on niche marketing to attract people with special interests, Americans had become less likely than in the past to receive their information from newspapers, or even from network news. Worse, he said, Americans devoted long hours to mindless watching of the tube—an unblinking eye that was titillating the masses. To Putnam, the absorption of people in television since the 1950s was a fundamental source of the increasing isolation of Americans from one another.
15
A host of scholars and journalists jumped into vigorous debates with communitarians and with writers such as Putnam. Laments about the baneful influence of television, some of these debaters observed, had a long history that dated at least to highly publicized congressional hearings held in 1954 by Senator Estes Kefauver of Tennessee. In 1977, a widely noted book, The Plug-In Drug, had bemoaned the deleterious effects of TV on children and family life.
16
Though the introduction in the 1980s of remote controls greatly advanced mindless channel surfing, it remained difficult in the 1990s, as earlier, to prove that the impact of television on community life had become much more damaging over time.
Some writers who joined these debates wondered if people like Putnam were clinging to romantic concepts of the past. The ideal of “community,” they agreed, was attractive, but had it ever taken very deep hold in the United States, a dynamic land of capitalist energy and of restless geographical mobility? Others in these debates, taking a different tack, pointed out that conservative religious groups in the 1980s and 1990s were very much engaged in grass-roots community-building, though normally only among like believers. Still, concerns over decline in community, though challenged, seemed to resonate widely at the time, earning Putnam coverage in People magazine and an invitation to Camp David to meet President Clinton. “Americans,” Putnam proclaimed, “have been dropping out in droves, not merely from political life, but from organized community life more generally.”
17
“Wars” over cultural preferences and standards were hardly new in the early 1990s: In a multicultural nation of diverse religious and racial groups and of immigrants such as the United States, these struggles had a long history.
18
In the early and mid-1990s, however, they coexisted with, and drew sustenance from, the often harsh partisan political struggles of the Clinton years. Though accounts in the media often exaggerated the ferocity and power of the “wars,” the controversies did seem more clamorous than they had been in the past.