Authors: Larry Schweikart, Michael Allen
Food rapidly democratized and diversified, with the specialized dishes of the elites spreading to the middle class throughout the country. Soldiers who had come back from Italy had a yearning for pasta; New Yorkers who knew Coney Island learned the magic of the hot dog and took the concept with them as they traveled. Asian recipes moved inward from the coasts as Mexican cuisine surged northward. America’s eating establishments became the most richly textured on the planet, with the most varied menus anywhere in the world. Within thirty years, Jamaican hot peppers, Indian curry sauce, flour tortillas, lox, teriyaki sauce, Dutch chocolates, innumerable pasta variations, and spices of all descriptions flooded the shelves of American grocers, allowing a cook in North Dakota to specialize in cashew chicken, N’Awlins shrimp, or enchiladas. Not surprisingly, some of the most celebrated chefs to come out of this era drew upon their ethnic roots or experiences for their cooking. Martha Stewart (born Martha Kostyra) frequently prepared Polish dishes. And Julia Child, who worked in Asia with the Office of Strategic Services and then lived in Paris (where she learned to cook), had a broad firsthand exposure to foreign cuisine. Emeril Lagasse, another future star, born in the early 1960s, earned his chef’s apron in his parents’ Portuguese bakery.
Far from a decade of conformity, as expressed in the lamentations of books on corporate America, such as William Whyte’s The Organization Man (1956) or Sloan Wilson’s The Man in the Gray Flannel Suit (1955), the population had entered a period of sharp transition in which technology was the handmaiden of turmoil. These books and others, such as David Riesman’s The Lonely Crowd (1950), emphasized a shift from rugged individualism to a team or corporate orientation. Reality was quite different: American conformity in fact kept the sudden and difficult transitions of the postwar world from careening out of control. It is not surprising that the two most popular movie stars of the day—the establishment’s John Wayne and the counterculture’s James Dean—in different ways celebrated rugged individualism, not conformity.
No one symbolized the effort to maintain continuity between 1950s America and its small-town roots and patriotic past more than painter Norman Rockwell (1894–1978), who in some ways was the most important and significant American artist in the history of the Republic. Born in New York, Rockwell left school in 1910 to study at the National Academy of Design. Almost immediately his work found an audience and a market. He painted Christmas cards before he was sixteen, and while still a teenager was hired to paint the covers of Boys’ Life, the official publication of the Boy Scouts of America.68
Setting up a studio in New Rochelle, Rockwell worked for a number of magazines until he received a job in 1916 painting covers for The Saturday Evening Post, a magazine Rockwell called the “greatest show window in America.” In all, Rockwell painted 322 covers for the Post, and illustrated children’s books before he began painting for Look magazine.
Critics despised Rockwell because he presented an honest, yet sympathetic and loving, view of America.69 He insisted on painting those scenes that captured the American spirit of family—independence, patriotism, and commitment to worship. Inspired by one of Franklin Roosevelt’s speeches, Rockwell produced his masterpieces, the Four Freedoms, which ran in consecutive issues of the Post in 1943 along with interpretive essays by contemporary writers. Freedom from Want was inspired by his family’s cook presenting a turkey at Thanksgiving. Freedom of Speech, possibly the best-known Rockwell painting of all, features a small-town meeting in which a laborer in a brown leather jacket speaks with confidence about a bill or proposal tucked in his pocket.
Rockwell did not ignore the serious deficiencies of American society. His 1964 Look painting The Problem We All Live With remains one of the most powerful indictments of racial discrimination ever produced. Depicting the desegregation of a New Orleans school district in 1960, Rockwell painted a little black girl, Ruby Bridges, being escorted into the formerly all-white school by four federal marshals. The wall in the background has the splattered remains of a tomato just under the graffito “nigger” that appears above her head. New Kids in the Neighborhood (1967) pictures a moving van with two African American kids standing beside it—the new kids staring at three white children who are looking at them with curiosity, not anger or fear.
Rockwell’s paintings capture a stability in a sea of unraveling social and regional bonds. Religion tried to adapt to these changes but failed. It took outsiders, such as Billy Graham and Oral Roberts, to cut through the serenity, comfort, and even sloth of the mainstream religions to get Christianity focused again on saving the lost and empowering the body of Christ. Clinging to stability and eschewing change came at a price: the lack of passion and avoidance of contention in many denominations triggered a staggering decline in membership. One researcher found that starting in 1955, the Methodist Church lost an average of a thousand members every week for the next thirty years.70 In the mid-1950s, churches responded by becoming more traditional and turning down the doctrinal voltage.
As religion grew less denominationally contentious, thus making it less important to live near those of a similar denomination, Americans found one less impediment to relocating to other cities or regions of the country. The market played a role in this sense of regional familiarity too. Entire industries sprang up to meet the demands of an increasingly mobile population. For example, Kemmons Wilson, a Tennessee homebuilder, traveled with his family extensively and was irritated by the quality of hotels and the fact that most hotels or motels charged extra for children. Wilson and his wife embarked on a cross-country trip in which they took copious notes about every motel and hotel where they stayed: size of rooms, facilities, cost, and so on. He then returned home to design the model motel of optimal size, comfort, and pricing—with kids staying free. The result—Holiday Inn—succeeded beyond Wilson’s wildest dreams. By 1962, Wilson had 250 motels in some 35 states. Wilson saw standardization as the key. Each Holiday Inn had to be the same, more or less, as any other. That way, travelers could always count on a “good night’s sleep,” as he said later. Americans’ quest for familiar products, foods, and even fuel and music in an age of growing mobility produced a vast market waiting to be tapped.71
Ray Kroc saw that potential. A middle-aged paper-cup salesman who also sold a multiple-milk-shake mixer, Kroc was impressed with a California hamburger stand owned by a pair of brothers named McDonald. He purchased the rights to the name and the recipes and standardized the food. All burgers, fries, and milk shakes at all locations had to be made in exactly the same way. In 1955 he opened the first McDonald’s drive-in restaurant in Des Plaines, Illinois, replete with its characteristic golden arches. After five years, there were two hundred McDonald’s restaurants in the United States, and Kroc was opening a hundred more per year.72 By the twenty-first century, “fast food” had become a derogatory term. But fifty years earlier, when truckers planned their stops at roadside truck cafés, the appearance of a McDonald’s restaurant in the distance, with its consistent level of food quality, brought nothing but smiles.
What Norman Rockwell had done for canvas, Kroc and Wilson did for food and lodging, in the sense that they provided buoys of familiarity in a sea of turbulence and international threats. Americans needed—indeed, demanded—a number of consistent threads, from music to meals, from autos to dwellings, within which to navigate the sea of transformation in which they found themselves.
The Invisible Man
One of the main arenas where Americans confronted radical change in the 1950s was race relations. The continued injustice of a segregated society in which black people were either second-class citizens or, in more “sophisticated” cities, merely invisible, had finally started to change. Ralph Ellison’s novel Invisible Man eloquently captured the fact that to most white Americans, blacks simply did not exist. Television shows never depicted blacks in central roles; black or “nigger” music, as white-dominated radio stations called it, was banned from playlists (as, early on, was Elvis Presley, whom disc jockeys assumed to be black). One could search in vain for African American executives heading major white-owned companies.
Few blacks were even remotely equal to whites in economic, political, or cultural power. This situation existed across the nation, where it was winked at or deliberately ignored by most whites. But in the South racism was open and institutionalized in state and local laws. Since Plessy v. Ferguson (1896), the doctrine of “separate but equal” had governed southern public facilities, including schools, transportation, public restrooms, and drinking fountains, as well as the vast majority of private restaurants and the housing market. On municipal buses, for example, blacks were required to give up their seats to whites, and were always expected to go to the back or middle of the bus. Segregation of the races extended to everything from church services to whites-only diners. State universities in many southern states would not admit blacks, nor was any black—no matter how affluent—permitted to join country clubs or civic groups. Indeed, even as late as the 1990s, when the black/Asian golfer Tiger Woods became the youngest pro golfer to win the Masters, he was prohibited from joining some of the private golf clubs at which he had played as part of the Professional Golfers’ Association tour. Also in the 1990s, the famous televangelist pastor Frederick K. C. Price was not invited to speak at certain churches because of his skin color.
Large numbers—if not the vast majority—of whites entertained some racial prejudice, if not outright racism. Confederate flag-wavers, white-robed Ku Klux Klansmen (whose membership had plummeted since the 1920s), and potbellied southern sheriffs still stood out as not-so-comical symbols of white racism. Equally dangerous to blacks, though, were well-meaning whites, especially northeastern liberals, who practiced a quiet, and perhaps equally systematic, racism. Those northern white elites would enthusiastically and aggressively support the fight for civil rights in the South while carefully segregating their own children in all-white private schools. They overwhelmingly supported public school systems with their votes and their editorials, but insulated their own children from exposure to other races by sending them to Andover or Sidwell Friends. Few had personal acquaintances who were black, and fewer still, when it was in their power, appointed or promoted blacks to corporate, church, or community positions.
Not surprisingly, this subterranean prejudice was at its worst in liberal meccas such as Hollywood and New York City, where television production headquarters selected the programming for virtually all TV broadcasting in the 1950s and early 1960s. With the notable exception of Amos and Andy—a carryover from radio, where its lead characters had been voiced by white actors—black television characters were nonexistent except as occasional servants, sources of comic relief, or dancers. There were no black heroes on television; worse, there were no black families. Black children did not have many good role models on television, and those African Americans they did see were seldom entrepreneurs, political leaders, or professionals. Perhaps not surprisingly, the wholesale exclusion of blacks from large segments of American society made African Americans suspicious of the few who did achieve positions of importance in white business or culture. Ellison’s Invisible Man appropriately captured white America’s treatment of more than 10 percent of its population.
Hardly in the vanguard of civil rights, Eisenhower shielded himself from controversy behind the separation of powers. His position, while perhaps appropriate at times, nevertheless came at the expense of the constitutionally protected civil rights of blacks and cemented the view among black politicians that their only source of support was the Democratic Party. It is ironic, then, that two key events in America’s racial history occurred during Eisenhower’s presidency. The Legal Defense and Educational Fund of the NAACP (National Association for the Advancement of Colored People), led by its director, attorney Thurgood Marshall, had earlier begun to take on the “separate but equal” Plessy decision. Marshall had laid the groundwork with a Texas case, Sweatt v. Painter (1950), in which the Supreme Court found that intangible factors, such as isolation from the legal market, constituted inequality. The real breakthrough, however, came in 1954 through a case from Topeka, Kansas, in which the Supreme Court’s ruling in Brown v. Board of Education overturned Plessy v. Ferguson and prohibited state-sponsored segregation in public schools.
The Reverend Oliver Brown, whose daughter Linda had to walk past a white school to catch her bus to a black school, had brought a suit against the Board of Education of Topeka, Kansas.73 The board argued that its schools were separate but equal (à la Plessy). In 1953, President Eisenhower had appointed a Republican, Earl Warren of California, as chief justice. This appointment brought about a shift in the Court against Plessy: the justices now held that separate educational facilities were inherently unequal. A year later, the Court required that states with segregated districts (twenty-one states and the District of Columbia) desegregate with “all deliberate speed.” In 1956, southern members of Congress, overwhelmingly Democrats, issued a defiant “Southern Manifesto,” in which nineteen senators and eighty-one congressmen promised to use “all lawful means” to reverse the decision and preserve segregation.