Circuit City declined to share specific details of its new business plan, but one detail did emerge: employees working in the computer department were let go if their wage exceeded $15.50 an hour, regardless of seniority. Company executives were not affected by the layoffs and pay cuts. Nor were they affected in 2003, when 20 percent of the Circuit City workforce either lost their jobs or had their commissions cut. Meanwhile, customers stayed away in droves. Without skilled staff, what was once a full-service electronics retailer shriveled into a glorified delivery service, shuffling product from factory to store to customer. The company continued to lose money, and in the final months of 2008 it filed for bankruptcy protection.
Circuit City’s 2007 “wage management initiative” was reported only feebly by the press, which by then was inured to skilled workers being trampled in a stampede of cost reductions. The news then was not of falling wages but of rising consumer prices. The Consumer Price Index—a measure of the average change over time in the prices of goods and services including food, clothing, shelter, fuel, transportation fares, and health care—increased 4.2 percent from May 2007 to May 2008. A closer look reveals that these increases were not evenly distributed across product and service categories. The cost of housing was up 3.3 percent, food up 5.1 percent, and energy up an alarming 17.4 percent. But the cost of clothing, which had been falling for decades, actually declined a further 0.6 percent. In fact, once food and energy were stripped from the equation, the Consumer Price Index increased by only 2.3 percent over that year, the largest portion of which was traced to an 8.1 percent rise in transportation costs.
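Stripping food and energy from the equation is what economists call the core CPI: the same weighted average of category price changes, recomputed with the two most volatile categories left out and the remaining weights rescaled. Here is a minimal sketch of that mechanic in Python. The expenditure weights below are hypothetical placeholders (the Bureau of Labor Statistics publishes the actual relative-importance weights, which are not reproduced here), so only the direction of the result, not the exact 4.2 and 2.3 percent figures cited above, should be expected to match.

```python
# Year-over-year price changes, May 2007 to May 2008, as cited above (percent).
price_changes = {"housing": 3.3, "food": 5.1, "energy": 17.4,
                 "apparel": -0.6, "transportation": 8.1}

# Hypothetical expenditure weights, for illustration only.
weights = {"housing": 0.42, "food": 0.14, "energy": 0.09,
           "apparel": 0.04, "transportation": 0.17}

def weighted_change(changes, weights, exclude=()):
    """Weighted average of category price changes, rescaling the
    weights after any excluded categories are dropped."""
    kept = [k for k in changes if k not in exclude]
    total = sum(weights[k] for k in kept)
    return sum(changes[k] * weights[k] for k in kept) / total

headline = weighted_change(price_changes, weights)
core = weighted_change(price_changes, weights, exclude=("food", "energy"))
print(round(headline, 1), round(core, 1))  # the core figure comes out lower,
                                           # because food and energy rose fastest
```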
Increases in the cost of food are always significant and for some people extremely painful. But to put it into perspective, the spurt in food costs in mid-2008 came on the heels of decades of sharp declines, at least relative to income. Data from the U.S. Department of Agriculture indicate that in 1929 Americans spent 20.3 percent of their disposable personal income on food at home, and an additional 3.1 percent on food away from home. In 1970 the at-home percentage had been slashed nearly in half, to 10.3 percent, while the away-from-home percentage increased only slightly, to 3.6 percent. The years that followed saw a steady decline in total food expenditures as a percentage of disposable income: In 2007, at-home food consumption had dropped to a mere 5.7 percent of income, and away-from-home had increased slightly to 4.1 percent of income. Even allowing for 2008 increases, food has for decades been a bargain in America, which is one reason that the poor have a far greater risk than the rich of becoming dangerously obese.
What we spend on food is to some degree under our control. Many of us (though certainly not all of us) who need or choose to lower our individual food costs might do so by eating out less and by reducing the amount of food we waste, which for most of us is substantial. This same sort of “trading down” also applies to most consumer goods. We can hand down T-shirts from one child to the next, buy furniture at secondhand shops, hang on to our cars for ten years rather than six, and send handmade cards or crafts instead of buying expensive gifts. These changes are not easy, but for most of us they can be made without heartbreaking disruption and hardship. This may seem a good thing, and in many senses it is. Few Americans have to concern themselves with starvation or worry about freezing to death for lack of a warm coat. But, unfortunately, the same cannot always be said for coping with daunting increases in the price of heating fuel, education, public transportation, rental housing, and, most spectacularly, health care. For this we need income, and for millions of Americans income has not kept pace.
Thanks in part to the national insistence on low-priced consumer goods, wages and benefits have barely budged to accommodate the rising costs of most essentials. Technology-driven efficiencies have given us access to the best deals from around the world, but these deals come out of our communal paychecks. Under President George W. Bush and a compliant Republican Congress, the minimum wage was stalled at $5.15 an hour for nearly a decade. A full-time worker earning $5.15 an hour grosses $10,712 a year, well below the poverty line of $16,079 for a family of three. In 2007 the Democratic Congress raised the minimum wage to $7.25 over two years, a real victory for low-wage workers. But even this improved wage is lower in real dollars than the minimum wage of 1960. Retail workers generally make more than the minimum wage, but not enough more to allow them entry into the middle class. Nearly one-third of all working Americans living in poverty are employed in the retail sector. At this writing the average hourly wage for department store associates is $8.79, according to the U.S. Department of Labor, and the annual mean wage is $18,280—that is, except for clothing store clerks, who make less.
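The $10,712 gross follows from simple arithmetic, assuming the conventional full-time schedule of 40 hours a week for 52 weeks a year (the schedule is an assumption, not a detail given above). A quick check in Python:

```python
hourly_wage = 5.15     # federal minimum wage, stalled from 1997 to 2007
hours_per_week = 40    # conventional full-time schedule (an assumption)
weeks_per_year = 52

annual_gross = hourly_wage * hours_per_week * weeks_per_year
print(f"${annual_gross:,.0f}")  # $10,712 -- matching the figure cited,
                                # well below the $16,079 poverty line
                                # for a family of three
```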
The minimum wage was key to President Franklin Delano Roosevelt’s New Deal, a vigorous and courageous response to the horrors visited on Americans by the crash of 1929 and the Great Depression. Given soaring unemployment and crushing homelessness, it had been clear for nearly a decade that market forces alone were unable to achieve the desired recovery and that government action was necessary, whether in the form of taxation, industrial regulation, public works, social insurance, social welfare services, or deficit spending. Roosevelt and the New Dealers built powerful protections: Social Security, unemployment insurance, welfare, housing subsidies, disability insurance, and funds for widows and orphans. Such protections gave organized labor the traction to goad big business into weaving its own safety net: health insurance, pensions, guaranteed job security, and, for some, life and disability insurance. The Fair Labor Standards Act of 1938, the final major reform of the New Deal, was hailed by Roosevelt as “the most far-reaching, far-sighted program for the benefit of workers ever adopted here or in any other country.”
Decades before, Henry Ford had set the bar on wages, paying his workforce (even the “sweepers”) $5 a day in wages and a profit-sharing bonus—enough, he famously figured, for them to buy the Model T’s they assembled. We Americans revere this story as integral to the nation’s legacy. (We also tend to forget that Ford engaged in some fairly repressive employment practices.) But the so-called labor aristocracy made possible by such apparent largesse and by the union movement that followed was short-lived, a veritable blip on our historical screen. Today’s twenty-first-century service economy was built not on the Henry Ford manufacturing model or the union model but on the Frank W. Woolworth model. Unlike Ford, Wal-Mart and most other discounters don’t manufacture products; they distribute products made by others. As Woolworth himself pronounced in 1892, cheap goods cannot be had without “cheap help.” America is now awash in cheap help who distribute the cheap goods manufactured by even cheaper help working out of our sight and largely out of our awareness. How lucky are we to have so many people working so hard and so cheaply to provide us with so many of life’s necessities and niceties? Once again, that depends on whom you ask.
In April 2008 economist Emek Basker published a study entitled “Does Wal-Mart Sell Inferior Goods?” Basker meant “inferior goods” in the technical sense: products and services for which demand rises as incomes fall, the sorts of things consumers buy in times of economic stress and hardship. She found that for every 1 percent decrease in personal disposable income, Wal-Mart revenues increased by 0.5 percent. That summer, as America oozed toward recession, Wal-Mart announced a 6.1 percent rise in sales, beating Wall Street estimates. TJX, the owner of cut-rate clothing retailers T. J. Maxx and Marshalls, also enjoyed sales growth. That the corollary is obvious makes it no less troubling: Discounters profit most when Americans hit bottom. This may help explain why Wal-Mart lobbies so hard to keep unions at bay in order, it claims, to keep prices low for its customers. But it does not explain why until recent years the company lobbied against national health care reform and other protections that would benefit both its workers and its core clientele. Andrew Young, a former U.S. Congressman and U.N. ambassador turned Wal-Mart spokesman, seemed to offer an explanation: “Poverty in America,” he said, “is market potential unrealized.” It seems that the poor benefit the discounting industry far more than the discounting industry benefits the poor.
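In the economist’s vocabulary, Basker’s finding amounts to a negative income elasticity of demand, which is precisely what defines an inferior good. A minimal sketch of the calculation, using only the two percentages reported above (the function name is illustrative, not from the study):

```python
def income_elasticity(pct_change_quantity: float, pct_change_income: float) -> float:
    """Income elasticity of demand: the percent change in quantity demanded
    (here proxied by revenue) per percent change in income. A negative
    value is the technical definition of an 'inferior good'."""
    return pct_change_quantity / pct_change_income

# Basker's finding: a 1 percent fall in disposable income accompanied
# a 0.5 percent rise in Wal-Mart revenue.
elasticity = income_elasticity(pct_change_quantity=0.5, pct_change_income=-1.0)
print(elasticity)  # -0.5: negative, hence 'inferior' in the technical sense
```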
CHAPTER EIGHT
CHEAP EATS
The maxim that the “best is the cheapest” does not apply to food.
W. O. ATWATER, PH.D., “FOODS: NUTRITIVE VALUE AND COST”
I’m going to eat too much, but I’m never going to pay too much.
ADVERTISING CAMPAIGN FOR DENNY’S RESTAURANT
Potatoes have always been cheap, but they have not always been welcome. Native to the highlands of South America, they were introduced in Europe by Spanish conquistadors returning victorious from their exploits in the New World. Europeans regarded the lumpy mud-speckled objects with suspicion, and for good reason: Their flowers resembled those of deadly nightshade, and their tough, scruffy hide brought to mind skin diseases, such as leprosy and syphilis. In France, Belgium, Austria, and Germany potatoes were scorned as a pernicious, lust-inducing scourge. The Italians and British considered them a sort of punishment, food fit for pigs or prisoners, but the Irish saw things differently.
Sir Walter Raleigh brought the potato to Ireland in 1589, where it flourished. The Irish peasantry, practical by necessity, quickly embraced the staple that went from harvest to dinner plate with such aplomb and so little fuss. Wheat needed milling into flour, oats rolling or grinding, but potatoes required for palatability only a bake or a boil. Nutritious, filling, and portable, they were a near perfect food. Irish industrialists and landowners praised potatoes as “heaven-sent” food for the peasantry, cheap fuel for cheap labor. By the mid-nineteenth century 3 million Irish peasants ate almost nothing else, as many as ten or even twelve pounds a day, seasoned with buttermilk and salt, and maybe with one or two herring on the side. One historian of Ireland wrote at the time: “Cooking any food other than a potato had become a lost art. Women boiled hardly anything but potatoes.”
We know the tragic corollary to this tuber fixation. In 1845 a great wind blew spores of Phytophthora infestans from southern Europe to Ireland, where, like the potato itself, the fungus found fertile ground. Over the centuries South Americans had learned to plant potatoes in scores of genetically distinct varieties, one or more of which would surely have mounted resistance to the fungal attack. But Ireland was defenseless against this intruder, relying as it did on a single variety, the unfortunately named “Lumper.” The Lumper, though not the best-tasting potato, grew easily on thin, stony soil. It was, one might say, the cheapest of the cheap. The downside was that it rallied next to no objection to invaders. It was such easy fungal prey that it took mere days for a single infected plant to infect thousands of others, systematically curdling the nation’s primary food source into a foul-smelling black slime.
The Irish potato famine is branded into historical memory as a cautionary tale of greedy landowners, helpless farmers, and bad planning. Still, it bears remembering that throughout the course of that devastating scourge the Emerald Isle was heavy with food—plenty of fish, beef, oats, and wheat. Indeed, Ireland remained a food exporter, shipping meat and grain to richer nations while a million of its own citizens starved to death and a million and a half others fled to North America. The tragedy of the great Irish famine stemmed not from a food shortage but from a shortage of food deemed cheap enough to feed the poor.
Experience has taught us the recklessness of growing one crop to the exclusion of most others. We have learned the hard way that monocultures are vulnerable to whatever microbe or swarming insect comes their way. But we have not managed to shake off the presumption that for most of us food should be cheap. This is not without consequences. In September 2008 the United Nations reported that 75 million souls were added to the roll of the world’s hungry, raising the total to a staggering 925 million worldwide. As it was in Ireland more than a century and a half earlier, this hunger was not traceable to an actual food shortage. The world in 2008 was richer in food than ever before; despite an uptick in population, there was more than enough to go around. But abundance was not enough. In Haiti, Burma, Ethiopia, and the Sudan people starved as the food they grew and harvested went to others. For them this was not a new story. In 1984–85 Ethiopia continued to export beans to the United Kingdom while famine killed a million of its people. Despite the continued threat of famine in 1989, Sudan sold 400,000 tons of sorghum to the European Community for animal feed. Today, despite continued food shortages, in Ethiopia much of the best land is devoted to growing coffee (which accounts for more than 50 percent of that country’s exports) and in Sudan to growing cotton (50 percent of exports). This is not to suggest that exports and cash crops are not vital to these economies; they are, of course. But by focusing on one or two crops and making trade a priority, these nations greatly increase their risk of food insecurity, just as Ireland did a century and a half earlier. It is a terrible irony that the global demand for ever cheaper food has pushed the most vulnerable—poor families in the developing world—to the brink.
ANYONE WHO HAS laid eyes on a modern factory farm knows that factory—not farm—is the operative term. Agribusiness and the technology powering it enable efficiencies beyond the dreams and reach of any ordinary farmer. Livestock genetically engineered in the lab, fattened on corn and growth hormones in confinement facilities, and pumped with antibiotics grow into spectacular specimens. Crops grown from scientifically optimized seeds and lavished with petroleum-based fertilizers and herbicides do, too. All this makes for extremely cheap food, not only in the United States but in much of the world. Between 1974 and 2005 food prices on world markets fell by three-quarters in real terms; food was much cheaper in 2005 than it had been a generation earlier. Technology-driven efficiencies are one reason that food prices fell so far. Another is government protections and subsidies for mega-farms.