Labor law isn’t only a federal matter. During the New Deal, unions began signing contracts with company managements that required workers to join their unions or at least to pay union dues so there would be no free riders. Soon businesses and the political right started lobbying state legislatures to outlaw such contracts, shrewdly calling the proposed statutes “right-to-work” laws. And then in 1947, during one of just four years between 1933 and 1995 when Republicans controlled both the House and the Senate, Congress amended the National Labor Relations Act to give states permission to enact such antiunion laws. By 1955, seventeen states had done so, mostly in the South. But then the right-to-work movement stalled, and for almost a quarter-century it seemed dead—until the late 1970s and ’80s. Today most states have right-to-work laws, all but two of them states that voted Republican in the 2016 presidential election.

Studying America’s organized labor history, I noticed a symmetry that seems to show a tipping point: the moment when the fraction of all workers belonging to unions hits 25 percent. During the New Deal, that fraction zoomed from less than 10 percent past 25 percent in a decade. It was still 25 percent in the mid-1970s, but then as the right’s Raw Deal forced what had gone up to keep coming down, the percentage plummeted to 10 percent by the 1990s for workers in the private sector, and it kept on shrinking, down to 6 percent today—a level of unionization back to what it was in the very early 1900s.
Most of the decline in unionization during the last half-century happened just during the 1980s.*3 Once again, it’s remarkable how much the American 1980s amounted to the 1930s in reverse.

Employers in the ’80s also started using as never before a clever, quieter way of paying low-wage workers even less and neutering their unions: contracting with private firms to do blue-collar service work. This technique proved especially popular among public and nonprofit entities like colleges and cities, for whom the optics and politics of directly nickel-and-diming laborers and security guards could be awkward.

Harvard, for instance, employed a couple of thousand people as guards, janitors, parking attendants, and cooks, most of them unionized. Then in the 1980s and ’90s, the university started outsourcing much of that work to private contractors—contractors that paid lower wages and used nonunion workers. After that the threat of outsourcing still more jobs loomed over all of Harvard’s negotiations with its own unionized workers, which persuaded them to accept lower wages: between 1980 and 1996, their pay actually fell from the equivalent of $600 or $700 a week to $500 or $600.*4 Or consider the people paid to schlep baggage onto and off of planes at U.S. airports. In 2002, 75 percent of them were employed directly by airlines; by 2012, 84 percent of them worked for outside contractors, and their average hourly wage had been cut almost in half, to less than twelve dollars.

Until I started researching this book, I’d never thought about this new wrinkle in the economy, let alone understood its scale or impact. Like so many of the hundreds of changes instituted in the 1980s, the practice of replacing staff with contract workers was too arcane and tedious for many of the rest of us to care or even know about. But imagine the thousands of companies and cities and schools and cultural institutions all over the country that have delegated so much of this kind of work to contractors, thereby making the treatment of all those eleven-dollar-an-hour workers somebody else’s problem. According to a 2018 study by five major-university economists, a full third of the increase in American income inequality over these last forty years has been the result of just this one new, dehumanizing labor practice.

Another cunning way big businesses began squeezing workers in the 1980s was to become extremely big. “The basic idea,” explains an economist specializing in markets for labor, “is that if employers don’t have to compete with one another for workers, they can pay less, and workers will be stuck without the outside job offers that would enable them to claim higher wages.”

As antitrust enforcement was discredited and enfeebled starting in the 1970s, big corporations were able to get so big and dominant in their businesses or regions that they had ever fewer companies directly competing with them to hire workers. More and more of them became the only games in town. One of the scholars who has helped expose this particular bit of rigging and its unfairness over the last several decades is the influential, idiosyncratic University of Chicago law professor Eric Posner.*5 As he and his economist co-author Glen Weyl explain, antitrust laws were enacted to make sure that businesses compete in every way—not just as sellers setting the prices they charge for products and services, but also as buyers of labor setting the salaries they pay. The appeal of antitrust for citizens was to make sure competition kept prices lower and salaries higher. Enforcement of our antitrust laws, however, has come to focus entirely on consumer prices, particularly since the definitive Borking of the field in the late 1970s. The antitrust enforcers at the Department of Justice and the Federal Trade Commission, because they rely “on the traditional assumption that labor markets are competitive” and figure it isn’t their job to protect workers anyhow, “have never blocked a merger because of its effect on labor,” and they don’t even employ experts who could calculate those effects. If two rival companies made a secret agreement to cap workers’ salaries, they could get sued, but since “mergers that dramatically increase [companies’] labor market power are allowed with little objection,” the companies can combine and thereby create a salary-squeezing employment monopoly.*6

Companies don’t even need to merge in order to pay workers less than they’d have to pay in a truly free labor market. I’d assumed only high-end employees were ever required to sign noncompete contracts—an HBO executive prohibited from going to work at Netflix, a coder at Lyft who can’t take a job coding for Uber. But no: shockingly, noncompetes have come to be used just as much to prevent a $10-an-hour fry cook at Los Pollos Hermanos from quitting to work for $10.75 at Popeyes. Of all American workers making less than $40,000 a year, one in eight are bound by noncompete agreements. As another way to reduce workers’ leverage, three-quarters of fast-food franchise chains have contractually prohibited their restaurant operators from hiring workers away from fellow franchisees.

Starting in the 1980s, the federal government also instituted big, covert structural tilts in favor of business—examples of the sneaky, stealthy “drift” effect I mentioned earlier. Inflation was an important tool for the economic right to get its way—first politically in 1980, when rapidly rising prices helped it get power, and thereafter by letting normal inflation move money from employees to employers by means of a kind of macroeconomic magician’s trick. Instead of actually repealing two important New Deal laws that had helped workers for four decades, an essentially invisible ad hoc regime of gradual, automatic pay cuts was put in place. One involved overtime pay, time and a half for each hour an employee works over forty a week—which legally goes only to people with salaries below a certain level. The new ploy was to stop raising that salary threshold in the late 1970s, or the ’80s, or the ’90s—thus letting inflation constantly lower it, thereby continually reducing the number of people who qualified for overtime pay. In 1975, when the threshold was the equivalent of $56,000 a year, a large majority of U.S. workers were eligible; in 2019, after just a single increase since 1975, the overtime line was under $24,000, which meant that fewer than 7 percent of American workers qualified.

A similar surreptitious screwing-by-inaction is how the federal minimum wage was dramatically reduced over time. From the mid-1950s until 1980, the minimum wage had been the equivalent of $10 or $12 an hour in today’s dollars. As with overtime pay, the minimum wage was never technically reduced, but by 1989 inflation had actually reduced it to just over $7, where it remains today. In other words, it has been the federal government’s unspoken decision to cut the wages of America’s lowest-paid workers by more than a third, a choice first made during the 1980s when it stopped raising the minimum, then ratified again and again by Democratic as well as Republican Congresses. In addition to keeping costs low for the employers of Kroger cashiers and Burger King cooks and Holiday Inn maids, the lower national floor for pay has the invisible-hand effect of pulling down the low wages of people earning more than the legal minimum.

Economic right-wingers have publicly reveled in their squashing of workers’ power in so many different ways. Federal Reserve chair Alan Greenspan said in a speech in the 2000s that spectacularly firing and replacing all the striking air traffic controllers in 1981 had been “perhaps the most important domestic” accomplishment of the Reagan presidency.

[It] gave weight to the legal right of private employers, previously not fully exercised, to use their own discretion to both hire and discharge workers. There was great consternation among those who feared that an increased ability to lay off workers would amplify the sense of job insecurity. Whether the average level of job insecurity has risen is difficult to judge.

In fact, it began a cascading increase in job insecurity throughout the U.S. economy that wasn’t at all difficult to see and feel and measure.

*1 We entered a similar productivity slough after the Great Recession, which seemed to be ending just before the 2020 recession.

*2 After a dozen years, the Clinton administration lifted that ban on federal employment.

*3 Because the unionization of government workers only happened in the 1960s and ’70s, just before the right started its full-bore campaign to turn back time and diminish workers’ power, more than a third of public-sector employees were in unions by 1980, and they still are.

*4 Because it was Harvard, protests and ambient liberalism and an endowment of $18 billion in 2002 persuaded its president—Larry Summers, who’d just served as Clinton’s secretary of the treasury—to start paying those service workers good wages after a generation of stiffing them.

*5 He’s the son of the influential, idiosyncratically conservative University of Chicago law professor, antitrust expert, and former federal judge Richard Posner, who helped transform antitrust and other economic law to help business.

*6 Economists’ term for a market where there’s just one overwhelmingly dominant buyer of labor (or anything else) is monopsony.

When I was a little kid, whenever we had to play musical chairs in school or at birthday parties, I never enjoyed it. I hated the tense seconds of waiting for each drop of the needle onto the record. Musical chairs made me anxious and made everyone manic and delivered a nasty set of lessons—life is an accelerating competition of one against all for diminishing resources, survival is just a matter of luck and a touch of brute force, and success is a momentary feeling of superiority to the losers who lose before you lose, with just one out of the ten or twenty of us a winner.

Working on this book, I’ve thought again and again of that game, how the rules of our economy were rewritten as a high-stakes game of musical chairs, with more anxiety and dread and frenzy. In fact, our economy since 1980 has been a particularly sadistic version of the game, where some players are disabled or don’t know the rules, and in addition to winning, only the winners get cake and ice cream and rides home.

The crippling of organized labor since 1980—and the increase in automation and relocating work abroad—helped make most American workers more anxious and uncertain and less prosperous. But there are other ways that increasing insecurity and increasing inequality got built into the political economy and became features of the system more than bugs.

The Friedman Doctrine in 1970 begat the shareholder supremacy movement in the 1980s, which begat an unraveling of all the old norms concerning loyalty and decency of businesses toward employees. Loyalty implies treating employees better than the law requires, which was at odds with the new mandates of shareholder supremacy. Replacing strikers was a shock-and-awe swerve, outsourcing work to low-wage contractors a less dramatic form of cold-bloodedness. Both were highly effective means of scaring workers in order to reduce their power and keep their pay lower.

But once the norms changed and a higher stock price became every public company’s practically exclusive goal, companies that weren’t facing strikes or financial problems also embraced the new ruthlessness. In addition to GE and its rank-and-yank corporate copycats continually, automatically firing a fixed quota of employees, profitable corporations began firing workers in bulk simply to please the finance professionals who constitute the stock market. “In the 1980s,” says Adam Cobb, a University of Pennsylvania Wharton School business professor who studies this sudden change in norms, “you started to see healthy firms laying off workers, mainly for shareholder value.” IBM, for instance, abandoned its proud de facto promise of permanent employment—starting in 1990, it got rid of 41 percent of its workers in five years, at first softly, pensioning off people fifty-five and over, then after that using straight mass firings. Throughout U.S. corporate culture, it was as if a decent civilization abruptly reverted to primitivism, the powers-that-be in suits and ties propitiating the gods with human sacrifice—which in addition to increasing profits had the benefit of making the survivors cower before the ruling elite.

Other corporate norms that prevailed from the New Deal until the 1980s, in particular those providing nonunion employees with fixed-benefit pensions and good healthcare, had been enforced indirectly by the power of organized labor. Because “companies were very worried about unions and the possibility of strikes,” another Wharton expert on labor relations explains, “they treated their employees well so they wouldn’t join a union. But that is no longer the case. Unions are on the decline. It’s easy to quash them if they try to organize. So some managers might not care as much about employee loyalty as they used to.”

Jacob Hacker, the Yale political scientist, calls this the Great Risk Shift, the ways that starting around 1980, business, in order to reduce current and future costs, dumped more and more risk “back onto workers and their families.” As a result, “problems once confined to the working poor—lack of health insurance and access to guaranteed pensions, job insecurity and staggering personal debt, bankruptcy and home foreclosure—have crept up the income ladder to become an increasingly normal part of middle-class life.”

Health insurance became a standard part of American jobs starting in the 1940s and ’50s, and early on the pioneering, not-for-profit, cover-everyone Blue Cross and Blue Shield associations provided most of that coverage. As commercial insurance companies got into the game, having Blue Cross and Blue Shield as their public interest competitors helped keep the for-profit insurers honest, not unlike how the existence of strong unions tended to make businesses treat nonunion employees better. In 1980 the three-quarters of Americans who had job-based health coverage paid very little in premiums or deductibles or copayments. But it’s been all downhill from there, thanks to more mercilessly profit-obsessed employers and insurance companies and healthcare providers. More and more of the healthcare industry consisted of for-profit corporations that were more and more subject to stock price monomania. Since the 1990s in many states, Blue Cross and Blue Shield have become totally commercial for-profit insurance companies that (deceptively) continue to use the venerable nonprofit brand name. Moreover, barely half of Americans these days are covered by insurance provided by a breadwinner’s employer. The average amount each American paid for medical expenses out of pocket increased by half during the 1980s alone. In 1980 the average family of four spent the equivalent of about $2,700 a year on medical expenses; today an average family of four—$50,000 income, insurance through the job—spends about $7,500 a year out of pocket.

The other existentially important benefit that American businesses began routinely offering in the 1950s was a fixed pension, a guaranteed monthly income for as long as you lived after you stopped working, which would be paid in addition to Social Security. Companies funded the pensions, and they became standard, like cover-almost-everything company-provided health insurance.

But then came the 1980s. I mentioned earlier how the tax code tweak 401(k), which went into effect in 1980, handed a captive audience of millions of new customers and a revenue bonanza to the financial industry. But this innovation also provided a cost-cutting financial bonanza to employers. They now had another clever way to execute on the new Scrooge spirit: replacing the pensions they’d funded for decades with individual-worker-funded investment plans—self-reliance! freedom!—cost them less right away and cost them nothing once employee number 49732 left the building for good.

In a recent study, Adam Cobb of the Wharton School found that just as CEOs started satisfying their new Wall Street über-headquarters and shareholder supremacy dogma by laying off workers, they started getting rid of pensions for the same reason. At thirteen hundred of the biggest U.S. corporations from 1982 on, the more a company’s shares were held by big financial institutions like mutual funds and banks—arm’s-length overlords who definitely felt no loyalty to any particular company’s employees—the more likely that company was to get rid of pension plans that had guaranteed benefits. On the other hand, companies that employed any unionized workers were likelier to continue paying pensions to their nonunion workers as well.

“The great lie is that the 401(k) was capable of replacing the old system of pensions,” says the regretful man who was president of the American Society of Pension Actuaries at the time and who had given his strong endorsement to 401(k)s. Without any national conversation or meaningful protest by employees—without a union or a Congress that was prepared to step in, how did you push back?—this crucial clause in the modern American social contract was unilaterally eliminated. In 1980 eight out of ten large and medium-size companies paid a guaranteed monthly sum to retirees for life, and most American workers retired with a fixed pension on top of Social Security, which the pension often equaled. Today only one in eight private sector employees are in line to get such a pension, and most American workers don’t even have a 401(k) or an IRA or any other retirement account. It’s yet another route by which the U.S. political economy made a round trip from 1940 to 1980 and then back again.

I mentioned the libertarian Fed chair Alan Greenspan’s remark that it was “difficult to judge” if the “increased ability to lay off workers” starting in the 1980s had structurally, permanently increased Americans’ “sense of job insecurity.”

I am frequently concerned about being laid off.
From 1979 through the 2000s, that statement was posed in a regular survey of employees of four hundred big U.S. corporations, each person asked if they agreed or disagreed. In 1982, early in our new national musical chairs game, during a bad recession with high unemployment, only 14 percent of this large sample of workers said they felt anxious about losing their jobs. The number crept upward during the 1980s, and then in the ’90s people finally registered that, uh-oh, our social contract had been completely revamped. By 1995, even though the economic moment looked rosy—strong growth, the stock market rocketing upward—nearly half of Americans employed by big business said they worried a lot about being laid off.

In fact, in 1997, a strange new condition kicked in—pay continued to stagnate for most Americans despite low and dropping unemployment rates. A fundamental principle of free markets was being repudiated: the supply of labor could barely keep up with demand, but the price of labor, wages, wasn’t increasing. Alan Greenspan, as he presented his semiannual economic report to the Senate Banking Committee, mentioned those survey results and testified that the surprising “softness in compensation growth” was “mainly the consequence of greater worker insecurity” that had arisen since the early 1980s, insecurity that was also responsible, he said, for the continuing “low level of work stoppages” by unionized workers.

In other words, employees of the biggest corporations, whose jobs everyone had considered the most secure, were now too frightened of being jettisoned from those jobs to push hard for more pay or better working conditions.

Those data and their implications must’ve slipped Greenspan’s mind later when he found it “difficult to judge” the effects of insecurity on workers’ leverage and pay. And he never mentioned, of course, that it was he and his confederates on the right who’d spent the last decades restructuring our political economy to reduce the power of workers and increase their job insecurity. He did say he thought the curious disconnect in the late 1990s—low unemployment but no pay increases—was a blip, that “the return to more normal patterns may be in process” already. But two decades later it remained the not-so-new normal. The long-standing balance of power between employers and the employed was completely changed.

The impact of suddenly higher insecurity was a cascade of more insecurity. Starting in the late 1980s, as soon as Greenspan’s beloved new “ability to lay off workers” took effect, the fraction of Americans who actually lost their jobs each year increased by a third and stayed there. At the same time, individual household incomes started roller-coastering down and up and down as they hadn’t before. Soon the household incomes of one in eight Americans, poor and affluent and in between, were dropping by half or more in any given two-year period. Between 1979 and 1991, personal bankruptcies tripled (and then doubled), and the mortgage foreclosure rate quadrupled (and then doubled).


At the same time that economic insecurity grew, new sources of economic inequality were built into our system that made insecurity more chronic and extreme. Scores of public and private choices and changes increased inequality, all shaped by the new governing economic gospel: everybody for themselves, everything’s for sale, greed is good, the rich get richer, buyer beware, unfairness can’t be helped, nothing but thoughts and prayers for the losers.

What happened with higher education is a prime example. College had been the great American portal to upward economic and social mobility, especially public universities, which give out two-thirds of all four-year undergraduate degrees. But in the 1980s, that portal started becoming much harder to get through financially and much more financially vital. Meanwhile the rapidly rising cost of college provided a new business opportunity for the ravenous financial industry, which beset graduates (and people who failed to graduate) with debt that made the chronic new economic insecurity even worse. If omnipotent sadists had set out to take an extremely good, well-functioning piece of our political economy and social structure and make it undemocratic and oppressive, this is what their scheme would’ve looked like.

When I graduated high school in the 1970s, I could’ve gone with a plurality of my friends to the University of Nebraska, for which my parents would’ve paid resident tuition, room, and board equivalent to $10,000 a year. But I got into Harvard, so I went there, which cost the equivalent of $22,000 a year, all in. Those prices were typical at the time. They were also the same as they’d been for public and high-end private colleges a decade earlier.
