It is clear that American families have been struggling in recent decades. Less obvious are the forces that are responsible for this reversal of fortune. However, a significant body of research now points to a confluence of economic and social trends that many scholars agree have played a crucial role in the rise of financial insecurity.
The Rise of the Service Economy
Since the 1970s, work in the United States has undergone a dramatic transformation: a regression from the New Deal quest for stability, security, and shared prosperity to a state in which work is precarious. In the words of sociologist Arne L. Kalleberg, work has become more “uncertain, unpredictable, and risky from the point of view of the worker.”
One reason for the rise of precarious work is the wholesale restructuring of the American economy from one based on manufacturing to one based on services. After World War II, manufacturing accounted for 40 percent of the labor force; by 2005, that share had fallen to only 12 percent. The service sector now makes up about 80 percent of the jobs in the United States. Durable manufacturing jobs (autoworker, machinist, chemical engineer) offering higher wages and good benefits have been replaced by service sector jobs (store clerk, cashier, home health-care aide) that pay less, offer few or no benefits, and are more insecure.
Moreover, while the manufacturing sector tends to create good jobs at every employment level, the service sector tends to create a relatively small number of high-skill, high-paying jobs (in fields like finance, consulting, and medicine) along with a large number of low-skill, low-paid jobs (in retailing, child care, and hospitality). The result is that secure, semiskilled middle-income jobs like those that once fueled the rapid expansion of the American middle class are increasingly hard to find.
The Impact of Globalization
Beginning in the mid-to-late 1970s, U.S. firms began to face dramatically increased competition from around the world. To compete, American companies sought to lower labor costs, in part by outsourcing work to lower-wage countries. Technological advances aided this outsourcing process, as the growth in electronic tools for communication and information management meant that goods, services, and people could be coordinated and controlled from anywhere around the globe, enabling businesses to more easily move their operations to exploit cheap labor sources abroad.
Perhaps the most far-reaching effect of globalization has been a renegotiation of the unwritten social contract between American employers and employees. Managers now demand greater flexibility to quickly adapt and survive in an increasingly competitive global marketplace. In this context, the traditional employment relationship, in which work is steady and full-time, workers are rarely fired except for incompetence, working conditions are generally predictable and fair (often defined by union-negotiated contracts), and good employees can expect to climb a lifetime career ladder in the service of one employer, has come to seem unrealistic and onerous to business leaders. Today that traditional arrangement has largely disappeared, replaced by nonstandard, part-time, contract, and contingent work, generally offering reduced wages and scanty benefits. Mass layoffs are no longer an option of last resort but rather a key restructuring strategy used to increase short-term profits by reducing labor costs in both good times and bad.
The Decline of Unions
In this new environment, unions are struggling. Although manufacturing workers have a long history of labor organizing, service sector workers such as restaurant and retail employees do not, making it harder for service employee unions to grow. Moreover, globalization, technological changes, and the spread of flexible work arrangements have combined to enable employers to make an end run around unions by moving jobs to countries or parts of the United States where anti-union attitudes and laws predominate. As a consequence of these developments, union membership has steadily declined. In 1954, at the peak of union membership, 28 percent of employed workers were in unions. By 1983, only 20 percent of workers were union members. In 2012, union membership reached a historic low of just 11 percent of American workers. Among full-time workers, median weekly earnings are $943 for union members and $742 for nonunion workers. The decline of unions has severely curtailed workers’ ability to bargain collectively for high wages and good benefits, indirectly fueling a steady decline in the value of the minimum wage. It has also eroded a broader moral commitment to fair pay, from which even nonunion workers previously benefited.
Together, the rise of the service economy, globalization, the decline of unions, and the erosion of the old work contract between employers and employees have created a precarious work environment for more and more Americans. Between the 1980s and 2004, more than 30 million full-time workers lost their jobs involuntarily. And during the Great Recession of 2008–2009, another 8.9 million jobs were lost. In the past few years, long-term unemployment has reached levels not seen since the government began monitoring rates of joblessness after World War II.
Risk Shifts to the Individual
Over the last several decades, both government policy and private sector labor relations have evolved to reduce the sharing of the economic risks involved in managing lives, caring for families, and safeguarding futures. Instead, individual Americans are increasingly being asked to plan for and guarantee their own educations, health care, and retirements. If today’s families want a safety net to catch them when they fall, they need to weave their own.
Underlying this shift in risk is neoliberal political ideology, often identified with leaders like Ronald Reagan and Margaret Thatcher, which holds that people will work harder and make better decisions if they must defend themselves against the vicissitudes of life. Neoliberal doctrine views dependence in a negative light (arguing that “coddling” by government undermines individual initiative) and actually celebrates risk and uncertainty as sources of self-reliance. In this new paradigm, the individual is encouraged to gain greater control over his or her life by making personal risk-management choices within the free market (and living with the consequences of any misjudgments). In this “ownership society,” individuals must learn to be secure with insecurity; the goal is to amass security on our own rather than look to government help or collective action as sources of support.
With the rise of neoliberalism, the ethic of sharing risk among workers, employers, and the federal government that emerged after the New Deal was replaced by an aggressively free-market approach that pushed deregulation and privatization in order to minimize the role of government in economic life. At the same time, responsibility for social welfare has steadily devolved from the federal government to states, localities, and even the private sector. The push toward privatizing social services reached a new level when President George W. Bush, through his establishment of the White House Office of Faith-Based and Community Initiatives, sought to formally create public-private partnerships in which welfare provision would increasingly be supplied not by the government but by religious organizations. The result of this devolution of social services has been the replacement of a relatively stable, consistent system of safety-net programs with a patchwork of state, local, and private programs, all of which scramble to find funding.
Though many Americans may be unfamiliar with the risk shift story, the results are widely known. From 1980 to 2004, the share of workers covered by a traditional defined-benefit retirement pension decreased from 60 percent to 11 percent. In contrast, the share of workers covered by a defined-contribution retirement benefit like a 401(k) plan, in which the worker is fully responsible for saving and managing his or her savings, grew from 17 percent in 1980 to 61 percent in 2004.
Traditional employer-provided health-care coverage began to erode as well. From 1979 to 2004, coverage dropped from 69 percent to 55.9 percent. In 2010, 49 million Americans were uninsured, an increase of close to 13 million people since 2000. Workers who continue to receive coverage have seen their share of the costs increase drastically. A survey conducted by the Employee Benefit Research Institute found that, to cover medical costs, 45 percent of respondents have decreased their contributions to other savings, 35 percent have had difficulty paying other bills, and 24 percent have had difficulty paying for basic necessities.
The Affordable Care Act, passed in 2010 and upheld by the Supreme Court in 2012, will greatly expand affordable health care. As a result of the legislation, it is estimated that by 2019, 29 million Americans will gain health insurance coverage. However, an equal number will still be uninsured. And the number of uninsured may rise depending on how many states opt out of expanding Medicaid eligibility. Currently, twenty states will not participate in the Medicaid expansion. Analysis of these states has found that about 5.3 million people will earn too much to qualify under their state’s Medicaid eligibility rules but too little to be eligible for the tax credits that help offset the cost of insurance. Of the top ten least-insured metropolitan areas in the United States, seven are in states that will not expand Medicaid eligibility.
When it comes to aid for higher education, federal funding has grown, but that aid has mostly come in the form of loans rather than grants. Over the last decade, grants have made up between 22 and 28 percent of federal aid for education, while loans have made up between 61 and 70 percent. Moreover, even though there has been a 15 percent increase in the number of low-income students who receive a Pell Grant, the maximum award these students can receive now covers only about a third of the costs of a college education, as compared to around three-quarters in the 1970s.
The high price of a college degree is linked with a significant decline in the number of low- and moderate-income students who enroll in and graduate from college. Between 1992 and 2004, the percentage of low-income students enrolled in a four-year college decreased from 54 to 40 percent and the percentage of middle-income students decreased from 59 to 53 percent. For low-income children, the college completion rate has increased by only 4 percentage points between the generation born in the early 1960s and the generation born in the early 1980s. In contrast, among high-income children the college graduation rate increased 18 percentage points between generations. If education is the ladder by which less-advantaged Americans can hope to rise to the middle class and beyond, the rungs of that ladder are increasingly out of reach—yet another way in which the traditional system of shared social responsibility has been gradually dismantled over the past forty years.
Feeling Insecure
With instability and uncertainty figuring prominently in people’s lives, it is important to ask whether these social and economic trends are reflected in the way Americans feel. Do Americans feel more insecure? Have they become more worried? These questions turn out to be difficult to answer.
The first obstacle is that we lack the rich, long-term survey data that would enable us to tease out an in-depth answer. As a recent Rockefeller Foundation report noted, efforts to assess and measure people’s sense of security are rare. And the surveys we do have focus almost exclusively on job loss, which is just one of many risks that need to be explored.
A second obstacle to measuring perceptions of security and insecurity across the decades is that people may not judge and evaluate their situations by the same criteria over time. In other words, can we assume that year in and year out people use the same yardstick to measure whether they are having a good or bad year? If assessments and meanings change over time and surveys don’t capture these subjective changes, then it’s not clear what such surveys are really measuring.
Richard Curtin, the director of the Survey of Consumers at the University of Michigan, addresses this subjective nature of evaluation in his analysis of changes in the standards by which consumers have judged the economy over the last fifty years. For example, during the 1960s people had high expectations and were very confident about the government’s ability to control the economy and keep things on track. Such optimism about rising affluence ran into a brick wall during the economic shocks of the 1970s and early 1980s. Initially, dissatisfaction ensued as people continued to hold on to the economic aspirations of the past. By the mid-1980s, however, after repeated economic setbacks, consumers lowered their expectations about achievable growth rates and became more tolerant of high inflation and high unemployment. By the early 1990s, fears about job security grew as Americans became skeptical about the government’s ability to use economic policy to prevent downturns.
At this point expectations were so diminished that it took one of the longest economic expansions in U.S. history to reset high levels of optimism. Fueled by the dot-com boom, aspirations soared. In 2000, consumer confidence hit a new peak. With expectations high, consumers in the early 2000s cited high unemployment as an issue even though it was only around 6 percent, half of what it had been in the early 1980s. The optimism of the late 1990s soon gave way to pessimism with the recessions that began in 2001 and late 2007. In fact, between January 2007 and mid-2008, the Index of Consumer Sentiment fell by 42 percent, a steeper percentage decline than in any other recession.
By mapping out historical shifts in consumers’ assessments of the economy, Curtin illustrates how “the same level of economic performance, say in terms of the inflation or unemployment rate, can be evaluated quite differently depending on what was thought to be the expected standard.” Moreover, changes in standards of evaluation usually occur very slowly and therefore can be difficult to detect. And since different groups of Americans have fared differently as a result of macroeconomic changes, it stands to reason that some Americans may have altered their standards and expectations sooner than others, and some may have altered their aspirations more significantly, and perhaps more permanently. In all likelihood, for example, those employed in the waning manufacturing sector, like autoworkers, had to let go of their expectations for a secure economic life long before and to a much larger degree than have college-educated Americans employed in the expanding service sector.
With this in mind, when sociologists Katherine Newman and Elisabeth Jacobs examined survey data on people’s economic perceptions from the late 1970s to just before the Great Recession, they found something interesting. Their analysis revealed that, despite a few peaks and valleys, overall trends during this period suggest that Americans came to see themselves as more secure and in better financial shape, with about the same likelihood of losing their jobs. As we might expect, their analysis found that those with the lowest incomes and least education expressed the most vulnerability to employment insecurity and financial hardship, while those with higher incomes and more education expressed lower levels of concern.
Yet, despite their lower levels of concern overall, Americans with higher earnings, bachelor’s degrees, and managerial jobs have exhibited the biggest increase in worry. Over the last thirty years, the proportion of college graduates and managers who said they were likely to lose their jobs in the coming year, and the proportion who said they had done worse financially than the year before, have both gone up. The rise in concern about job security and financial stability among this group reflects new realities. During this period, the rate of job loss for the most educated went up faster than the rate of job loss for less-educated Americans. And when these workers lost their jobs and found new ones, the new jobs often didn’t pay as much. By 2001, workers with a bachelor’s degree experienced about a 23 percent drop in their earnings after losing a job. Such trends stand at odds with a long-standing belief among Americans with college degrees that their skills and credentials will translate into a solid footing. If discontent emerges when there is a gap between expectations and outcomes, then it would make sense for concern to increase more among the group that still thought it was well positioned to maintain a good, secure life. When this kind of expectation smacks into job loss and downward mobility, people start to worry.
For Americans with less education and lower earnings, it is very possible that worry as measured by feelings about job insecurity and financial hardship did not increase as much over a sustained period because they altered their expectations sooner and more permanently than did better-off Americans. As Newman and Jacobs point out, when those at the bottom lose a job, there is not as far to fall. For such families, their economic situation doesn’t change much from year to year; it’s always bad. Alternatively, other families may have taken on debt in order to hold on to their standards for security. The lack of a consistent and steep increase in worry among less well-off Americans thus does not necessarily signal that they feel more secure than they used to. It could instead mean that they have gotten used to having less, or to the high levels of debt required to hold on to traditional conceptions of security amid declining fortunes. What is also likely going on is that people’s frame of reference for what security even means has undergone a transformation. Finally, it could be that our standard measures for these issues (concern about job security and whether we are worse off this year than last) don’t allow us to accurately assess people’s feelings.
We do not have the kind of comprehensive longitudinal survey data that would enable us to detect subjective changes in Americans’ views about what constitutes security and insecurity and whether such definitions shape trends in worry and concern over time. But other measures point to increases in insecure feelings among Americans. For example, even before the Great Recession started, about half of those surveyed worried somewhat about their economic security, with one-quarter “very” or “fairly” worried. By 2009, just over half of those surveyed were “very” or “fairly” worried. A Pew Research survey done in 2011 found that only 56 percent of those polled felt that they were better off financially than their own parents were at the same age, the lowest percentage since the question was first asked in 1981, when 69 percent said they felt better off. In 2012, the General Social Survey (GSS) found that less than 55 percent of Americans agreed that “people like me and my family have a good chance of improving our standard of living,” the lowest reported level since 1987. That same year, the GSS also found that a record 8.4 percent of Americans identified themselves as “lower class,” the highest share reported in the forty years the GSS has asked this question.
And we may be seeing changes in the definition of the American dream. The American dream has long been equated with moving up the class ladder and owning a home, but recent surveys have noted shifts away from such notions. When Joel Benenson, chief pollster for President Obama, examined voters’ thoughts about economic security and the American dream in 2011, he found something new. His polling discovered that middle-class Americans were more concerned about keeping what they have than they were with getting more. Another 2011 survey found the same thing. When asked which is more important to them, 85 percent of those surveyed said “financial stability” and only 13 percent said “moving up the income ladder.” In 2007, a survey found that owning a home defined the American dream for 35 percent of those surveyed. By 2013, the top two definitions of the American dream were “retiring with financial security” (28 percent) and “being debt free” (23 percent). Only 18 percent of those surveyed defined the American dream as owning a home.
As the economy experienced wide-reaching transformations, meanings and feelings have likely changed along with it. A National Journal article noted how even the definition of being middle class has undergone adjustment, especially in light of the rise of contract workers or “permatemps,” those who may make a good wage but receive no benefits and can expect no job security. Capturing this adjustment, the article asks, “If they make a decent income, are permatemps middle class? Not by the standards of the past. But by the diminished redefinition, maybe they are: earning a middle-class living—for the moment.”
Amid these shifting economic tides and morphing definitions, many have lost their way. While old beliefs, such as the conviction that hard work leads to security and prosperity, have fallen by the wayside, it’s unclear to many Americans what new truths lie in their stead. As President Obama’s pollster Joel Benenson discovered, this lack of direction causes a great deal of unease. “One of the big sources of concern for the people we talked with,” Benenson said, “was that they didn’t recognize any new rules in this environment. All of the rules they had learned about how you succeed, how you get ahead—those rules no longer apply, and they didn’t feel there was a set of new rules.” These kinds of examinations suggest that in the age of insecurity, Americans are not just trying to weather an economic storm; they are also feeling their way through the dark.
In the throes of the Great Depression, Americans decided that there had to be a better way to organize government and society, one that would allow individuals and families to enjoy greater stability and security. This philosophical shift from “rugged individualism” to “united we stand, divided we fall” paved the way for the New Deal, the Great Society, and the forging of an unwritten but pervasive social contract between employers and employees that rested on mutual loyalties and protections. The government invested in its citizens, employers invested in their employees, and individuals worked hard to make the most of those investments. As a result, in the decades immediately following World War II, prosperity reigned, inequality decreased, and a large and thriving middle class was born.
Beginning in the 1970s, this system began to unravel. Large-scale changes from globalization and the rise of the service economy to a philosophical shift toward free-market ideology and a celebration of risk changed the landscape of security in America. Against this backdrop, the government curtailed its investments in and protections of its citizens, and employers rewrote the social contract to increase their own flexibility and demand greater risk bearing by workers. Individuals continued to work hard, but instead of getting ahead, more Americans struggled harder and harder just to get by.
Insecurity now defines our world. The secure society has become the “risk society.” The belief that we are all in this together has been replaced with the assumption that we are each on our own. Cut adrift, Americans are struggling to forge security in an insecure age.
Excerpted from "Cut Adrift: Families in Insecure Times" by Marianne Cooper. Published by the University of California Press. Copyright © 2014 by the author. Reprinted with permission of the author and publisher. All rights reserved.