McDomination: How corporations conquered America and ruined our health

The corporate class has accumulated a staggering level of power in Washington — and we've all paid the price

Published February 23, 2014 2:00PM (EST)

(AP/Gene J. Puskar)

Excerpted from “Lethal but Legal: Corporations, Consumption, and Protecting Public Health”

On August 23, 1971, Lewis Powell sent a confidential memo to his friend Eugene Sydnor, Jr., the director of the U.S. Chamber of Commerce. The memo was both a call to arms and a battle plan for a business response to its growing legion of opponents. Powell was a corporate lawyer, a former president of the American Bar Association, and a board member of eleven corporations, including Philip Morris and the Ethyl Corporation, a company that made the lead for leaded gasoline. Powell had also represented the Tobacco Institute, the research arm of the tobacco industry, and various tobacco companies. Later that year, President Richard Nixon would nominate Powell to sit on the U.S. Supreme Court, where he served for fifteen years.

Powell’s memo serves as a useful starting point for understanding how the transformation of the corporate system that began in the 1970s set the stage for today’s global health problems. “No thoughtful person can question that the American economic system is under broad attack,” wrote Powell. “The assault on the enterprise system is broadly based and consistently pursued. It is gaining momentum and converts.” “One of the bewildering paradoxes of our time,” Powell continued, “is the extent to which the enterprise system tolerates, if not participates in, its own destruction.” He enumerated the system’s enemies: well-meaning liberals, government officials intent on regulating business, news media, student activists, and an emerging environmental and consumer movement—especially its most visible leader, Ralph Nader, in Powell’s view “the single most effective antagonist of American business.”

Powell called on business, especially the Chamber of Commerce, to end its “appeasement” of its critics and launch an aggressive and systematic counter-assault. The memo warned that “independent and uncoordinated activity by individual corporations, as important as this is, will not be sufficient. Strength lies in organization, in careful long-range planning and implementation, in consistency of action over an indefinite period of years . . . and in the political power available only through united action and national organizations.”

Powell urged new, well-funded public media campaigns to support the free enterprise system, the creation of think tanks and institutes to develop policy proposals, and “direct political action” in legislative and judicial arenas. “It is time,” he argued, for “American business . . . to apply their great talents vigorously to the preservation of the system itself.” Powell’s “confidential” memo was first circulated within the Chamber of Commerce, then released in 1972 by investigative reporter Jack Anderson during the Powell Supreme Court confirmation hearings. While the document may not have been the blueprint for the rise of the Republican right that some analysts claim, its real value is as the articulation of the corporate prescription for capitalism’s ills.

Today, more than forty years after business took up Powell’s appeal, its success in achieving the goals he laid out makes it hard to fathom the depth of his concern. But the early 1970s were a high point for several public-interest movements that personified and amplified a growing opposition to business dominance. In addition, the recent victories of the civil rights and antiwar movements, coupled with the emergence of the women’s and environmental movements, meant change was in the air. Millions of Americans had shown their willingness to protest, demonstrate, and oppose corporate consumerism, so corporate leaders were understandably worried about their future.

These fears were amplified by legislative action. Between 1960 and 1980, under three Democratic and two Republican presidents, Congress passed an astonishing forty-nine laws that gave consumers, workers, and the environment new protection. These new laws, and the agencies that implemented them, governed the practices of the auto, alcohol, firearms, food, pharmaceutical, and tobacco industries, discussed previously, as well as every other industry in America. While each law had limitations, and many were inadequately enforced, together they constituted a sea change in government and corporate relations and signaled the willingness of both Republicans and Democrats to expand the rights and protections of consumers. After 1980, new regulations were of course still promulgated, but at a much slower pace, and many of the new laws limited or rolled back those passed in the previous two decades.

New organizations, such as Common Cause and Friends of the Earth, emerged to bring middle-class opposition to harmful business practices to Washington. Ralph Nader, Powell’s nemesis, founded a myriad of groups, including the Center for Auto Safety, the Corporate Accountability Project, the Center for Responsive Law, and the Public Citizen Litigation Group—all to create what Mark Green, a Nader protégé, called “a government in exile” that “waged a crusade against official malfeasance, consumer fraud and environmental degradation.” Nader’s 1965 book, Unsafe at Any Speed, an exposé of the auto industry, and Rachel Carson’s 1962 Silent Spring, a critique of pesticide use, alerted millions of Americans to the harmful health consequences of certain corporate practices.

Powell’s memo emphasized the domestic threats to corporate America, but abroad, even more pervasive challenges to United States–based corporate control were emerging. First, following the devastation of World War II, Japan and Germany had rebuilt their economies. Between 1950 and 1980, real annual per capita growth in gross national product went up by 7.4 percent in Japan and 4.9 percent in West Germany—but only by 2 percent in the United States. Fueled by this rapid growth, Japanese, German, and other European corporations began to compete with U.S. companies for global markets and profits. In the auto industry, for example, in 1953, the United States made 70 percent of the world’s motor vehicles; by 1968, this share was down to 38 percent.

Over the next four decades, the increasing globalization of multinational corporations forced U.S. companies to develop new strategies for increasing profits. This globalization of capital also changed how U.S. companies interacted with the U.S. government, ultimately diminishing government influence on companies. By 2008, shortly before the start of the economic crisis, Business Week observed that, “in effect, U.S. multinationals have been decoupling from the U.S. economy in the last decade. They still have their headquarters in America, they’re still listed on U.S. stock exchanges and most of their shareholders are still American. But their expansion has been mainly overseas.” As economic pressures forced corporate managers to let go of the patriotism that may have motivated earlier decisions, their concern for the well-being of the American people and their economy declined.

The early 1970s also brought the first of several energy crises, precipitated by declines in energy production in the United States and by the growing power of oil-producing states to set energy prices. Rising oil prices contributed to a global economic downturn, ending two decades of American prosperity during which U.S. national income had nearly doubled and the size of the middle class had expanded significantly. A stock market crash in 1973–1974 deepened the slump, alarming investors and corporate owners.

It was in response to these national and international threats that corporate America devised a new game plan, designed to restore corporations’ ability to advance their political and economic agendas. No single individual or organization had the power to shape this response, but Powell’s memo clearly laid out a comprehensive agenda that could mobilize American business to take on its challengers.

Corporations’ rise to dominance: From the early years to the 1970s and 1980s

Responding to changing economic and political conditions was nothing new for corporations; they had long been an essential component of American society and had learned how to adapt to shifts in the political climate. Corporations were first created in the seventeenth century in Holland, Great Britain, and elsewhere to mobilize capital for colonization and for public development projects, such as shipping canals, that were too big for individual investors. In the mid–nineteenth century, they played an important role in creating the infrastructures for American industrialization, including the railroads, and by the late nineteenth century, they had become central players in the American economy. Later, through federal legislation, they received additional legal protections such as limited liability, constitutional protections, and extended lifespan. By the end of the twentieth century, most observers across the political spectrum agreed that corporations dominated the global economy and served as the principal agents of modern capitalism.

In some periods, corporations and their allies were more successful in achieving their business and political objectives than in others. In the 1880s, for example, the “robber barons”—the corporate leaders of the steel, oil, and railroad industries—were able to win new government protection for trusts; and in the 1920s, prior to the Great Depression, business persuaded the federal government to keep taxes low, tariffs high, and regulation weak. At other times, popular movements won important victories from business, including the creation of the Food and Drug Administration in 1906, the right of workers to organize labor unions, and, during the New Deal, the creation of new regulatory agencies such as the Securities and Exchange Commission and the Federal Communications Commission. In 1936, campaigning for reelection, Franklin D. Roosevelt articulated this populist theme: “We now know that government by organized money is just as dangerous as government by organized mob.” Even in these more populist moments, however, corporate leaders were able to play major roles in structuring reforms and government regulations to ensure that their long-term interests were not sacrificed.

In one sense, the orchestrated corporate response to the domestic and global political and economic crises of the 1970s was simply corporations taking care of business as usual. What was different were the concurrent changes in the economy, technology, and politics, which gave corporations powerful new tools for their counter-offensive. New information and communications technologies, such as computers, mobile phones, and later the Internet, made it easier to plan and coordinate national and global campaigns and to move capital and production around the world. Air transport and containerized shipping created the conditions for global consumer markets. Professions such as public relations, advertising, and lobbying developed sophisticated new techniques that enabled corporations to mobilize support for their economic and political goals.

At the same time, many of the countervailing powers that had in earlier times challenged corporate advances were in decline. Changes in family structure and job opportunities allowed corporations to take over what had been family responsibilities: McDonald’s replaced Mom’s cooking, TV became the new babysitter, and Hollywood and Madison Avenue taught children about food, shopping, sex, and relationships.

Other institutions also lost influence to corporations. Patients learned about new drugs from advertisements rather than from their family doctor or local pharmacist. Churches and faith organizations, which had been an arena for social interactions, and sometimes offered critiques of unrestrained markets, lost parishioners to the mall, or decided to endorse wealth as the new virtue. Labor unions declined in membership and political influence. In the mid-1950s, more than a third of U.S. private sector workers belonged to labor unions; by 2005 this had declined to less than 10 percent. Political parties, in the past a limited avenue for popular participation in politics, became increasingly professionalized and subject to the influence of wealthy campaign contributors and lobbyists.

The mass media, at times a powerful critic of corporate excess, also came under the control of big business. Large media conglomerates such as Disney, National Amusements, Time Warner, Viacom, News Corp., Bertelsmann AG, Sony, and General Electric took over television, radio, book publishing, movies, music, and other media and often shared directors or owners with other companies, reducing any impetus for journalistic investigations of harmful corporate practices. Later, the Internet, increasingly dominated by big corporations and advertisers, replaced more intimate and face-to-face forms of communication.

Together, these changes helped clear the playing field for the amplified corporate voice that Lewis Powell had urged in his memo. As the public-interest movements that had alarmed Powell and his allies waned in the late 1970s and especially after the election of Ronald Reagan in 1980, big business again became the dominant voice in social and economic policy and politics, as it had been in the 1880s and 1920s.

In the short term, the Powell memo and the point of view it represented led to rapid changes. The Chamber of Commerce soon “doubled its membership, tripled its budget and stepped up its lobbying efforts” in Washington, where it became the dominant corporate voice. In 1971, the National Association of Manufacturers, the voice of corporate producers, moved from Cincinnati to Washington, D.C., where it, too, played a growing role in public policy. In 1971, 175 companies had registered lobbyists in the capital; by 1982, the number had increased to almost 2,500. Between 1976 and the mid-1980s, the number of corporate political action committees (PACs) increased from 300 to more than 1,200.

Several new business-friendly think tanks were established, including the Heritage Foundation and the Cato Institute, and during the 1970s, the American Enterprise Institute, another Washington-based business-friendly organization, increased its staff from ten to 125 and its budget from $1 million to $8 million. Business leaders also created more activist organizations designed expressly to change policy, including the Business Roundtable and the American Legislative Exchange Council. As journalist Bill Moyers wrote in 2011, these responses to Powell’s call “triggered an economic transformation that would in time touch every aspect of our lives.”

How economic change led to changes in health

From their inception, the primary objective of corporations was to make money for their investors. Until the changes that began in the 1970s, however, social, political, and economic factors constrained corporations and limited their impact on people’s day-to-day lives outside the factory gates. Several key developments set the stage for this transformation.

Short-termism: “Short-termism” describes the emphasis on delivering returns to investors quickly, in a few quarters or years rather than in decades. As capital became more mobile and companies extended their global reach, investors had more opportunities to look for higher rates of return. “Shareholder value” became the new mantra for corporate managers, and executives who failed to meet earnings goals had their salaries docked or lost their jobs. Managers were forced to focus on cost-cutting, quarterly returns, and short-term quick fixes to boost revenues. For the corporation itself, too many disappointing quarters led to a loss of investors and fears of takeover.

In an oft-quoted 1981 speech, General Electric CEO Jack Welch laid out the principles. The questions companies need to ask during what he called “slow growth” periods, he told financial analysts meeting at the Pierre Hotel in New York City, were “how big and how fast” a company could grow. “Management and companies that hang on to losers for whatever reason, tradition, sentiment, their own management weaknesses, won’t be around in 1990.” “Neutron Jack” practiced what he preached: each year he fired the managers with the lowest returns, ensuring a sharp focus on the bottom line. This focus often had an impact on health: the Big Three auto companies chose to invest heavily in polluting SUVs because these vehicles produced windfall profits that kept investors happy, even as they contributed to the longer-term decline of the auto industry. In this environment, concerns about the long-term safety of new products or the sustainability of a production practice inevitably took a back seat to profitability.

Financialization: “Financialization” has been defined as a “pattern of accumulation in which profit making occurs increasingly through financial channels rather than through trade and commodity production.” As investor demand for profit increased, the returns on investments in mortgages, derivatives, or commodities futures were higher than those on investments in industries that produced goods or services. This increased the demand for short-term results in the traditional industries and contributed to rapid acquisition and selling of companies. Increased use of leveraged buyouts, junk bonds, and hedge funds was among the consequences of the increasing financialization of corporate America. Between 1990 and 2010, the financial sector’s share of total corporate profits doubled in the United States, reaching as high as 44 percent in 2002. The fast growth and high profits in this sector exacerbated the pressure on consumer corporations to match these returns or risk losing capital to these more promising investments. Over time, companies that made products to sell to consumers lost ground to companies that bought and sold risk, depending on these new financial firms for investment and loans. Maximizing shareholder value often trumped holding on to long-term customers, leading to more volatile markets and ever more urgent quests for blockbuster products that would please investors even if they harmed consumers.

The story of the leveraged buyout of RJR Nabisco, a leading tobacco and food company, told by Bryan Burrough and John Helyar in their book Barbarians at the Gate, shows how companies became more concerned with making deals than with making products. In 1988, Henry Kravis, one of the originators and a master of leveraged buyouts, took on RJR Nabisco CEO Ross Johnson in a battle for control of the corporation that had made its fortune from selling tobacco (Camel and Winston), alcohol (Heublein Spirits, maker of Smirnoff vodka and Don Q Rum) and processed food (Oreos and Mallomars). The drama featured a cast of more than a dozen other leading companies, banks, and law firms: Shearson Lehman Hutton, American Express, Dillon Read, Drexel Burnham Lambert, The First Boston Group, Forstmann Little, Goldman Sachs, Lazard Frères, Morgan Stanley, Salomon Brothers, Skadden Arps, and Wasserstein Perella. In the end, Kravis signed a $31.4 billion deal for one of America’s premier companies—at the time, the highest price ever paid for a corporation.

Since the deal was financed with debt, it heightened the pressure on RJR to produce profits by any means necessary. In the musical-chairs game leading up to the deal, businesses involved in the negotiations included corporate giants such as Kellogg’s, Pepsi, Philip Morris, and Pillsbury. During the hostile-takeover boom of the 1980s, nearly one-third of the largest U.S. manufacturers were acquired or merged. If companies were being traded like baseball cards, what executive had to worry about the long-term liability of the company’s products or practices?

Deregulation: “Deregulation” is the dismantling of existing regulations or their lax enforcement. Beginning in the 1970s, businesses argued that government regulations, not changes in the global economy, were a main cause of lower profits, and therefore these regulations should be suspended or “reformed.” In his successful 1980 campaign for president, Ronald Reagan promised business audiences that he would “turn you loose again to do the things I know you can do so well,” and delivered on his promise for regulatory relief by withdrawing, relaxing, or not enforcing dozens of regulations, including many of those passed in the previous two decades. One of Reagan’s contributions was to centralize regulatory oversight in the White House. As James Miller, the head of the newly created Vice President’s Task Force on Regulatory Relief (and an alumnus of the business think tank the American Enterprise Institute) put it, by claiming direct oversight, the president “would not have guerrilla warfare from agencies that don’t want to follow Reagan’s prescription for regulatory relaxation.” In other words, politics, not science, was to inform regulatory decisions.

One example of this deregulation helps explain the growth of marketing of unhealthy food to children. In 1984, the Federal Communications Commission lifted restrictions on television advertising to children that had been in place since the 1970s, opening the door to a flood of ads for fast food, soda, sweetened cereals, and candy targeting young children. That decade marked the beginning of the dramatic rise in child obesity. After the 2008 financial crisis, corporations renewed their war on regulation, charging that it (not their risky speculation) was preventing a return to economic growth.

Two other examples show how deregulation can harm health. In 1994, as a result of intense lobbying by vitamin and food supplement makers, the U.S. Congress passed the Dietary Supplement Health and Education Act (DSHEA), which limited the Food and Drug Administration’s authority to regulate supplements. Under the new Act, as long as manufacturers made no claims about their products’ treating, preventing, or curing diseases, the FDA had to prove they were harmful rather than the industry having the prior obligation to prove they were safe. Consumer Reports judged that “the law has left consumers without the protections surrounding the manufacture and marketing of over-the-counter or prescription medication.” Supplement manufacturers were now able to launch products without any testing at all, just by sending the FDA a copy of the language on the label. In 2010, a Government Accountability Office (GAO) report on FDA oversight of dietary supplements found that nearly all of the herbal dietary supplements that the GAO tested contained trace amounts of lead and other contaminants, and 16 of the 40 supplements tested contained pesticide residues that appeared to exceed legal limits. Among the illegal claims that supplement makers made were that a product containing ginkgo biloba could treat Alzheimer’s disease and that a product containing ginseng could prevent diabetes and cancer. The deregulation instituted by DSHEA endangered the health of consumers and spread misleading and deceptive health information, further complicating the task of nutrition educators accountable to the public rather than to corporations. These educators now needed not only to give people the facts to make informed food choices but also to counteract the better-funded misinformation campaigns that industry sponsored.

A study of alcohol regulation in the United Kingdom concluded that the deregulation of alcohol marketing that began in the 1960s and continues to the present has significantly increased the health-related harms caused by alcohol. Repealing laws that limited the hours and places of sales, and the pricing and marketing of beer, wine, and liquor contributed to increases in deaths from cirrhosis of the liver, hospital admissions for alcoholic liver disease and acute intoxication, and binge drinking among teenage girls. Compared to the United States, which still has more robust state and local alcohol regulations in place, the United Kingdom has higher rates of alcohol consumption, fewer alcohol abstainers, and a youth and childhood drinking rate more than twice the American rate. In 2013, bowing to pressure from the alcohol industry, the United Kingdom again missed an opportunity to remedy these problems by rejecting a proposal to institute regulations that would have used alcohol pricing to discourage excess use, a decision decried by public health advocates.

Tax relief: Another plank of the business plan for restoring profitability is tax relief. Although U.S. businesses paid lower corporate taxes than their counterparts in Europe and Japan, American businesses insisted that high taxes were a deterrent to economic growth and a drag on the U.S. economy. Beginning in 1980 with the Reagan tax cuts and continuing for the next three decades, U.S. corporations saw their tax rates fall. Between 1955 and 2010, the percentage of federal revenues generated by corporate taxes fell from 27.3 percent to 8.9 percent. In the same period, the percentage of the gross domestic product that came from corporate taxes fell from 4.3 percent to 1.3 percent. By 2010, compared to other nations, U.S. corporate taxes constituted a smaller percentage of the GDP (1.8 percent) than those in Australia (5.9 percent), Japan (3.9 percent), or Great Britain (3.6 percent). In 1978, Congress passed and President Carter signed a tax bill that cut the top rate of capital gains taxes from 48 percent to 28 percent, thus also reducing the taxes on the private investors who supplied corporations with the capital needed for expansion, including expansion of the industries that promoted hyperconsumption. By 2012, the effective corporate tax rate in the United States had dropped to 17.8 percent, about 40 percent of the 1960 rate.

Many loopholes that businesses won in the tax code further reduced corporate taxes. A 2008 Government Accountability Office study found that 55 percent of United States companies paid no federal income taxes during at least one year in the seven-year period it studied. It also found that from 1998 through 2005, two out of every three United States corporations paid no federal income taxes. One favored strategy for avoiding taxes is to shift profits to countries with low tax rates. According to a 2013 Congressional Research Service report, United States corporations operating in the top five tax havens (the Netherlands, Ireland, Bermuda, Switzerland, and Luxembourg) generated 43 percent of their profits in these countries but employed only 4 percent of their foreign employees and made only 7 percent of their foreign investment in these locations.

Lower corporate taxes, combined with lower taxes on the wealthy, contributed to government deficits and provided ammunition for the conservative argument that the United States could no longer afford a government that provided extensive services or took on ambitious regulatory efforts to protect public health or the environment. In fact, as President Reagan put it, government became the problem, not the solution. This represents an amazing bait-and-switch by big-business–minded leaders in the United States: by failing to tax businesses, they rob other government programs of the tax income needed to carry out their social functions. Then, when the government’s bottom line looks bleak due to the dearth of tax revenues, it’s the social programs, not the free-wheeling corporations, that get the blame and suffer the budget axe. The political support that corporate leaders have generated for this austerity program, despite its devastating impact on public health and poverty reduction, is one of their greatest triumphs.

In summary, lower rates for corporate taxes and capital gains taxes, combined with the lower personal income taxes for the wealthy inaugurated by President George W. Bush in 2001 and 2003, hurt the health of the public in three important ways. The tax cuts increased income inequality, a powerful contributor to health inequality. They deprived the government of revenues needed to maintain strong public health and other safety-net programs. And, by freeing capital for investment, they fueled the growth of the corporate practices that encouraged hyperconsumption, with its attendant increase in chronic diseases and injuries.

Privatization: “Privatization” is the transfer of services from the public to the private sector. Throughout the 1980s and 1990s, federal, state, and local governments sought to privatize public services such as education, healthcare, and policing. Such privatization creates new profit opportunities for the businesses that provide these services but also reduces public accountability and oversight. Also privatized was the enforcement of public health and environmental regulations. The rationale for such privatization was that businesses were better equipped than government to set and enforce their own rules, and that private enforcement was more efficient than public enforcement. Thus, many local health departments privatized environmental health services; some states privatized regulation of the retail alcohol industry; and national regulators such as the Occupational Safety and Health Administration (OSHA) and the USDA turned over responsibility for some safety inspections to the companies being inspected. Another rationale for privatizing enforcement was that national governments often lacked the mandate or expertise to monitor increasingly global exchanges, leading them to delegate such responsibility to private international organizations, often controlled by the very industry to be regulated.

How does privatization affect public health? While the specific impact depends on the details, the consistent scholarly criticisms of privatization include a loss of regulatory capacity, an increased share of costs shifted to profit, diminished oversight of privatized services, a tendency to allocate services based on cost rather than need, and the vulnerability of privatized services to market volatility. In general, privatization of regulatory or service functions reduces the power of government and increases the power of corporations. A study of privatization of tobacco companies in the former Soviet-bloc nations and other countries suggests that the process leads to increases in tobacco consumption. A review of privatization of water supplies in Latin America concluded that “privatization marked a troubling shift away from the conception of water as a ‘social good’ and toward the conception of water—and water management services—as commodities” and reduced access to clean water.

Threats from privatization continue. In 2013, the Obama Administration proposed new rules to protect the safety of food imported from other countries. Each year, 130,000 Americans are hospitalized and 3,000 die from contaminated food. About 15 percent of the U.S. food supply comes from abroad, often from countries with limited capacity to monitor food safety. Yet the FDA inspects only one to two percent of all food imports. Acknowledging the political reality that corporations and their conservative allies were unlikely to fund regulations that required independent oversight of the food industry, the FDA instead proposed that private companies like Walmart and Cargill inspect their own imports, thus delegating a core public health function to private industry. A pernicious long-term effect of privatization is that it further diminishes the public sector, the only actor with the capacity and resources to make protecting public health a priority.

Market concentration: “Market concentration” is the tendency for the number of companies producing specific goods or services to decrease, with the remaining firms becoming bigger and controlling more of the market, often driving smaller companies out of business. In theory, capitalism promotes competition, but in practice, markets often concentrate, reducing competition. In the 1970s and beyond, many major industries became increasingly concentrated, both nationally and globally. For example, between 1970 and 2002, the proportion of food-processing sales in the United States accounted for by the fifty biggest companies increased by 39 percent. In the alcohol industry, concentration was even more pronounced. Between 1979 and 2006, the ten largest global beer makers more than doubled their global market share, from 28 percent to 70 percent. This concentration left business decisions about what to produce in the hands of a few major corporations, diminishing the power of governments and consumers to shape markets.

It also made it easier for the remaining few big corporations to afford the most advanced technological, marketing, and research and development expertise and to compete successfully with smaller companies on price. Lower prices for unhealthy products result in greater population exposure and risk. These competitive advantages led to further concentration, giving the biggest global corporations an even stronger voice in shaping the economy, politics, and the environments in which individuals made consumption decisions. In systems theory, this is a “positive feedback loop” that amplifies a problem rather than corrects it. Since most economists predict further concentration of the global consumer industries, absent intervention, the health problems associated with market concentration can be expected to grow.

Changing corporate political practices increase their clout

These economic changes led to dramatic changes in corporations’ business practices, but Powell’s memo emphasized that corporations needed to launch a political campaign against their opponents. After 1971, corporations moved to occupy Washington, D.C., making it the headquarters of their counter-offensive.

The trickle of lobbyists flowing into Washington in the 1970s turned into a flood that all but drowned out the voices of citizens. By 1998, according to the Center for Responsive Politics, special-interest groups employed more than thirty-eight registered lobbyists and spent $2.7 million on lobbying for every member of Congress. Between 2000 and 2005 alone, the number of registered lobbyists in Washington, D.C., more than doubled, from 16,342 to 34,785, and annual spending on federal lobbying reached $2 billion.

To reinforce their Washington messages, corporations also beefed up lobbying at the state and local levels of government. After losing several key state battles, for example, the tobacco industry began hiring lobbyists in state capitals; by 1994, according to one study, at least 450 state-level lobbyists were working to resist tobacco-related legislation. In 2006, the Center for Public Integrity, an investigative journalism outlet, reported that companies and other organizations spent almost $1.3 billion to lobby state legislators, a 10 percent increase from the previous year. The 40,000 registered state lobbyists outnumbered state legislators five to one, and total spending on lobbying averaged $200,000 per legislator.

Campaign contributions provide another route to influence legislators. In 1971, in response to public pressure, Congress passed the Federal Election Campaign Act, which required disclosure of campaign financing. After the Watergate scandal, Congress strengthened the Act in 1974 by creating a comprehensive system of regulation and enforcement, including public financing of presidential campaigns and the establishment of a central enforcement agency, the Federal Election Commission. In 1976, however, in the Buckley v. Valeo decision, the U.S. Supreme Court (including Lewis Powell) struck down most limits on candidate expenditures and certain other limits on spending, calling such rules unconstitutional infringements of free speech. This and subsequent court decisions opened the door for a growing corporate role in election financing.

Thirty years later, in the 2006 congressional elections, corporate political action groups contributed $120 million to congressional candidates, a 33 percent increase from the 2002 elections. As the electoral winds shifted towards the Democrats that year, so did corporate contributions. The pharmaceutical, tobacco, and insurance companies that had previously heavily favored Republican candidates began to hedge their bets, increasing their contributions to Democratic candidates. “Our approach to our political contributions,” David Howard, a spokesman for Reynolds American Tobacco Company, told the Wall Street Journal, “is that we support those who will support us or will give us an ear.”

By 2012, another record year in campaign spending, outside corporate and PAC contributions became a primary source of funding for both Democratic and Republican congressional and presidential candidates. Total reported spending exceeded $4 billion, probably an underestimate. The U.S. Chamber of Commerce, which describes itself as the world’s largest business association, reported independent spending of $32,676,075 in the 2012 election cycle. However, according to the Sunlight Foundation, an independent monitor of campaign spending, only 6.9 percent of the Chamber’s spending supported winning candidates. The day after the election, Gregory Casey, head of the Business Industry Political Action Committee, a rival of the Chamber that encourages political participation by corporations, told the Washington Post, “We learned you cannot address the fiscal and cultural differences in our society by throwing money at political dogmas that may have outlived their usefulness.”

A year later this Wednesday-morning quarterbacking seemed to have been forgotten. In a 2013 speech to small businessmen, Thomas Donohue, the president of the Chamber of Commerce, sounded like he was channeling Lewis Powell. He exhorted the audience to “defend and advance a free-enterprise system” whenever it “comes under attack” and to give politicians in Washington a piece of their minds. “It’s about time,” he thundered, “that our leaders in Washington start making the tough decisions that we pay them to make!”

Corporations had long used the revolving door between the corporate suite and government offices in Washington to shape policy by sending top executives to advise presidents, and offering corporate jobs to departing public officials. In 1953, for example, General Motors president Charles Erwin Wilson became Eisenhower’s Secretary of Defense. Wilson’s famous quote, “What’s good for the country is good for General Motors, and vice versa,” demonstrated the belief that there was no conflict between the interests of the country and of a corporation. Later, Ford president Robert McNamara left Detroit to become John F. Kennedy’s Secretary of Defense.

What changed in the 1970s was the extent to which politics and business interests became entangled. In 1973, in an example that symbolized the revolving door, President Nixon appointed Earl Butz, then a director of Ralston Purina, a major company that produced food for people and animals, as Secretary of Agriculture, replacing Clifford Hardin, who then became head of Ralston Purina. Nixon, and later President Ford, gave the new Secretary Butz two explicitly political tasks: first, to help resuscitate a declining economy in order to win farm-state support for Nixon, and later, to bring down the rising cost of food to increase Gerald Ford’s popularity.

Butz took on these tasks with enthusiasm. But his political chores gave him the opportunity to take on a grander task: making government not the protector of farmers and consumers, but the expeditor for agribusiness, helping it become the dominant force shaping the production and distribution of the world’s food. Butz negotiated a deal to sell U.S. grains to the Soviets, leading to record rates of inflation of food prices in the United States but assuring profits for food companies. By 1975, six major corporations controlled 90 percent of the world’s $11 billion a year grain-export business. Subsequently, Butz used federal policy to favor large farms at the expense of smaller ones (his advice: “Get big or get out”) and to support subsidized production of corn, soy, and wheat, the staples of the industrial food system.

Over time, as Butz’s policies increased production, prices fell. Cheap corn, which the food writer Michael Pollan has called “the dubious legacy of Earl Butz,” became the building block of fast food, high-fructose corn syrup, and super-sized sodas. By allowing market forces to drive farm policy, Butz planted the seeds of the food system that is causing the current crop of diet-related chronic diseases. As it turned out, what was good for Ralston Purina and other big food companies was not so good for global health.

Public relations firms and law firms also grew enormously in this period in both number and size, largely as a result of corporate clients. The art of public relations first emerged in the 1920s. Using new insights from psychology, one of PR’s founders, Edward Bernays, a nephew of Sigmund Freud, helped corporate clients mold mass opinion. As Bernays recounts in his autobiography, George Washington Hill, the president of American Tobacco Company, asked Bernays in 1929, “How can we get women to smoke on the street? They’re smoking indoors but damn it, if they spend half the time outdoors, we’ll damn near double our female market. Do something. Act!” Bernays obliged by setting out to identify and then modify the beliefs that prevented women from smoking in public. Acting behind the scenes, he persuaded a feminist leader to invite women to march in the 1929 Easter Parade under the slogan “Women! Light another torch of freedom! Fight another sex taboo.” Young women marched down Fifth Avenue puffing Lucky Strikes, attracting wide newspaper coverage. Bernays had offered corporations a tool to manufacture, not new products, but the demand for them.

In his work for the Beech-Nut Packing Company, a leading pork producer, Bernays had the opportunity to contribute to another twentieth-century epidemic: heart disease. To convince Americans that a heavy breakfast of bacon, a Beech-Nut product, and eggs promoted health, Bernays surveyed 5,000 doctors, asking them whether heavy breakfasts supported health. In a national advertising campaign under the headline “Physicians urge heavy breakfasts to improve health,” he presented their favorable response as if it were a scientific study, a technique of third-party endorsement later used by other industries to promote unhealthy products. Sales of bacon soared.

From 1980 to 2000, the number of public relations specialists increased by 56 percent, with the largest number working in the corporate sector. Many new hires went to the mega-PR firms that emerged. WPP, founded in 1985 and self-described as the world’s largest communications services group, employs 153,000 people in 2,400 offices in 107 countries. Burson-Marsteller, established in 1953, became the world’s largest public relations firm by 1983. Edelman, the largest independent PR company, employed 3,600 staff in 53 cities around the world. Some companies became the go-to source for corporations with a health-related image problem. Hill & Knowlton has been employed by 50 percent of global Fortune 500 companies and has helped the tobacco, infant formula, lead, vinyl chloride, and other industries fight off government regulation of products associated with health harms.

Increasingly, these companies became communications supermarkets, with the ability to provide corporations with help on lobbying, crisis management, and strategic planning, as well as more traditional public relations. Working on similar issues for multiple clients enabled these PR companies to gain the expertise needed to master the common tasks their clients expect: creating favorable opinion for a new product or brand, resisting government regulation, managing a crisis due to a safety threat, or turning back a new tax. As government downsized, the corporate capacity to manipulate government super-sized, further tilting the asymmetrical power relationship.

Businesses also expanded existing trade associations, like the Chamber of Commerce and the National Association of Manufacturers, while creating new ones to bring the corporate policy agenda to elected officials. In the 1960s, according to business historian David Vogel, trade associations were “under-staffed, relatively unsophisticated, and were held in little regard by either the companies that belonged to them or the legislators whose views they were supposed to influence.” By 1980, more than 2,000 trade organizations, employing 42,000 staff, had their headquarters in Washington, D.C. For the first time since the 1920s, when Secretary of Commerce Herbert Hoover pushed for the establishment and growth of trade associations, the total number of people working for private businesses such as trade associations and lobbying, law, and PR firms exceeded the number of federal employees in the Washington metropolitan area.

A new environment for health decisions

By the 1980s, these changes in business and politics had transformed the environment in which investors and corporate managers made decisions, narrowing their options for satisfying Wall Street, while widening their opportunities for using their political clout to overcome opposition to their business plans.

On the business side, corporate managers really had only a few choices that allowed them to satisfy investors’ growing thirst for quick and steady returns on investment. They could create blockbuster products that produced windfall profits for at least a few quarters or years. They could dramatically increase market share by finding new populations of customers or by driving competitors out of business. They could buy and sell companies, hoping to profit by selling off assets or acquiring fast-growing businesses. Finally, companies could increase profits by firing workers, cutting benefits, or using their political muscle to lower taxes or resist or weaken regulations, thereby leaving more revenue as profit. On the political front, corporations expanded each of these opportunities by deploying their new armies of lawyers, lobbyists, public relations staff, and trade associations.

Each of these options had health consequences, but two, the marketing of blockbuster products and the relentless efforts to develop new markets, have had direct and particularly harmful results for people’s health. It is the growth of these corporate practices that set the stage for the global epidemics of chronic diseases and injuries.

Blockbusters satisfy investors, who then reward the managers who have brought them to market. But as the stories of Saturday night specials, SUVs, and Vioxx show, new products often have flaws or unintended consequences, and the massive advertising and retail distribution needed to achieve blockbuster status mean that millions of people can be exposed to a product before the hazard becomes apparent. In addition, given the benefits that blockbusters bring, managers feel justified in cutting corners, resisting safety regulations, exaggerating the benefits, or minimizing the harms. After all, given that chronic conditions take years to develop and that faulty product design may be difficult to detect without counting bodies, it is likely that managers will have moved on and companies changed hands before problems are identified.

A second option for corporate managers who are expected to produce positive quarterly returns is to find new markets where rapid expansion is possible. As we have seen, the tobacco industry discovered women as new recruits for tobacco use in the 1920s. After the 1990s, the industry turned its attention to women and young people in Africa, Asia, and Latin America. In this situation, an industry took an existing product and marketed it to new customers. The marketing of alcopops and guns to young women and unhealthy foods to blacks and Latinos in the United States are other examples of this strategy.

Excerpted from “Lethal but Legal: Corporations, Consumption, and Protecting Public Health” by Nicholas Freudenberg. Copyright © 2014 by Nicholas Freudenberg. Reprinted by arrangement with Oxford University Press, a division of Oxford University. All rights reserved.


By Nicholas Freudenberg


