The birth of food-phobia

How industrialization, bad science and middle-class paranoia made us irrationally terrified of contamination

Published March 24, 2012 11:45AM (EDT)

(Dhoxax via Shutterstock)

This article is adapted from the new book "Fear of Food," from University of Chicago Press.

At the root of our anxiety about food lies something that is common to all humans — what Paul Rozin has called the “omnivore’s dilemma.” This means that unlike, say, koala bears, whose diet consists only of eucalyptus leaves and who can therefore venture no further than where eucalyptus trees grow, our ability to eat a large variety of foods has enabled us to survive practically anywhere on the globe. The dilemma is that some of these foods can kill us, resulting in a natural anxiety about food.

These days, our fears rest not on wariness about some new plant we have just come across in the wild, but on what has been done to our food before it reaches our tables. They are the natural result of the growth of a market economy that inserted middlemen between the producers and consumers of food. In recent years, the ways in which industrialization and globalization have completely transformed how the food we eat is grown, shipped, processed, and sold have ratcheted up these fears much further.

As a glance at any web page exposing “urban legends” will indicate, an amazing number of bizarre fears about our food supply (often involving Coca-Cola) are always floating about. These might provide some insight into the nature of conspiracy theories, but they are not the kind of fears that interest me. What interests me are fears that have had the backing of the nation’s most eminent scientific, medical, and governmental authorities. Many of these authorities were the most eminent nutritional scientists of their time. The government agencies involved were staffed by experts at the top of their fields. Yet, as we shall see, many of the fears they stoked turned out to be either groundless or at best unduly exaggerated. Others involved frightening all Americans about things that should concern only a minority. These scares, and the anxiety about food they have created, result from a confluence of forces that since the end of the nineteenth century have transformed how Americans eat and think about their food.

First and foremost is something that contemporary home economists often noted: that for many years the production and preparation of food has been steadily migrating out of the home. In the country’s early years, when 90 percent of Americans lived on farms, the outsiders handling their food were mainly millers and vendors of essentials like salt and molasses. One usually had a personal, trusting relationship with these suppliers, who were often neighbors.

By the late nineteenth century, though, industrialization, urbanization, and a transportation revolution had transformed the nation. Cities boomed, railroads crisscrossed the nation, massive steamships crowded its ports, and urbanites were provided with foods that were not only not grown by neighbors — they were not even grown in neighboring countries. Large impersonal companies were now in charge of the canning, salting, refining, milling, baking, and other ways of preserving and preparing foods that had previously been done at home or by neighbors.

All along the way, the foods passed through the hands of strangers with plenty of opportunities to profit by altering them, to the detriment of the quality and the healthfulness of the foods. There was, then, plenty of reason to mistrust what had happened to food before it reached the table.

These natural concerns were heightened by modern science. In the late nineteenth century, nutritional scientists discovered that food was not just undifferentiated fuel for the human engine. Rather, they said, it consisted of proteins, fats, and carbohydrates, each of which played a different role in preserving health. Only scientists could calculate how much of each was necessary. They then laid much of the groundwork for modern anxiety about food by warning that taste was the least reliable guide to healthy eating.

At the same time, fears were stoked by the new germ theory of disease. That contemporary Americans fear food more than the French is rather ironic, for many modern food fears originated in France. It was there, in the 1870s, that the scientist Louis Pasteur transformed perceptions of illness by discovering that many serious diseases were caused by microscopic organisms called microbes, bacteria, or germs. This “germ theory” of disease saved innumerable lives by leading to the development of a number of vaccines and the introduction of antiseptic procedures in hospitals. However, it also fueled growing fears about what industrialization was doing to the food supply.

Of course, fear of what unseen hands might be doing to our food is natural to omnivores, but taste, sight, smell (and the occasional catastrophic experience) were usually adequate for deciding what could or could not be eaten. The germ theory, however, helped remove these decisions from the realms of sensory perception and placed them in the hands of scientists in laboratories.

By the end of the nineteenth century, these scientists were using powerful new microscopes to paint an ever more frightening picture. First, they confirmed that germs were so tiny that there was absolutely no way that they could be detected outside a laboratory. In 1895 the New York Times reported that if a quarter of a million of one kind of these pathogenic bacteria were laid side by side, they would only take up an inch of space. Eight billion of another variety could be packed into a drop of fluid. Worse, their ability to reproduce was nothing short of astounding. There was a bacillus, it said, that in only five days could multiply quickly enough to fill all the space occupied by the waters of Earth’s oceans.

The reported dangers of ingesting germs multiplied exponentially as well. By 1900 Pasteur and his successors had shown that germs were the cause of deadly diseases such as rabies, diphtheria, and tuberculosis. They then became prime suspects in many ailments, such as cancer and smallpox, of which they were innocent. In 1902 a U.S. government scientist even claimed to have discovered that laziness was caused by germs. Some years later Harvey Wiley, head of the government’s Bureau of Chemistry, used the germ theory to explain why his bald head had suddenly produced a full growth of wavy hair. He had discovered, he said, that baldness was caused by germs in the scalp and had conquered it by riding around Washington, D.C., in his open car, exposing his head to the sun, which killed the germs.

America’s doctors were initially slow to adopt the germ theory, but public health authorities accepted it quite readily. In the mid-nineteenth century, their movement to clean up the nation’s cities was grounded in the theory that disease was spread by invisible miasmas — noxious fumes emanating from putrefying garbage and other rotting organic matter. It was but a short step from there to accepting the notion that dirt and garbage were ideal breeding grounds for invisible germs.

Indeed, for a while the two theories coexisted quite happily, for it was initially thought that bacteria flourished only in decaying and putrefying substances — the very things that produced miasmas. It was not difficult, then, to accept the idea that foul-smelling toilets, drains, and the huge piles of horse manure that lined city streets harbored dangerous bacteria instead of miasmas. Soon germs became even more frightening than miasmas. Scientists warned that they were “practically ubiquitous” and were carried to humans in dust, in dirty clothing, and especially in food and beverages.

The idea that dirt caused disease was accepted quite easily by middle-class Americans. They had been developing a penchant for personal cleanliness since early in the nineteenth century. Intoning popular notions such as “cleanliness is next to godliness,” they had reinforced their sense of moral superiority over the “great unwashed” masses by bathing regularly and taking pride in the cleanliness of their houses. It was also embraced by the women teaching the new “domestic science” in the schools, who used it to buttress their shaky claims to be scientific. They could now teach “bacteriology in the kitchen,” which meant learning “the difference between apparent cleanliness and chemical cleanliness.”

But the most fearsome enemy in the war against germs was the annoying housefly. Dr. Walter Reed’s investigations of disease among the American troops who invaded Cuba during the Spanish-American War in 1898 had famously led to the discovery that mosquitoes carried and spread the germs that caused yellow fever. But yellow fever was hardly present in the United States. Much more important was his subsequent discovery that houseflies could carry the bacteria causing typhoid to food, for that disease killed an estimated 50,000 Americans a year. Although Reed’s studies had shown that this could happen only when flies were practically immersed in human, not animal, excrement, his observations soon metamorphosed into the belief that during the war typhoid-carrying flies had killed more American soldiers than had the Spanish. This was buttressed by a kind of primitive epidemiology, wherein experts noted that typhoid peaked in the autumn, when the fly population was also at its peak.

Of course, it took much more than just some frightening ideas to arouse Americans about their food. The immense amounts of money in the food industries meant that, inevitably, huge financial stakes were involved as well. However, the stakeholders were not just the usual suspects, the large corporations that dominated food production. They also included much less mendacious interests.

Well-meaning public health authorities sought to demonstrate their importance by issuing exaggerated warnings about food dangers. Home economists helped justify their role in the education system by teaching how proper eating would avoid life-threatening diseases. During World War II, the federal government propagated the misguided notion that taking vitamins would make up for the deficiencies caused by food processing and help the nation defend itself from invasion. After the war, nonprofit philanthropies such as the American Heart Association raised billions of dollars in donations to spread the message that eating the wrong foods was killing millions of Americans. Scientific and medical researchers were awarded many more billions in government and corporate grants for studies warning about the dangers of eating fats, sugar, salt, and a host of other foods.

But the resulting food fears needed a receptive audience, and that is precisely what middle-class Americans were primed to be. By the early twentieth century, they had become the dominant force in American culture. Because they mainly lived in cities and large towns, they benefited most from the innovations in transportation, processing, and marketing that led to greatly expanded food choices. However, the resulting erosion of the reassuring personal relationships between sellers and buyers made them particularly susceptible to food scares. The media now became their major source of information about the safety of their food. And since much of this information was scientific in origin, it was the middle-class media — “quality” newspapers and magazines and, later, radio and television news and public affairs shows — that played the major role in disseminating it.

The residual Puritanism of the American middle class also helped make them susceptible to food fears. A culture that for hundreds of years encouraged people to feel guilty about self-indulgence, one that saw the road to salvation as paved by individual self-denial, made them particularly receptive to calls for self-sacrifice in the name of healthy living. This helped them lend a sympathetic ear to scientific nutritionists’ repeated warnings that good taste — that is, pleasure — is the worst guide to healthy eating.

By the end of the twentieth century, this guilt-ridden culture seemed to have weakened, as the notion that self-indulgence would benefit both individuals and society gained ground. However, at the heart of this more solipsistic view of life there still lay the old idea that illness and death were the result of an individual’s own actions, including — and often especially — how they ate. (“Americans,” a British wag has remarked, “like to think that death is optional.”) As a result, middle-class Americans have lurched from worrying about one fear of alimentary origin to another, with no apparent end in sight.

The spread of AIDS in the 1980s once more aroused public fears of infectious disease. A slew of movies, television programs, and newspaper and magazine articles on bioterrorism stirred this pot. Commercial interests smelled opportunity and began stoking germophobia to promote their soaps, detergents, and even prescription drugs. In 1992 a marketing journal reported that four years of this promotion of “germ warfare” had led to “amazing” growth in sales of home hygiene products. Then, in 1997, GOJO Industries introduced a consumer version of Purell, the hand sanitizer it had developed for the medical profession, and mounted a fearmongering advertising campaign to promote it. Its success spurred other companies to join in and use germophobia to promote a host of competing disinfectants. GOJO then responded with Purell 2-Go, small bottles of which could be attached to backpacks, lunch boxes, and key chains, so that people could disinfect their way through the outside world.

The SARS scare of 2003 heightened germophobia even more. Even though SARS was caused by a virus, not a bacterium, most Americans were unaware of the difference. They began snapping up a panoply of germicidal products such as contraptions that sprayed disinfectant on doorknobs, portable subway straps (for those who did not have “City Mitts” anti-microbial gloves), and, to combat germs in airplanes, the Air Supply Ionic Personal Air Purifier. A book called "Germs Are Not for Sharing" instructed children in how to play without touching each other. Those fearing germs on fruits and vegetables could buy the Lotus Sanitizing System, which used an electric charge to infuse tap water with ozone that would kill the bacteria on them — a process not unlike the Electrozone treatment that was purported to turn seawater into a disinfectant a hundred years earlier.

By then, of course, the water supply had once again fallen under suspicion, leading to the enormous growth of the bottled water industry. However, nothing could shake Americans’ confidence that breakfast cereals, dairy products, and other foods in brightly colored, tightly wrapped packages were free of harmful bacteria. It all amounted to impressive testimony to the ongoing effectiveness of advertising campaigns such as those mounted by Kellogg’s and Nabisco in the early twentieth century.

This represented yet another irony: that many of the fears that originated in the industrialization of the food supply were ultimately dissipated by the kind of packaging and marketing that were integral parts of it.

Excerpted with permission from "Fear of Food: Why We Worry About What We Eat," by Harvey Levenstein, from University of Chicago Press.


By Harvey Levenstein

Harvey Levenstein is professor emeritus of history at McMaster University in Hamilton, Ontario. He has published a number of books on American history, including "Revolution at the Table: The Transformation of the American Diet" and "Paradox of Plenty: A Social History of Eating in Modern America."


