2016-03-10

BOARDING A PLANE RECENTLY, I overheard a young woman ask her seatmates if they had snacks with them that might contain nuts. As people began rifling through their carry-ons, she explained she was “crazy allergic” to peanuts and apologized for the hassle. She told everyone in the rows around her that she was more than happy to pay for any replacement snacks.

A man sitting next to her dug a granola bar out of his bag, scrutinized its list of ingredients and asked, “How do you go to restaurants? Or out to dinner parties?” Half-jokingly, she shrugged and said, “I just don’t eat.”

The woman next to them lamented that it seems as if everyone alive has some type of allergy. She then confessed to being tormented by grass and ragweed pollen each summer. An older man across the aisle added that his grandson was also severely allergic to peanuts, so he could sympathize.

Silently, I commiserated with her, too, as I jotted down notes on the conversation. I’m a medical anthropologist, not a journalist, which means I study health and illness from a social and cultural perspective. Last winter, after contracting viral bronchitis for the fourth time in as many months, I was diagnosed with allergies. The lining of my nose and throat was so visibly irritated that I had to take steroids for six months to tamp down the inflammation of my airways. My immune system had been completely overwhelmed, making me more susceptible to infection. In other words, my allergies, like those of millions of other people, were literally making me sick. But until my own suffering began, I never realized just how many people have some type of allergy.

Today the American College of Allergy, Asthma, and Immunology estimates that nasal allergies affect approximately 50 million people in the United States alone. If you add in other forms of allergy, those numbers are even more staggering. A recent study published in the Journal of Allergy and Clinical Immunology revealed that nearly 46 percent of the adult US population and 36 percent of its children show signs of sensitization to one or more allergens. According to the US Centers for Disease Control, over the past decade alone, the incidence of food allergies has significantly increased in children younger than age 18. Any way you look at it, that’s a lot of people sneezing, itching, and otherwise irritated.

Shortly after my diagnosis, I began looking for more information on allergies. What I really wanted were a few comprehensive overviews on the topic written for a general audience, anything that might help me start piecing together what was causing the rise of allergies in industrialized nations all around the globe. After listening to me complain for the umpteenth time about my own allergies and how little has been written on the subject outside of the medical literature, a good friend finally asked, “Why don’t you write about it?”

The truth is that until the moment I was diagnosed, I had been assiduously avoiding thinking too much about allergies. Allergies, at their best, are frustrating annoyances that can be remedied with medication. At their worst, they are frightening and life-threatening reactions to the world around us.

In 1996, a single bee flew into an open car window and stung my father on his neck as he idled at a stop sign. Allergic to bee venom, my dad immediately went into anaphylactic shock. His neck ballooned, cutting off his air supply. Within minutes, he went into full cardiac arrest and was dead by the time paramedics arrived.

During the summer months, I carry an injection of epinephrine — just in case I find myself on the wrong end of a stinger. I don’t know if I inherited my father’s allergy to bee venom because my skin is nonreactive to the scratch tests used to determine which substances trigger allergic reactions. I have hidden — but still very real — allergies. I take allergy medication on a daily basis and, for the most part, it does the trick. But simply seeing a bee is enough to start my heart racing and my mind spinning. I don’t like to think about what could happen to me if I’m stung. I don’t like to think about my dad’s death. My diagnosis last winter finally forced me to confront my family’s allergic past, so I did what any good scholar would do: I started a research project on the rise of allergies in the United States.

Since I began the project, I’ve collected stories about people wrangling with dust mites, doing daily battle against gluten or lactose or eggs, trying to avoid unusually high pollen counts or mold spores, breaking out into itchy rashes after switching laundry detergents or soaps, and ending up in the ER after ingesting a prescribed dose of penicillin or a stray shrimp. I’ve listened to family physicians talk about the difficulty in diagnosing and treating allergies, and interviewed specialists about the severity of the problem. This year, I began informally polling people about what they wanted to know about allergies. Almost everyone’s answers converged on the same set of questions: Are allergies getting worse? Are they any more prevalent today than they were 10, 20, or even 50 years ago — or are we just getting better at recognizing or diagnosing them? And if allergies really are getting worse, then why — are we to blame somehow?

Are allergies getting worse?

This straightforward question seems like it should be simple to answer, but as all three of the medical histories I consulted make clear, allergy is a particularly vexing malady to diagnose, to study, and to treat. For starters, as a medical condition, it’s notoriously hard to define or to categorize. What, for instance, is the difference between an irritation, an intolerance, and a full-blown allergy? From a scientific perspective, it’s not always so easy to tell, especially when a patient’s reaction is mild. This is partially due to the fact that the field of allergy medicine has been riddled with difficulties from its start.

According to Mark Jackson, a medical historian at the University of Exeter and author of Allergy: The History of a Modern Malady, early allergists had to cope with a lack of uniform testing, reliable methods for diagnosis, and standard treatment protocols. Our troubles with allergy began with a murky collection of symptoms. In the early 1800s, as the industrial revolution rumbled on, doctors began to notice a growing number of patients presenting with severe and lasting summer or “rose colds.” As Jackson notes, “hay fever” only emerged as a well-defined clinical disorder after 1819. Its entrance into the medical lexicon spurred more diagnoses until hay fever had become, by the 1930s, “the fourth most common form of chronic disease and a major public health concern in the United States.” Yet as late as the 1980s, there simply were no “standard best practices” in allergy medicine.

Compounding these problems, Jackson argues that early epidemiological studies that focused on allergies were confounded by “methodological and conceptual problems” and “insufficient information.” This left doctors and allergy sufferers in the lurch; patients had to rely on the know-how and expertise of individual allergists for relief, which meant that, until fairly recently, allergy medicine was more art than science.

Adding to the problem of rigorously defining “allergy,” allergists had and continue to have a hard time pinpointing which substances are causing wayward immune reactions in the first place. Our bodies’ immune response to allergens often disguises itself or mimics symptoms of other serious ailments. Because of this, it can take years of doctors’ appointments, suffering, and testing before a patient’s troubles are diagnosed as an allergy and their symptoms finally alleviated. This is especially the case, as Matthew Smith, a medical historian at The Centre for the Social History of Health and Healthcare at Glasgow Caledonian University, brilliantly explores in his book, Another Person’s Poison: A History of Food Allergy, if those allergic reactions involve food.

Unlike respiratory allergies and asthma, whose rates have remained relatively constant, the rate of food allergies has risen significantly in recent years. But, complicating matters, this rise is partially the effect of improvements in diagnostic technologies and our scientific understanding of immune mechanisms. Food allergies, since they are often nonreactive to skin tests, are “invisible” and rely on the subjective experiences and observations of the patients to diagnose correctly. This requires, as Smith notes, a more intimate relationship between allergist and patient — one based upon mutual trust.

Smith’s book is a fascinating overview of the contested history and meanings of food allergy over the past century. Food allergies have become a proving ground for the validity of immunological research and allergy medicine. A chimerical object, “food allergy” has allowed different groups of experts — orthodox allergists and clinical ecologists among them — to stake out territory and claim expertise. But the fight over the legitimacy of things like “gluten intolerance” isn’t just about semantics or redefining what allergy is. The mysterious rise of peanut allergies in children in the 1990s catapulted food allergies — and allergies writ large — into the public spotlight. As Smith argues, “Peanut allergy made allergy matter.” But this raises the question: Did the realization that nut allergies were increasing cause an artificial bump in nut allergy diagnoses?

The answer is, as all three of these histories suggest, both yes and no. When awareness of a particular allergy is raised, it triggers a subsequent rise in diagnoses. Those new patients — and the agglomeration of all their varied and various symptoms — generate a reconfiguration of the definition of allergy to include a wider range of indicators for future diagnosis. This “looping effect” can make it hard to separate the biology of an allergy from our social and cultural understanding of it, which in turn makes it very hard to tell if allergies are really getting worse or whether we’re just getting better at recognizing and categorizing them.

In focusing on respiratory allergies, however, Gregg Mitman, a professor of History of Science, Medical History, and Environmental Studies at the University of Wisconsin–Madison, comes to a starkly different conclusion in his book Breathing Space: How Allergies Shape Our Lives and Landscapes. He argues that worsening allergies are directly linked “to changes in the natural and built environment over the last 150 years.” Mitman suggests that our scientific focus on better understanding the individual patient’s reactions to allergens, over that of the possible environmental factors in their causation, has led to a worsening of the problem. We ignore at our own peril the larger social and structural issues at play, such as poverty’s effect on asthma and allergy rates. For Mitman, allergy is an “ecological” problem — not strictly a biological or genetic one.

Are we to blame somehow?

Once I discovered I had allergies, I immediately began to speculate on their origin. Were allergies genetic? Maybe I had inherited more than my hair color or height from my father. Then again, perhaps allergies were environmental. For decades, I had been living in sprawling urban areas, replete with air pollution, dirt, rat feces, and cockroaches. But the more I searched for a possible cause, the more I began to hypothesize about the cumulative effects of things like genetically modified food, climate change, and manmade chemicals on our overall health. I couldn’t help but wonder if we were making ourselves sick. Are allergies simply a symptom of a much larger manmade problem?

It turns out that the answer, again, is both yes and no.

Globally, increasing allergies are positively correlated with industrial development and a growing middle class, so it’s clear that something we are doing is affecting how our immune systems function. What is far less certain is what, specifically, has been causing our allergies to worsen. At least in terms of our scientific understanding, the rise of allergies over the past two centuries remains a puzzle. But that hasn’t stopped us from speculating. As these histories highlight, the fear that we are the architects of our own sneezing, itching, and suffering is nothing new.

Early allergies, as Mitman argues, were seen as nature’s way of “taking revenge” against humans for their rapacious greed. Physicians and pundits saw hay fever and asthma as idiosyncratic reactions to a human world out of balance with nature. And since the rise of densely populated urban areas and increasing industrialization tracked with an observable increase in respiratory allergies, experts ventured that allergies were the result of increasingly “unnatural” living conditions. Early sufferers of hay fever were thought too sensitive for their urban environments. The solution, for those who could manage it, was relocation to an environment with cleaner, healthier air. Such “health tourism” helped to craft entire cities in Colorado, Arizona, and California as destinations for hay fever sufferers. Those who were unable to afford relocation were left to cope as best they could.

In the early days of immunology, allergies were symbolic of individuality in the worst possible way. Allergies were described in the early medical literature as “individuality run mad” or as “immunity gone astray,” placing the blame for allergies squarely on the individual and his or her faulty immune system rather than the environment. As scientific research on immunity advanced, the thinking shifted. Allergy was no longer just about individual “nervousness,” psychological weakness, or physical frailty, but about individual biological reactions to allergens. This made nature the enemy of the allergy sufferer, rather than part of the cure. In their battle against the ill effects of nature, scientists developed the first scratch tests for pollen in 1914, while botanists busied themselves measuring pollen counts and creating pollen maps. After World War II, the story shifted again, as experts began suspecting environmental chemicals and food additives as the origin of our allergy woes. Matthew Smith notes that as early as the 1950s, clinical ecologists (as opposed to orthodox allergists) started to worry about the “allergenicity of food additives” and “fomented alarm about the modern diet more generally.”

Ultimately, the historical narratives about the causes of allergy are always the same. The “something-is-out-of-whack” argument posits that our own ingenuity, scientific advancement, and technological prowess have thrown us out of an imagined equilibrium with nature. The logic here is simple and seems sound enough: anything natural is healthy, whereas anything manmade is probably not. The corollary, of course, is that “untouched” nature is more healthful than anything manmade or urban. Western civilization itself, as Jackson suggests, was cast as inherently harmful to our health. The solution could be as simple as “getting back to nature,” eating organic, or using more “natural” products.

Jackson is much more critical of this reasoning than Smith or Mitman. Smith is largely agnostic on the causes of allergy and prefers to focus on how the meanings of allergy medicine have changed over time. But as an allergy sufferer himself, Mitman has little trouble blaming increased air pollution and climate change for our worsening respiratory allergies (a theory that has correlation on its side, but — as of this writing — no direct causation).

Urbanization, pollution, and alterations to agricultural practices have always been blamed for increases in disease rates — even when those links have had no scientific basis. Medical historian Charles Rosenberg argues that by the mid-19th century, the claim that “nature is better” was already old hat — and mostly wrong. Rural farmers in the 1800s were dying of cancer and tuberculosis just as readily as city dwellers, but those facts didn’t get in the way of people believing that living closer to “nature” was somehow better for them. It’s worth quoting Rosenberg at length here:

In one sense, this ironic and persistent emphasis on the role of civilization in the causation of disease is no more than a cliché, a variation of traditional primitivistic notions, endless evocations of lost worlds in which humankind had not been corrupted by wealth and artifice — all versions and reiterations of the Garden of Eden’s Faustian bargain recast in epidemiological terms.

But before you think we’re wiser than our forebears, think again. The rise in popularity of all-natural and organic products and the palpable fear of genetically modified food are indications that we still believe that natural is better than modified or processed, and are still as anxious about the ill effects of industrial chemicals, pollution, and pesticides as we’ve ever been. As Eula Biss writes in her best-selling book On Immunity, “In this context, fear of toxicity strikes me as an old anxiety with a new name. Where the word filth once suggested, with its moralist air, the evils of the flesh, the word toxic now condemns the chemical evils of our industrial world.”

¤

And I guess that’s what we’re left with in lieu of answers: a general fear that something we’re doing — either to ourselves or to our environment — is making us very sick. We want to be responsible for the allergy epidemic because if that were really true, then it would also mean that we could govern our medical fates. If we caused the allergies in the first place, then we might be able to wrest some control back over our immune systems by eating organic foods, drinking filtered water, or breathing cleaner air.

What reading these histories together really shows us, however, is how foolish we might be in thinking that allergies are, at root, a manmade problem with a manmade solution. The truth is that we simply don’t know enough about our allergies to come up with more definitive answers or effective solutions. Going a step further, it’s also possible that our troubles with allergy stem directly from the evolution of our immune system itself. When it comes to germs, our immune system’s task is to protect us from infection. We need robust and highly responsive immune systems to battle viruses, bacteria, and parasites. But when it comes to allergens, our immune system’s “job” is less clear. We need less reactive immune systems to coexist with pollen, dust mites, and dairy products. In effect, allergies are really a problem of our immune system doing its job too well. It’s a paradox that has no easy solution.

¤

Theresa MacPhail is an assistant professor of Science & Technology Studies at Stevens Institute of Technology. She has a BA in journalism from the University of New Hampshire and a PhD in medical anthropology from the University of California, Berkeley.
