Life: The disorder

More and more adults and teens are popping pills for ADD, "generalized anxiety disorder" and other quasi-societal conditions. Is it time to retire our moralistic distinction between "recreational" and "medical" drugs?

Published November 25, 2005 12:00PM (EST)

They would show up weekly, pulling into my driveway because I wasn't yet old enough to drive: desperate, chronically studious college kids looking for a fix. The year was 1995. I was 15 years old, an acne-spattered high school sophomore who had become, through a peculiar sequence of events I'll get to soon enough, an accidental dealer of Ritalin to those whose doctors had deemed them ineligible for a prescription. Undergrads who couldn't keep their eyes open while perusing Plato, law students with reading loads that would give Harold Bloom an aneurysm, medical residents who deemed sleep a disease -- they all flocked to me, paying between $3 and $5 for pills that converted their minds into binge-studying, test-devouring, world-dominating machines. Until my stockpile dried up, I constantly had at least 70 bucks burning a hole in my pocket. For a kid in the burbs who had food and shelter more or less covered by his mother, this was the equivalent of a doctor's salary.

OK, I know, that analogy is a stretch, though it's the one that comes to my mind as I stare, feeling faintly prophetic, at this recent headline in the New York Times: "Use of Attention-Deficit Drugs Is Found to Soar Among Adults." The article points out that a new study, by Medco Health Solutions, reveals that use of ADD medications has doubled among those between the ages of 20 and 44 in the past four years. Why's that? Partly because the first heavily medicated generation of teens is now drifting into adulthood and still renewing their prescriptions, and partly because new diagnoses are steadily increasing. "Adult ADD" -- full name: attention deficit hyperactivity disorder -- appears to be on the cusp of making the transition so many psychosocial disorders have made before it: from unheard of to skeptically acknowledged to culturally sanctioned.

Earlier this year it was the subject of a sober cover story in the New York Times Magazine, after which, curiously, television advertisements for Strattera, Eli Lilly's drug for adult ADD, suddenly seemed impossible to avoid. Robert S. Epstein, Medco's chief medical officer, tells the Times that the current data indicates "a clear recognition and new thinking that treatment for A.D.H.D. does not go away for many children after adolescence." Another doctor, James McGough of UCLA, adds that still more adults should be on such drugs -- a sentiment echoed a few days later on the "Today" show by Dr. Edward Hallowell, author of "Delivered From Distraction: Getting the Most Out of Life With Attention Deficit Disorder."

Is it me, or is there something peculiar going on here? Adults have taken what began as a controversial adolescent disorder and coolly co-opted it as their own, as if there were never any doubts about its legitimacy. In the '70s, when ADD drugs were first being tested, they were among the few psychotropic meds for which clinical trials involved children and teens first. The thinking was simple: The adolescent years are ones of hormonal pandemonium that make focusing on pre-calc next to impossible for many; pills like Ritalin eased the pain. In time such reasoning was applied to younger kids -- twitchy, foot-tapping 7- and 8-year-olds, too. As for adults? It was assumed that growing up meant, well, growing up, and that taking such pills would be frowned upon as a crutch. But today's revised attitude has it that the trouble one had memorizing state capitals or grasping the quadratic formula may be similar to the trouble one has listening to that PowerPoint presentation. We are all -- or many of us are, potentially -- antsy kids spaced out in the back of the class.

How fitting.

We live in a society where it's increasingly difficult to differentiate between adults and kids. Go to a mall, squint your eyes, and see if you can tell the difference between the alarming 18-year-olds who seem 35 and the much more alarming 35-year-olds trying to pass for 18. A case can be made that recognizing adult ADD isn't so much an enlightened leap in Western medicine as a questionable evolution in a culture that recently welcomed the dubious word "adultescent" into the 2005 edition of Webster's New World College Dictionary. (This, on the heels of "teensploitation," made official in 2004.) More than anything, Medco's findings can be thought of as a small step toward the reinvention of how we view adulthood. Since time immemorial grown-ups have made a point of telling children that adulthood isn't easy -- that it's a constant exercise in (cue affectedly furrowed brows) doing things you don't want to do. But such preaching suddenly sounds archaic, doesn't it, when the same adult superpowers are now patting themselves on the back for acknowledging that those PowerPoint presentations may, like Pink Floyd's "The Wall," be a whole lot more palatable on drugs?

I don't mean to sound overly flip. I'm simply finding it hard not to see this as a tragicomic step toward the classic concern about psychotropic drugs: the redefining of life as a disorder. Think, for a moment, about antidepressants. Over the past decade they shifted from being adult-only drugs to being acceptably prescribed (off-label) to children and teens. How come? On the one hand, it was recognized that children suffered from serious depression, and that certain pills were remarkably effective treatments. But at the same time the definition of treatable depression was watered down -- rebranded as social anxiety disorder, panic disorder and, my personal favorite, generalized anxiety disorder -- to include a seemingly endless demographic of adults and children. So just as we recognized that adult disorders can be applied to children, we are now, with ADD, noting that those of childhood can be applied to adults. It's hard not to imagine a future in which the smallest hardships (trouble studying, stress over a breakup, or perhaps a desire to prevent such nuisances) lead seamlessly to a fully medicated existence starting well before the onset of adulthood.

If this sounds a tad too Huxleyan, consider this: Another recent Medco study points out that during the same period in which adult prescriptions for ADD medications doubled, there was an 85 percent increase in sleeping pill prescriptions for teens. The cause? No one knows for sure, but the most popular theory is that it just might have something to do with all those stimulants kids are being fed to focus on that test. "It leads you to wonder," Dr. Epstein tells the Times, "whether these children are being treated for insomnia caused by hyperactivity or whether the medication itself causes insomnia." It also leads you to wonder, Dr. Epstein, if medication to medicate the effects of medication is what we'll soon be touting as a medical advance. Or, more pressingly, if we truly gain anything but superficial relief by continuing to believe these drugs are growing in popularity because they treat disorders rather than because they enhance performance.

It's hip to pronounce today's world increasingly pressurized and achievement-oriented, to say we live in a time that's "more competitive than ever," and so forth. Time magazine recently polled 501 13-year-olds, and 67 percent were positive, just positive, without any real evidence, that being young is harder today than it was when their parents were kids. Ask 501 45-year-olds the same question and I'll bet you'll get the same answer, for the same socially masochistic reasons that prompt people to slap "Life's a Bitch and Then You Die" bumper stickers on their cars. No doubt being a kid has changed in the past decade or so: Childhood is more micromanaged, more attuned to the potential for dire consequences, less about play, more about work -- in short, more adultlike. But I have trouble buying into the sepia-toned myth that it was "easier" or "simpler" to be either a kid or an adult in, say, the '40s and '50s. What has shifted is how we perceive life's trials: Dealing with them head-on used to be virtuous; now chemically vaporizing them is.

But must it be only one or the other? It seems especially stubborn -- dare I say immature -- that the medical community refuses to acknowledge just how much certain psychotropic drugs blur the line between the biochemical and the societal. Even more peculiar is that while we usher in a state of permanent medication, selective dosing is still viewed as "recreational" and "risky." What's interesting about ADD drugs is that they are remarkably effective regardless of how your brain looks when scanned, achieving what for centuries we've turned to coffee to accomplish, with about the same potential for side effects. So here's a radical thought: Why not just put them in the same category? After all, which is worse: continuing to find ways to define the everyday in terms of disorders until we're all taking pills to curb the effects of other pills, or admitting that we've synthesized substances that can help adults and children alike, from time to time and in different doses, take the edge off in a way that doesn't throw you off track? To me it seems more honest this way, more grown-up, and less likely to rouse our collective inner voices into an anxious chorus constantly wondering what's "wrong" with us.

The argument against this pro-enhancement mind-set, of course, is that it breeds addiction. But to refute this, one need only look at a fact D.A.R.E. counselors hate to admit about illegal drugs: Most people who do them never become addicted, and many people smoke pot and do coke much the way they "drink responsibly," and for many of the same reasons (relaxation, focus, confidence boosting) that people ask their doctors if a variety of pills is right for them. Really, it comes down to whether we want to view lifestyle pharmaceuticals as something indulged in passively or actively -- as a healthy reinvention of adulthood or a submissive rejection of the difficulties and responsibilities that come with growing up. There are no easy answers here, but until these questions are the ones brought up on the "Today" show -- instead of the dog-and-pony act of whether the drugs are being "abused" -- we are, as a psychiatrist might say, in a state of denial.

All of which brings me back to high school, to those days of petty dealing. Here's how I came into my Ritalin stash in the first place: One afternoon, a girl came up to me while I was smoking a joint behind the auto-shop building and proposed that, in exchange for some Ritalin, I give her some pot. "My parents made me test for ADD so I could do better in school," she said, "but it's bullshit and the Ritalin makes me all tweaky. I don't need it, so I just pretend to take it, to make my parents happy. Yeah, they're total freaks. Anyway..." The girl was one of these naturally well-adjusted types who longs to be maladjusted, the sort who shellacs her scalp with Manic Panic hair dye while listening to Green Day bootlegs in order to mute the prescient knowledge that, in a few years, she'll be practicing corporate law and calling Ansel Adams a madcap genius. I guess she wanted to impress me by seeming nonchalantly badass, because in exchange for a dime sack of what was essentially stems and seeds, she gave me an entire year's worth of Ritalin.

What to do with it? I had no clue. Back in the dark days of 1995, the idea that "everyone had ADD" was already well ingrained in our cynical minds. But pill popping wasn't yet so common among suburban teenagers, and at least among my friends nothing seemed less cool than doing a drug to be able to do your homework, the very thing we took drugs to forget about. The only kid I knew who'd played around with Ritalin recreationally was the brother of my sixth-grade girlfriend, a scrappy numbskull who even at age 12 was a cautionary tale of what not to become. "Dude, it's awesome," he told me when he found out I was sitting on a pile of Ritalin, adding, as if this were a good thing: "My nose won't stop bleeding!" One day I heard that college students were paying for Ritalin. I abandoned the idea of experimentation and went into business.

All of which would be a classic tale of teenage deviance if it didn't so closely resemble what the doctors of America are now calling a medical advance. The people coming to me, with their fidgety frontal lobes and desire to get ahead, were young adults whose doctors, back in the square '90s, would have questioned their motives had they asked for some Ritalin. So they relied on me, the kid who felt like a grown-up selling them drugs. Today, meanwhile, they'd likely be able to find a physician to supply what back then only a high school dealer could, though only under the rubric of diagnosis, of acknowledging that something is "wrong" with them. This is what we're calling progress? Seems we have a ways to go.


By David Amsden

David Amsden, a contributing editor at New York magazine, is the author of the novel "Important Things That Don't Matter," which is now available in paperback. He lives in Brooklyn.
