Our Diseases, Our Selves

Over the past few weeks, I’ve been following coverage of the Institute of Medicine’s recent recommendation of a new name and new diagnostic criteria for chronic fatigue syndrome. In a 250+ page report, the IOM, a division of the National Academy of Sciences, proposed that the disease be renamed “systemic exertion intolerance disease.” This would link it more closely with its central feature while distancing it from a designation many patients see as both demeaning and dismissive of the serious impairment that can accompany the condition. It’s a move that has been applauded by a number of advocates and researchers who study the disease, although others caution that more work is needed to develop a definitive test, as well as medications that can effectively treat it.


The disorder, which is also called myalgic encephalomyelitis (ME), is characterized by persistent fatigue lasting for more than six months, muscle and joint pain, unrefreshing sleep, and post-exertional malaise. Estimates of the number of affected Americans vary widely, from 836,000 to as many as 2.5 million.* I was struck by the divergence of these numbers, as well as by the following statistic, which might explain why: the condition goes undiagnosed in an estimated 84 to 91 percent of patients. This could be the result of physicians’ lack of familiarity with ME/CFS, doubt about the seriousness of symptoms, or a belief that the patient is making up or exaggerating the extent of the illness. But regardless of your perspective on the disease, that’s an alarming rate of underdiagnosis.

As I’ve been perusing the responses from patients and comments from the public debating the nature of the disorder, I’ve noticed that reactions to the IOM recommendations tend to fall into one of two camps. One group is sympathetic to the disease and those who suffer from it, urging compassion, education, and continued research; not surprisingly, this group seems to consist mainly of patients with ME/CFS, people who have friends or relatives with it, and physicians who treat them. The second group sees patients as malingerers who are overstating their symptoms to get special consideration; they blame our modern lifestyle for inducing widespread fatigue in our population and point to the lack of a conclusive diagnostic test as evidence that the disease doesn’t exist.

All of this brings me to the following question, which I think is relevant not just to the current discussion but to the entire enterprise of Western medicine: what makes a disease “real”? When are diseases worthy of sympathy and concern, insurance reimbursement, research money, and pharmaceutical therapies, and when are they considered to exist only within a patient’s imagination? Few people in the twenty-first century would dispute, for instance, that pneumonia, malaria, and yellow fever are caused by particular microorganisms, their presence in the body detectable through various tests of blood and fluids. But what about conditions for which we have not yet identified a specific pathology? Does the lack of a clear mechanism of causation mean that those affected by a disease are suffering any less? Are a patient’s perceived symptoms enough for an ailment to be considered “real”?

I’m distinguishing here between “disease,” which is a pathological condition that induces a particular set of markers of suboptimal health in an individual, and “illness,” which is the patient’s experience of that disease. Naming a disease confers legitimacy; being diagnosed with it assigns validity to a patient’s suffering, gives him a disease identity, and connects him with a community of the afflicted. And if naming a disease confers a degree of legitimacy, then outlining a biological mechanism for it bestows even more. Disorders with an identifiable pathology are “real,” while all others are suspect. But this process is subject to revision. As the history of medicine shows us, a number of conditions that were once thought to be caused by a lack of morality and self-control, namely alcoholism and addiction, are now considered real. Others, including hysteria, chlorosis, neurasthenia, and homosexuality, were once classified as diseases but are no longer recognized as such.

“Disease,” as Charles Rosenberg reminds us, “does not exist until we have agreed that it does, by perceiving, naming, and responding to it.” It always occurs within a social context and makes little sense outside the social and cultural environment in which it is embedded. That is why, to varying degrees, what physicians are responding to is a patient’s subjective assessment of how she is experiencing disease: the level of pain, the physical disability, the fatigue, the fever, the extent to which an ailment is interfering with her life.

To say that diseases can exist independently of us is to misunderstand their fundamental nature as human concepts and social actors. They are not mere biological events, but are made legible and assigned meaning through our system of fears, morals, and values. Whether the proposed name change from chronic fatigue syndrome to systemic exertion intolerance disease will lead to greater acceptance for the disorder and those who suffer from it remains to be seen. But it’s brought attention to the process by which we define and name diseases. The ways in which we explain their causation and assign responsibility and blame set forth standards for acceptable behavior and delineate the boundaries of what we consider normal. Our relationship with disease reveals how we understand ourselves as a society. All diseases are therefore both not real and real—not real in the sense that they wouldn’t exist without us, and real because we have agreed that they do.


*By way of comparison, about 5 million Americans are currently living with Alzheimer’s and about 1.2 million have HIV.


Sources:

Charles E. Rosenberg and Janet Golden, eds. Framing Disease: Studies in Cultural History. New Brunswick, NJ: Rutgers University Press, 1992.


Medicating Normalcy


When I was in elementary school in the 1970s, I was friends with a boy who was considered hyperactive, which I vaguely understood to mean that he had excess energy and was therefore not supposed to eat sugar. He was occasionally disruptive in class and often had trouble focusing on group activities. My friend seemed to be constantly in motion, bouncing up from his chair during spelling tests and sprinting through the playground at recess, unable to keep still or remain quiet for any length of time. Another classmate, a girl, was a year older than the rest of us because she had been held back a grade for academic reasons. She was “slow,” a term we used at the time to refer to someone with a cognitive developmental disability.

If these two were growing up today, there’s a good chance they would be diagnosed with an attention disorder and medicated with a drug such as Adderall or Concerta. While A.D.H.D. has been around for a while—it’s been listed in the Diagnostic and Statistical Manual of Mental Disorders in some form since at least 1968—the rate at which children are diagnosed with it has skyrocketed over the past few decades. As Alan Schwarz reported several months ago in the New York Times, the number of children taking medication for A.D.H.D. has increased from 600,000 in 1990 to 3.5 million today, while sales of stimulants prescribed for the condition rose more than fivefold in just one decade, from $1.7 billion in 2002 to nearly $9 billion in 2012. And researchers recently identified a new form of attention disorder in young people. Called “sluggish cognitive tempo,” it’s characterized by daydreaming, lethargy, and slow mental processing. It may affect as many as two million American children and could be treated with the same medications currently used for A.D.H.D.

This apparent epidemic of behavioral disorders in children highlights the convergence of a number of factors. In the late 1990s, changes in federal guidelines allowed the direct marketing of drugs to consumers, prompting increased awareness of disordered behaviors such as those that characterize A.D.H.D. Pharmaceutical companies routinely fund research into illnesses for which they manufacture drug therapies. As Schwarz (again) found, some of the chief supporters of sluggish cognitive tempo have financial ties to Eli Lilly, whose drug Strattera is one of the main medications prescribed for A.D.H.D. At the same time, overworked teachers in underfunded school districts lack the capacity to give special attention to rambunctious students, and instead urge parents to medicate them to reduce conflict in the classroom. Most important, the definition of what constitutes “normal” has narrowed. Forty years ago, my unruly friend who wanted to run around during reading time and my absentminded classmate who forgot to write her name on her tests fell toward the extremes on the spectrum of normal behavior. Today they might be diagnosed with A.D.H.D. or sluggish cognitive tempo and given medication to make them less rowdy or more focused.

Normal childhood behavior these days means paying attention in class, answering all the questions on tests, turning in homework on time, and participating in classroom activities in a non-disruptive way. Children today, in short, are expected to be compliant. There will always be those who lack the ability to conform to an ever-constricting range of what constitutes normal behavior. For families with the access and the interest, pharmaceutical companies offer drugs designed to bring these young people within the bounds of what we consider acceptable.