Johnson & Johnson’s Baby Powder: Harmless Household Product or Lethal Carcinogen?

On Monday night, a jury in St. Louis awarded $72 million to the family of a woman who died of ovarian cancer after using Johnson & Johnson’s baby powder and other talcum-based products for years. The verdict came after a three-week trial in which lawyers for the plaintiff, an Alabama woman named Jacqueline Fox, argued that Johnson & Johnson had known of the dangers of talcum powder since the 1980s and concealed the risks. The corporation’s lawyers countered that the safety of talcum powder was supported by decades of scientific evidence and that there was no direct proof of a causal link between its products and Fox’s cancer.

Fox used Johnson & Johnson’s baby powder and another talc-based product called Shower to Shower for 35 years. “It just became second nature, like brushing your teeth,” her son said. “It’s a household name.” The company has come under fire in recent years from consumer safety groups for the use of questionable ingredients in its products, including formaldehyde and 1,4-dioxane, both of which are considered likely carcinogens. Fox’s case was the first to reach a monetary award among some 1,200 lawsuits pending nationally against the company.

The case bears a notable resemblance to the lawsuits against the tobacco companies, with attorneys for both the plaintiff and the defendant taking a page from the playbook of their respective sides. Fox’s lawyers claimed that Johnson & Johnson’s own medical consultants warned in internal documents of the risk of ovarian cancer from hygienic talc use, just as tobacco companies knew for decades that smoking caused lung cancer but sought to suppress the evidence. And the pharmaceutical giant responded as the tobacco industry did in the numerous lawsuits it faced in the 1980s and 1990s: by creating doubt about the mechanism of cancer causation and upholding the safety of its products.

I find this case uniquely disturbing because the image of Johnson & Johnson’s baby powder as a household product that evokes a sense of comfort and protection is so at odds with the jury’s finding that it caused or contributed to a case of fatal ovarian cancer. The company appears to be right in claiming that the scientific evidence is inconclusive: some studies have shown a slightly increased risk of ovarian cancer among women who use products containing talcum powder, while others have found no link. It’s important to note that until the 1970s many talc-based products were contaminated with asbestos, so people who used them before that time were exposed to a known carcinogen. Still, the research is unsettled enough that the American Cancer Society advises people who are concerned about talcum powder to avoid using it “[u]ntil more information is available.”

Without examining the trial transcripts or interviewing the jurors, it’s impossible to know for sure what factors influenced the verdict. I imagine the tobacco settlements have irrevocably changed the environment surrounding these types of lawsuits—that there’s a sizable segment of the American public which is understandably suspicious of large corporations trying to conceal research about the health risks of their products. I suspect there’s also an element of causation and blame at work here, about wanting to assign responsibility for a disease that remains, for the most part, perplexing and impenetrable. We all make choices that affect our health on a daily basis, from what kind of shampoo to use to what to eat for lunch, and we want assurance that the repercussions of the decisions we make with good intentions will be in our best interest. But as the unprecedented $72 million verdict shows, we have an immense uneasiness about the dangers lurking behind the most benign-seeming household products. And we fear that those products, rather than benefiting us, will instead do us harm.

Shooting the Moon on Cancer

During his final State of the Union address on January 12th, President Obama announced a “moonshot” to cure cancer and appointed Vice President Biden to lead it. The announcement is reminiscent of Nixon’s 1971 War on Cancer, which was envisioned as an all-out effort to eradicate the disease by marshaling the kinds of resources and scientific knowledge that two years earlier had sent a man to the moon. By most measures we’re much better off now than we were four decades ago: cancer treatments have improved drastically, people are living longer after diagnosis, and mortality rates have been falling since their peak in the early 1990s. But as anyone who has been touched by cancer can attest—and in the United States, that’s nearly all of us—the war is far from over.

Biden, whose son died of brain cancer last year, outlined a plan that’s essentially twofold: to increase public and private resources in the fight against cancer, and to promote cooperation among the various individuals, organizations and institutions working in cancer research. The initiative will likely lead to increased funding for the National Institutes of Health, a prospect that has many scientists giddy with anticipation. But the complexities of cancer, which are now much clearer than in 1971, underscore the multiple challenges confronting us. On NPR, one researcher described how the same form of cancer can act differently in different people because of the immense number of genetic distinctions between us. And in the New York Times, Gina Kolata and Gardiner Harris pointed out that the moonshot reflects an outmoded view of cancer as one disease rather than hundreds, and the idea of discovering a single cure is therefore “misleading and outdated.”

Nixon’s initiative reflected an optimistic faith that scientific progress could vanquish a disease that many regarded with dread. In polls, articles, and letters, Americans at the time debated whether they’d want to be told of a cancer diagnosis and worried about being in close contact with cancer patients. The disease’s many unknowns generated fears of contracting it, desperation about the pain and debilitation associated with it, and plenty of unorthodox cures (this was, after all, the era of laetrile and Krebiozen).

Much has changed in the intervening decades, and if you’re diagnosed with a form of cancer today, you’d undoubtedly have a better prognosis than in 1971. But one aspect of cancer in American culture has not changed, and that’s the mystique surrounding the disease. Cancer is not the leading cause of death in the US—heart disease takes top honors—but it remains the most feared. It occupies an outsize place in the landscape of health and wellness, suffering and death. As such, it demands a bold approach. Winning the “war on cancer” would necessitate breaking down the disease into types and subtypes: not cancer, but cancers; not cancer, but ductal carcinoma in situ and acute myeloid leukemia and retinoblastoma. But this would dilute its power as a singular cultural force, an adversary that (the thinking goes) with a massive coordinated input of resources could be vanquished once and for all.

Photo credit: Cecil Fox, National Cancer Institute

Biden’s moonshot doesn’t just reproduce an outmoded idea of cancer; it is dependent upon it. It also promotes research and therapies in search of a cure at the expense of prevention, which his plan fails to mention even once. Innovative new treatments that showcase scientific advancement are flashy and exciting, unlike lifestyle recommendations around nutrition or exercise, or widespread public health efforts to reduce the presence of environmental carcinogens. There’s also the issue of how to measure progress toward a goal as elusive as curing cancer. Is it in decreased incidence or mortality rates? In lowering the number of new diagnoses to zero? Perhaps the moonshot should instead focus on reducing the human suffering associated with cancers by emphasizing prevention and addressing inequalities that affect health and health outcomes. It’s an objective that’s unquestionably less spectacular than curing cancer, but certainly more achievable.

 

Sources:

James T. Patterson, The Dread Disease: Cancer and Modern American Culture. Harvard University Press, 1987.

 

History on Screen: The Knick, William Halsted, and Breast Cancer Surgery

Recently I watched the first episode of The Knick, a new series on Cinemax that revolves around the goings-on at a fictitious hospital in turn-of-the-century New York. It stars Clive Owen as Dr. John Thackery, a brilliant and arrogant surgeon who treats his coworkers contemptuously but earns their grudging respect because he’s so darn good at his job. I’ve read that the show draws on the collections and expertise of Stanley Burns, who runs the Burns Archive of historical photographs. As a medical historian, I suppose it’s an occupational inevitability that I would view The Knick with an eye toward accuracy. Mercifully, I found the show’s depiction of the state of medicine and public health at the time to be largely appropriate: the overcrowded tenements, the immigrant mother with incurable tuberculosis, the post-surgical infections that physicians were powerless to treat in an age before antibiotics. I was a bit surprised by one scene in which Thackery and his colleagues operate in an open surgical theater, their sleeves rolled up and street clothes covered by sterile aprons as they dig their ungloved hands into a patient; while not strictly anachronistic, these practices were certainly on their way out in 1900. But overall, I was gratified to see that the show’s producers seem to be taking the medical history side of things seriously, even if they inject a hefty dose—or overdose—of drama.

A temperamental genius, Thackery thrives on difficult situations that call for quick thinking and improvisation. He pioneers innovative techniques, often in the midst of demanding surgeries, and invents a new type of clamp when he can’t find one to suit his needs. He is also a drug addict who patronizes opium dens and injects himself with liquid cocaine on his way to work. The character appears to be based on William Stewart Halsted, an American surgeon known for all of these qualities, right down to the drug addiction. Born in 1852 to a Puritan family from Long Island, he attended Andover and Yale, where he was an indifferent student, and the College of Physicians and Surgeons, where he excelled. After additional training in Europe, he returned to the US to begin his surgical career, first in New York City, then at Johns Hopkins Medical School. In addition to performing one of the first blood transfusions and being among the first to insist on an aseptic surgical environment, he was famously a cocaine addict, having earlier begun experimenting with the drug as an anesthetic. His colleagues covered for his erratic behavior, turning a blind eye when he arrived late for operations or missed work for days or weeks at a time. Twice he was shipped off to the nineteenth-century version of rehab, where doctors countered his cocaine addiction by dosing him with morphine. Although Halsted remained an addict all his life, he managed it well enough that by the time he died in 1922 he was considered one of the country’s preeminent surgeons and the founder of modern surgery.

Halsted pioneered another modern innovation, as well: the overtreatment of breast cancer. In the late nineteenth century, women often waited until the disease had reached an advanced stage before seeking medical treatment. As historian Robert A. Aronowitz writes, clinicians “generally estimated the size of women’s breast tumors on their initial visit as being the size of one or another bird egg.” When cancer was this far along, the prognosis was poor: more than 60 percent of patients experienced a local recurrence after surgery, according to figures compiled by Halsted.

In the 1880s, Halsted began working on a way to address these recurrences. Like his contemporaries, he assumed that cancer started as a local disease and spread outward in a logical, orderly fashion, invading the closest lymph nodes first before dispersing to outlying tissues. Recurrences were thus the result of a surgeon acting too conservatively, not removing enough tissue and leaving cancerous cells behind. The procedure he developed, which would become known as the Halsted radical mastectomy, removed the entire breast, underarm lymph nodes, and both chest muscles en bloc, or in one piece, without cutting into the tumor at all. Halsted claimed astonishing success with his operation, reporting in 1895 a local recurrence rate of six percent. Several years later, he compiled additional data that, while less impressive than his earlier results, still outshone what other surgeons were accomplishing with less extensive operations: 52 percent of his patients lived three years without a local or regional recurrence.

By 1915, the Halsted radical mastectomy had become the standard operation for breast cancer in all stages, early to late. Physicians in subsequent decades would push Halsted’s procedure even further, going to ever more extreme lengths in pursuit of cancerous cells. At Memorial Sloan-Kettering Hospital in New York, George T. Pack and his student, Jerome Urban, spent the 1950s promoting the superradical mastectomy, a five-hour procedure in which the surgeon removed the breast, underarm lymph nodes, chest muscles, several ribs, and part of the sternum before pulling the remaining breast over the hole in the chest wall and suturing the entire thing closed. Other surgeons performed bilateral oophorectomies on women with breast cancer, removing both ovaries in an attempt to cut off the estrogen that fed some tumors. While neither of these procedures became a widely utilized treatment for the disease, they illustrate the increasingly militarized mindset of cancer doctors who saw their mission in heroic terms and considered a woman’s state of mind following the loss of a breast, and perhaps several other body parts, to be, at best, a negligible consideration.

The Halsted radical mastectomy was on its way out by the late 1970s; within a few years, it would comprise less than five percent of breast cancer surgeries. The demise of Halsted’s eponymous operation had several causes. First, data from cancer survivors showed that the procedure was no more effective at reducing mortality than simple mastectomy, or mastectomy combined with radiation. Second, the radical mastectomy was highly disfiguring, leaving women with a deformed chest where the breast had been, hollow areas beneath the clavicle and underarm, and lymphedema, or swelling of the arm following the removal of lymph nodes. As the women’s health movement expanded in the 1970s, patients grew more vocal about insisting on less disabling treatments, such as lumpectomies and simple mastectomies.

William Stewart Halsted

Halsted’s life and the state of surgery, medicine and public health at the turn of the twentieth century are a rich source of material for a television series, with the built-in drama of epidemic diseases, inadequate treatments, and high mortality rates. But Halsted’s legacy is complicated. He pushed his field forward and introduced innovations, such as surgical gloves, that led to better and safer conditions for patients. But he also became the standard-bearer for an aggressive approach to breast cancer that in many cases resulted in overtreatment. The Halsted radical mastectomy undoubtedly prevented thousands of women from dying of breast cancer, but for others with small tumors or less advanced disease it was surely excessive. And hidden behind the statistics of the number of lives saved were actual women who had to live with the physical and emotional scars of a deforming surgery. The figure of the heroic doctor may still be with us, but the mutilated bodies left behind have been forgotten.


Sources:

Robert A. Aronowitz, Unnatural History: Breast Cancer and American Society. Cambridge University Press, 2007.

Barron H. Lerner, The Breast Cancer Wars: Hope, Fear, and the Pursuit of a Cure in Twentieth-Century America. Oxford University Press, 2001.

Howard Markel, An Anatomy of Addiction: Sigmund Freud, William Halsted, and the Miracle Drug Cocaine. Pantheon, 2011.

James S. Olson, Bathsheba’s Breast: Women, Cancer & History. Johns Hopkins University Press, 2002.