History on Screen: The Knick, William Halsted, and Breast Cancer Surgery

Recently I watched the first episode of The Knick, a new series on Cinemax that revolves around the goings-on at a fictitious hospital in turn-of-the-century New York. It stars Clive Owen as Dr. John Thackery, a brilliant and arrogant surgeon who treats his coworkers contemptuously but earns their grudging respect because he’s so darn good at his job. I’ve read that the show draws on the collections and expertise of Stanley Burns, who runs the Burns Archive of historical photographs. As a medical historian, I suppose it’s an occupational inevitability that I would view The Knick with an eye toward accuracy. Mercifully, I found the show’s depiction of the state of medicine and public health at the time to be largely appropriate: the overcrowded tenements, the immigrant mother with incurable tuberculosis, the post-surgical infections that physicians were powerless to treat in an age before antibiotics. I was a bit surprised by one scene in which Thackery and his colleagues operate in an open surgical theater, their sleeves rolled up and street clothes covered by sterile aprons as they dig their ungloved hands into a patient; while not strictly anachronistic, these practices were certainly on their way out in 1900. But overall, I was gratified to see that the show’s producers seem to be taking the medical history side of things seriously, even if they inject a hefty dose—or overdose—of drama.

A temperamental genius, Thackery thrives on difficult situations that call for quick thinking and improvisation. He pioneers innovative techniques, often in the midst of demanding surgeries, and invents a new type of clamp when he can’t find one to suit his needs. He is also a drug addict who patronizes opium dens and injects himself with liquid cocaine on his way to work. The character appears to be based on William Stewart Halsted, an American surgeon known for all of these qualities, right down to the drug addiction. Born in 1852 to a Puritan family from Long Island, he attended Andover and Yale, where he was an indifferent student, and the College of Physicians and Surgeons, where he excelled. After additional training in Europe, he returned to the US to begin his surgical career, first in New York City, then at Johns Hopkins Medical School. In addition to performing one of the first blood transfusions and being among the first to insist on an aseptic surgical environment, he was famously a cocaine addict, having earlier begun experimenting with the drug as an anesthetic. His colleagues covered for his erratic behavior, looking the other way when he arrived late for operations or missed work for days or weeks at a time. Twice he was shipped off to the nineteenth-century version of rehab, where doctors countered his cocaine addiction by dosing him with morphine. Although Halsted remained an addict all his life, he managed it well enough that by the time he died in 1922 he was considered one of the country’s preeminent surgeons and the founder of modern surgery.

Halsted pioneered another modern innovation, as well: the overtreatment of breast cancer. In the late nineteenth century, women often waited until the disease had reached an advanced stage before seeking medical treatment. As historian Robert A. Aronowitz writes, clinicians “generally estimated the size of women’s breast tumors on their initial visit as being the size of one or another bird egg.” When cancer was this far along, the prognosis was poor: more than 60 percent of patients experienced a local recurrence after surgery, according to figures compiled by Halsted.

In the 1880s, Halsted began working on a way to address these recurrences. Like his contemporaries, he assumed that cancer started as a local disease and spread outward in a logical, orderly fashion, invading the closest lymph nodes first before dispersing to outlying tissues. On this theory, recurrences were the result of a surgeon acting too conservatively, removing too little tissue and leaving cancerous cells behind. The procedure he developed, which would become known as the Halsted radical mastectomy, removed the entire breast, underarm lymph nodes, and both chest muscles en bloc, or in one piece, without cutting into the tumor at all. Halsted claimed astonishing success with his operation, reporting in 1895 a local recurrence rate of six percent. Several years later, he compiled additional data that, while less impressive than his earlier results, still outshone what other surgeons were accomplishing with less extensive operations: 52 percent of his patients lived three years without a local or regional recurrence.

By 1915, the Halsted radical mastectomy had become the standard operation for breast cancer in all stages, early to late. Physicians in subsequent decades would push Halsted’s procedure even further, going to ever more extreme lengths in pursuit of cancerous cells. At Memorial Sloan-Kettering Hospital in New York, George T. Pack and his student, Jerome Urban, spent the 1950s promoting the superradical mastectomy, a five-hour procedure in which the surgeon removed the breast, underarm lymph nodes, chest muscles, several ribs, and part of the sternum before pulling the remaining breast over the hole in the chest wall and suturing the entire thing closed. Other surgeons performed bilateral oophorectomies on women with breast cancer, removing both ovaries in an attempt to cut off the estrogen that fed some tumors. While neither of these procedures became a widely utilized treatment for the disease, they illustrate the increasingly militarized mindset of cancer doctors who saw their mission in heroic terms and considered a woman’s state of mind following the loss of a breast, and perhaps several other body parts, to be, at best, a negligible consideration.

The Halsted radical mastectomy was on its way out by the late 1970s; within a few years, it would account for less than five percent of breast cancer surgeries. The demise of Halsted’s eponymous operation had several causes. First, data from cancer survivors showed that the procedure was no more effective at reducing mortality than simple mastectomy, or mastectomy combined with radiation. Second, the radical mastectomy was highly disfiguring, leaving women with a deformed chest where the breast had been, hollow areas beneath the clavicle and underarm, and lymphedema, or swelling of the arm following the removal of lymph nodes. Finally, as the women’s health movement expanded in the 1970s, patients grew more vocal about insisting on less disabling treatments, such as lumpectomies and simple mastectomies.

William Stewart Halsted


Halsted’s life and the state of surgery, medicine, and public health at the turn of the twentieth century are a rich source of material for a television series, with the built-in drama of epidemic diseases, inadequate treatments, and high mortality rates. But Halsted’s legacy is complicated. He pushed his field forward and introduced innovations, such as surgical gloves, that led to better and safer conditions for patients. But he also became the standard-bearer for an aggressive approach to breast cancer that in many cases resulted in overtreatment. The Halsted radical mastectomy undoubtedly prevented thousands of women from dying of breast cancer, but for others with small tumors or less advanced disease it was surely excessive. And hidden behind the statistics of the number of lives saved were actual women who had to live with the physical and emotional scars of a deforming surgery. The figure of the heroic doctor may still be with us, but the mutilated bodies left behind have been forgotten.


Robert A. Aronowitz, Unnatural History: Breast Cancer and American Society. Cambridge University Press, 2007.

Barron H. Lerner, The Breast Cancer Wars: Hope, Fear, and the Pursuit of a Cure in Twentieth-Century America. Oxford University Press, 2001.

Howard Markel, An Anatomy of Addiction: Sigmund Freud, William Halsted, and the Miracle Drug Cocaine. Pantheon, 2011.

James S. Olson, Bathsheba’s Breast: Women, Cancer & History. Johns Hopkins University Press, 2002.

Your Beard Is Full of Tuberculosis

Victorian beard

On a crowded L train to Williamsburg one recent evening, I clasped my hand around the subway pole and scanned the multitude of hipster men surrounding me. As I studied the slim trousers ending just above sockless ankles, the plaid shirts encasing concave torsos, and the array of earnest tote bags, I spotted several men with full beards. Apparently these unfortunate hipsters were not aware that the style was on its way out (so 2013!), or maybe they were trying to get maximum benefit from their facial hair transplants. Certainly they were not East Asian, for who has ever met an East Asian man with the ability to grow a thick beard?

The embrace of facial hair by the hipster crowd has a historical precedent in the Victorian era, when full beards served as a symbol of masculinity and a stylistic corollary to the elaborate outfits and ornate home furnishings favored by fashionable contemporaries. Women clad themselves in long dresses with full skirts, bustles, and bodices, their hats topped with flowers, feathers, and, occasionally, entire stuffed birds. Men’s sartorial fashion was somewhat less extravagant, featuring neckties and waistcoats in rich fabrics like silk and brocade. At home, overstuffed sofas and armchairs, heavy drapes, and wall-to-wall carpets filled Victorian parlors. The preference for opulence even extended into bathrooms, which often contained luxuriant carpets and drapes, as well as ornamental wood cabinetry.

But in all of those folds of fabric and lush decorations lurked a hidden danger: germs. At the turn of the twentieth century, the leading causes of death were infectious and communicable diseases, especially tuberculosis, pneumonia, influenza, and diarrheal illnesses. Tuberculosis was particularly feared for the slow, painful death it induced in its victims; it consumed the body from the inside out, provoking a graveyard cough or “death rattle” in its final stages, when the patient’s gaunt appearance indicated that the end was near. In 1900, the disease was responsible for one out of every ten American deaths overall, and one out of every four among young adults. Physicians had been able to diagnose tuberculosis accurately since 1882, when German bacteriologist Robert Koch identified the microorganism that causes it. But this knowledge did nothing to improve a patient’s prognosis, for no cure existed. It wouldn’t be until after World War II, when antibiotics came into general use, that sufferers would finally have an effective remedy.

Koch’s discovery prefaced a new science of bacteriology. Toward the end of the nineteenth century, the lessons of the laboratory began to reach into American homes and public spaces, changing individual behaviors and cultural preferences. Spitting was a particular target of public health authorities. Tuberculosis-laden sputum could travel from the street into the home on women’s trailing skirts; once inside, it dried into deadly dust that imperiled vulnerable infants and children. In cities across the nation, concerned citizens urged women to shorten their hemlines to avoid dragging germs around on their clothing. “Don’t ever spit on any floor, be hopeful and cheerful, keep the window open,” read one pamphlet. The common communion cup, once a familiar sight in Protestant churches, disappeared as it became implicated in the spread of disease. Hoteliers instituted a practice of wrapping woolen blankets in extra-long sheets that were folded over on top; when a hotel guest departed, the sheet was laundered to remove any tubercular germs exhaled during slumber. Homeowners ripped out the overstuffed upholstery and heavy fabrics of their Victorian-era interiors, replacing them with metal and glass. In bathrooms, white porcelain tiles and the white china toilet supplanted carpeted walls and floors. Preferences shifted to materials that could be cleaned of dust and disinfected, slick surfaces where germs would be unable to gain a foothold.

The stripped-down, modern aesthetic extended to personal style, as well. Women’s hemlines grew shorter, their silhouettes more streamlined. Men began to shed their full beards and moustaches in favor of a clean-shaven look. In 1903, an editorial in Harper’s Weekly commented on the “passing of the beard,” noting that “the theory of science is that the beard is infected with the germs of tuberculosis.” Writing in the same magazine four years later, an observer remarked upon the “revolt against the whisker” that “has run like wild-fire over the land.” By the 1920s, the elaborate fashions of the Victorian era were nowhere in evidence. Picture, for instance, a flapper-era woman with a cropped hairstyle and a calf-length shift. Or the neatly trimmed moustaches of Teddy Roosevelt and William Howard Taft, the last two presidents to sport facial hair in their official portraits.

A century ago, men with full beards would have felt cultural pressure to shave to protect themselves and their families from the dangerous germs concealed within. It’s a sign of how much our understanding of bacteriology has changed that today’s hipsters harbor no such worries; indeed, few are probably even aware of the historical precedent of disease-laden facial hair. I was never a fan of the look to begin with, and now I can’t help thinking back to earlier fears of contagion whenever I see these beards. But short of a tuberculosis epidemic, which of course I don’t wish for, I’ll have to hope for some other imperative that will bring about a contemporary “revolt against the whisker.”



Nancy Tomes, The Gospel of Germs: Men, Women, and the Microbe in American Life. Harvard University Press, 1998.

HPV Testing and the History of the Pap Smear

Several weeks ago, the U.S. Food and Drug Administration approved the Cobas HPV test as a primary screening method for cervical cancer. It is the first alternative to the familiar Pap smear ever to be green-lighted by the agency, and that is big news. If gynecologists and other health practitioners adopt the FDA’s recommendations, it could change women’s experience of and relationship to cancer screening, a process we undergo throughout our adult lives. The HPV test probably won’t replace the Pap smear anytime soon, but it could pose a challenge to the diagnostic’s sixty-year standing as the undisputed first-line defense against cervical cancer.

The Cobas HPV test, manufactured by Roche, works by detecting fourteen high-risk strains of the human papillomavirus, including HPV 16 and HPV 18, the pair responsible for 70% of all cervical cancers. (The Centers for Disease Control and Prevention estimates that 90% of cervical cancers are caused by a strain of HPV.) If a patient tests positive for HPV 16 or 18, the new FDA guidelines recommend a colposcopy to check for cervical cell abnormalities. If she tests positive for one of the other twelve high-risk HPV strains, the recommended follow-up is a Pap smear to determine the need for a colposcopy. But critics fear that the new guidelines will lead to overtesting and unnecessary procedures, especially in younger women, many of whom have HPV but will clear the virus on their own within a year or two. Biopsies and colposcopies are more invasive, painful, and expensive than Pap testing, and might increase the risk of problems with fertility and pre-term labor down the road.


When George Papanicolaou began the experiments in the 1920s that would lead to the development of his namesake test, cervical cancer was by many accounts the most common cancer in women. It was also deadly. With no routine method to detect early-stage cancers, many patients weren’t diagnosed until the disease had already metastasized. Even for those who heeded the symptoms of irregular bleeding and discharge, medicine offered little by way of treatment or cure. As Joseph Colt Bloodgood, a prominent surgeon at Johns Hopkins Hospital, grimly observed in 1931, cervical cancer “is today predominantly a hopeless disease.”

Papanicolaou, a Greek-born zoologist and physician, spent his days studying the menstrual cycle of guinea pigs at Cornell University Medical College in New York City. Using a nasal speculum and a cotton swab, he extracted and examined cervical cells from the diminutive animals. Eventually he extended his work to “human females,” using his wife, Mary, as a research subject. He discovered that his technique allowed for the identification of abnormal, precancerous cells shed by the cervix. After a few false starts—his first presentation of his work was at a eugenics conference in 1928 and was panned by attendees—he went back to the lab, spending another decade on swabs and slides. By 1941 he had refined his method, and with a collaborator he published his results in a persuasive paper that was quickly embraced by colleagues. Thus was born the Pap smear.

The Pap smear is not an infallible diagnostic. It can’t distinguish between cells that will become invasive and those that will never spread outside the cervix. Results can be ambiguous and slides are sometimes misread. Nonetheless, the Pap smear was a breakthrough at the time because it detected precancerous changes in cervical cells. It upended the customary timeline of cervical cancer, pushing the clock back by enabling diagnosis of the disease at a stage when lesions could be treated with relative ease and success. Since its introduction, it has contributed to a remarkable reduction in American mortality from cervical cancer, from 44 per 100,000 in 1947 to 2.4 per 100,000 in 2010, a roughly eighteenfold decrease in just over sixty years.

When women in the U.S. die from cervical cancer today, it’s generally because they never had a Pap test, hadn’t had one within the past five years, or failed to follow up on abnormal results with appropriate treatment. The problem isn’t with the test itself; it’s with uneven access to screening and follow-up care. These are issues of class, geographic location, insurance status, and health literacy that the HPV test will do nothing to address. The Pap smear may not be perfect, but when utilized correctly it does a pretty good job of detecting cervical cancer. The FDA’s approval of the Cobas HPV test as a first-line defense and its new cervical cancer screening guidelines have the potential to subject millions of women to decades of invasive, expensive procedures, upending six decades of established practice for a protocol with no clear gains in effectiveness. And that is a very big deal.



Siddhartha Mukherjee, The Emperor of All Maladies: A Biography of Cancer. Scribner, 2010.

Ilana Löwy, A Woman’s Disease: The History of Cervical Cancer. Oxford University Press, 2011.

Monica J. Casper and Adele E. Clarke, “Making the Pap Smear into the ‘Right Tool’ for the Job: Cervical Cancer Screening in the USA, circa 1940-95,” Social Studies of Science 28 (1998): 255-90.

Joseph Colt Bloodgood, “Responsibility of the Medical Profession for Cancer Education, with Special Reference to Cancer of the Cervix,” American Journal of Cancer 15 (1931): 1577-85.

Statistics on cervical cancer from the National Cancer Institute at http://seer.cancer.gov/statfacts/html/cervix.html.