Football and the Risk of Concussion

I’ve been thinking a lot about risk lately. In medicine and public health, it’s an idea that’s always present, usually invoked toward the goal of disease prevention. Over the years, the ways in which the concept of risk has been framed have changed as the major causes of mortality have shifted from infectious to chronic disease. In the eighteenth and nineteenth centuries, an epidemic of cholera or yellow fever might have been seen as a way to separate acceptable citizens from unacceptable ones, the distinction premised on some combination of ethnicity, race, religion, class, and moral principles. More recently, public health recommendations have focused on lifestyle practices that can reduce our risk of developing cancer, heart disease, and other chronic illnesses with multifactorial causes.


I’m interested in how we experience risk and how this shapes the decisions we make about what to eat, where to live, the types of behaviors we engage in and the situations we’re comfortable with. How does each of us choose to respond to a series of unknowns about, for instance, the dangers of genetically modified food, the possible link between cellphone radiation and cancer, or the relationship between pesticides and hormonal imbalances? If Alzheimer’s disease runs in your family, what do you do to decrease the chances you’ll develop it? If you’re diagnosed with a precancerous condition that may or may not become invasive, do you remove the suspicious cells immediately or wait to see if they spread? What does it mean for our bodies to be constantly at risk, under threat from sources both known and unknown that we cannot see or regulate?

In the upcoming months, I’ll be exploring these ideas and more in a series of essays on risk. My premise is that the ways in which we choose to deal with risk are fundamentally about control, and are aimed at maintaining the illusion that we have command over disease outcomes in a world ruled by randomness and unpredictability. Cancer screenings, lifestyle habits, and the other behaviors we adopt to stay healthy are an attempt to reduce our risk, to make the uncertain certain, to bring what’s unknown into the realm of the foreseeable. As a way of managing the future, this approach assumes a linearity of outcomes: if I engage in x behavior, then I will prevent y disease. It assumes that illness can be reduced to a series of inputs and corresponding outputs, that wellness is more than a game of chance or a spin of a roulette wheel. The boundaries of what we consider reasonable measures to embrace for the sake of our health will differ for each of us based on our individual tolerance for ambiguity and what we consider an acceptable level of risk. As I delve into an investigation of the relationship between risk and health, the underlying question I’ll be concerned with is this: what level of uncertainty can each of us live with, and how does it affect our behavior?

So here goes, my first essay in an ongoing series on risk.

With the Denver Broncos’ 24-10 victory over the Carolina Panthers in Super Bowl 50, the 2015 football season came to its much-hyped conclusion. I didn’t watch the game, but I have been following closely any public health news involving the National Football League. Just days before the Super Bowl, the family of Ken Stabler, a former NFL quarterback, announced that he had suffered from chronic traumatic encephalopathy (CTE), a degenerative brain disease that can trigger memory loss, erratic behavior, aggression, depression, and poor impulse control. The most prominent quarterback yet to be diagnosed, Stabler joins Junior Seau, Frank Gifford, Mike Webster, and over 100 former players found to have the disease, which is caused by repeated brain trauma and can only be confirmed after death by a physical examination of the brain. Retired NFL players suffer from numerous chronic injuries that affect their physical and mental well-being: in addition to the multiple concussions, there are torn ligaments, dislocated joints, and repeated broken bones that can no longer effectively be managed by cortisone injections and off-the-field treatments. Many athletes end up addicted to painkillers; some, like Seau, commit suicide or die from drug overdoses, isolated from family and friends. One particularly moving article in the New York Times profiled Willie Wood, a 79-year-old former safety for the Green Bay Packers who was part of the most memorable play of Super Bowl I, yet can no longer recall that he was in the NFL. And incidents of domestic abuse against the partners and spouses of players continue to make headlines, including the unforgettable video of Ray Rice knocking his then-fiancée unconscious in an elevator at an Atlantic City casino.

Despite these controversies, football remains enormously popular in the United States. Revenue for the NFL was $11 billion in 2014, and league commissioner Roger Goodell pocketed $34 million in compensation that year. The NFL has managed to spin the concussion issue in a way that paints the league as highly concerned about player safety. Goodell touts the 39 safety-related rules he has implemented during his tenure, and the settlement last fall in a class-action lawsuit brought by former players set up a compensation fund to cover certain medical expenses for retired athletes (although some criticized the deal because it doesn’t address symptoms of CTE in those who are still alive). Increasing awareness of the danger of concussions has prompted discussions about how to make the game safer for young athletes. One approach that’s been floated is to have players scrimmage and run drills without helmets and protective padding, forcing them to treat each other gently in practice while saving the vigorous tackles for game day. The Ivy League just agreed to eliminate full-contact hitting from all practices during the regular season, a policy that Dartmouth College adopted in 2010. And earlier this week, the NFL’s top health and safety official finally acknowledged the link between football and CTE after years of equivocating on the subject.

But controlled violence is such a central aspect of football that I wonder how much the sport and its culture can be altered without changing its underlying appeal. Would football be a profoundly different game with the adoption of protocols that reduce the likelihood of concussions and other injuries? How much room is there for change within the game that football has become? Players continue to get bigger and stronger, putting up impressive stats at younger and younger ages. My friend’s nephew, a standout high school player in Texas and a Division I college prospect, was 6’2” and weighed 220 pounds when he reported as a freshman for pre-season training, numbers that I imagine will only expand as he continues to train, and ones that players around him will have to match in order to remain competitive.

With mounting knowledge of the link between football and degenerative brain disease, I’m interested in the level of risk that’s acceptable in a sport where serious acute and chronic injuries are increasingly the norm. In a recent CNN town hall, Florida Senator Marco Rubio asserted that football teaches kids important life lessons about teamwork and fair play, and pointed out that there are risks inherent in plenty of activities we engage in, such as driving a car. True enough, but driving a car is an essential part of daily life for many of us, which means that we have little choice but to assume the associated risks. Football is voluntary. I realize that for some, football is less than completely voluntary, from children who face parental pressure to professional athletes who feel compelled to remain in the game because they’re supporting families or facing limited options outside the sport. Still, playing football is an assumed risk to a greater degree than driving or riding in a car, the dominant form of transportation in our suburbanized communities. And if risk reduction is about attempting to control for uncertainty, then the accumulating evidence about CTE and other severe injuries is sure to change the calculus of how parents and players assess participation in a sport where lifelong mental and physical disabilities are not just possible, but probable.

Johnson & Johnson’s Baby Powder: Harmless Household Product or Lethal Carcinogen?

On Monday night, a jury in St. Louis awarded $72 million to the family of a woman who died of ovarian cancer after using Johnson & Johnson’s baby powder and other talcum-based products for years. The verdict came after a three-week trial in which lawyers for the plaintiff, an Alabama woman named Jacqueline Fox, argued that Johnson & Johnson had known of the dangers of talcum powder since the 1980s and concealed the risks. The corporation’s lawyers countered that the safety of talcum powder was supported by decades of scientific evidence and that there was no direct proof that its products had caused Fox’s cancer.

Fox used Johnson & Johnson’s baby powder and another talc-based product called Shower to Shower for 35 years. “It just became second nature, like brushing your teeth,” her son said. “It’s a household name.” The company has come under fire in recent years from consumer safety groups for the use of questionable ingredients in its products, including formaldehyde and 1,4-dioxane, both of which are considered likely carcinogens. Fox’s case was the first to reach a monetary award among some 1,200 lawsuits pending nationally against the company.

The case bears a notable resemblance to the lawsuits against the tobacco companies, with attorneys for the plaintiff and the defendant each taking a page from the playbook of their respective predecessors. Fox’s lawyers claimed that Johnson & Johnson’s own medical consultants warned in internal documents of the risk of ovarian cancer from hygienic talc use, just as tobacco companies knew for decades that smoking caused lung cancer but sought to suppress the evidence. And the pharmaceutical giant responded as the tobacco industry did in the numerous lawsuits it faced in the 1980s and 1990s: by creating doubt about the mechanism of cancer causation and upholding the safety of its products.

I find this case uniquely disturbing because the image of Johnson & Johnson’s baby powder as a household product that evokes a sense of comfort and protection is so at odds with the jury’s finding that it caused or contributed to a case of fatal ovarian cancer. The company appears to be right in claiming that the scientific evidence is inconclusive: some studies have shown a slightly increased risk of ovarian cancer among women who use products containing talcum powder, while others have found no link. It’s important to note that until the 1970s talcum-based products contained asbestos, so people who used them before that time were exposed to a known carcinogen. Still, the research is unsettled enough that the American Cancer Society advises people who are concerned about talcum powder to avoid using it “[u]ntil more information is available.”

Without examining the trial transcripts or interviewing the jurors, it’s impossible to know for sure what factors influenced the verdict. I imagine the tobacco settlements have irrevocably changed the environment surrounding these types of lawsuits—that there’s a sizable segment of the American public which is understandably suspicious of large corporations trying to conceal research about the health risks of their products. I suspect there’s also an element of causation and blame at work here, about wanting to assign responsibility for a disease that remains, for the most part, perplexing and impenetrable. We all make choices that affect our health on a daily basis, from what kind of shampoo to use to what to eat for lunch, and we want assurance that the repercussions of the decisions we make with good intentions will be in our best interest. But as the unprecedented $72 million verdict shows, we have an immense uneasiness about the dangers lurking behind the most benign-seeming household products. And we fear that those products, rather than benefiting us, will instead do us harm.

The Coerciveness of Public Health

This morning I awoke to the news that Chris Christie, the governor of New Jersey, thinks parents should have a choice about whether to vaccinate their children. He has since backtracked on his statement and affirmed his support for vaccination. But as a measles outbreak spreads across California, Arizona, and twelve other states, it’s exposing the tension between personal autonomy and community well-being that’s an ever-present part of the doctrine of public health.


The current measles outbreak most likely started when a single infected individual visited Disneyland over the holidays, exposing thousands of vacationers to a highly communicable disease that the CDC declared eliminated from the U.S. in 2000. At another time—say, ten years ago—the outbreak might have been contained to a handful of cases. But as numerous media outlets have reported, immunization rates have been dropping in recent years, particularly in wealthy enclaves where parents still believe in the debunked link between vaccines and autism, aim for a toxin-free lifestyle, or distrust Big Pharma and the vaccine industrial complex.

I am young enough to have benefited from the scientific advances that led to widespread immunization in the 1970s, and old enough to have parents who both had measles as children and can recall the dread surrounding polio when they were growing up. Vaccines are a clear example of how public health is supposed to work. One of the unambiguous public health successes of the twentieth century, vaccines have transformed ailments such as pertussis, diphtheria, and chickenpox from fearsome childhood afflictions that could cause lifelong complications, and even death, to preventable diseases.

The basic premise of public health is the prevention of disease, and public health guidelines have led to increased life expectancy and decreased incidence of communicable illnesses, as well as some chronic ones. Yet public health regulations have always had to balance individual civil liberties with public safety. People are free to make their own choices, as long as they don’t infringe on the public good. For the most part you’re still allowed to smoke in your own home (although your neighbors could sue you for it), but you can’t subject me to your secondhand smoke in restaurants, bars, or office buildings.

I believe in handwashing, USDA inspections, the use of seatbelts, and the pasteurization of milk. I believe in quarantines when they are based on the best available information and are applied evenly. (A quarantine that isolates all travelers from West Africa who have symptoms of Ebola would be reasonable; one that singles out black Africans from anywhere on the continent regardless of health status would not.) In short, I am in favor of a coercive public health apparatus. The problem with the current measles outbreak is that enforcement has become too lax, with too many states allowing parents to opt out of immunizing their children because of ill-conceived beliefs that are incompatible with the public good.

Every parent spends a lifetime making choices about how to raise their child, from environment and lifestyle to moral and ethical guidance. But some choices have a greater capacity to impact the lives of others. If you want to let your child run around with scissors, watch R-rated movies, and eat nothing but pork rinds all day, you can. If you want to home-school your child because you want greater control over the curriculum he or she is being taught, you’re free to do that, too. And if you want to keep your child from getting vaccinated against communicable diseases, then the state won’t step in to force you. Opting out of vaccinations might not make you a bad parent any more than raising a fried-snack fiend might. But unless you’re planning to spend your days in physical isolation from every other human on the planet, it does make you a bad member of the public.