Photo taken from the United Nations Photo Flickr page.

My husband’s favorite disease is syphilis. The bacterium’s timeline, which is complicated and unpredictable, wows him. After the initial chancre, it can lie dormant for years before resurfacing as cognitive deficits or blindness. In its honor we keep syphilis memorabilia on the mantelpiece in our bedroom: a tin lunchbox with a stencil of a man in a gas mask and the block-lettered slogan “The Enemy is Syphilis,” and a plush pink spiral doll in the shape of the bacterium.

My husband’s not a creep. He’s an infectious disease doctor, a man who respects the contagions he encounters at work. Many of his patients have compromised immune systems and suffer illnesses that healthy people rarely get. He’s diagnosed fungal infections in the blood, the brain, the lungs. He often encounters toxoplasmosis, a parasite usually contracted from cat litter that can damage brain tissue. He’s treated patients with typhoid fever, malaria, and necrotizing fasciitis, a deadly condition caused by “flesh-eating” bacteria.

ID doctors are the detectives of the hospital, the people hospitalists consult when they’re stumped. Some diagnoses come easily; others require complex sleuthing: analyzing symptoms, lab tests, and personal histories while tracking possible exposures. My husband’s job is most interesting when someone is a challenge to diagnose. Health is the goal, but sickness is the fascination.

“By virtue of having bodies,” Eula Biss points out in her recent book On Immunity, humans are “dangerous.” Taking care of infectious individuals can be as much about limiting public risk as it is about recovery. When Barack Obama asked Congress for more than six billion dollars in emergency Ebola funding in November 2014, he emphasized the potential risk to Americans rather than the existing risk to West Africans. “Over the longer term,” he said, “my administration recognizes that the best way to prevent additional cases at home will be to contain and eliminate the epidemic at its source in Africa.”


Microbes that could cause a pandemic are everywhere, just a few mutations away from their deadly potential. The television series The Walking Dead and movies like Twelve Monkeys, 28 Days Later, and Contagion cash in on the hysteria epidemics can unleash. In the 1995 film Outbreak, a hemorrhagic fever spreads in Cedar Creek and the military puts the small town under quarantine, policing the perimeter with barricades and helicopters. Rumors start, and, justifiably, panic ensues. Doctors and nurses care for the sick, but, with no known cure, their efforts fail. The survival of thousands rests on the timely and heroic brilliance of two doctors, who happen to be ex-lovers with unresolved issues. If this pair can’t put aside their feelings and find an infected monkey that may carry the ingredients for an antidote, bureaucrats at the Pentagon will firebomb Cedar Creek. Everyone will die. General McClintock, who heads the committee that approves the massacre, rationalizes this decision as a necessary evil. The unsuspecting townspeople will be casualties in a war against disease, and the nation will be saved from a pandemic.

Outbreak is laughably bad, the kind of movie that’s better suited to a drinking game than to serious watching. As the clock ticks ominously along, communication between health officials and civilians breaks down. Fragile personalities delay epidemiological inquiry. Everyone wants someone to blame. Worst of all, big government’s “solution” to the infection is absurdly overblown, a decision based on blind fear and not science. If Outbreak were real, the International Court of Justice at The Hague would be conducting an investigation.

Still, as sensational as it is, Outbreak provides insight into the kinds of ethical questions public health officials, doctors, and patients must navigate. When it comes to managing infections, civil liberties often compete with community programs focused on the wellbeing of the masses. The kinds of choices most Americans take for granted—whether to disclose a medical condition, for example—can quickly vanish as medical knowledge and public fears shape new policies.

During disease outbreaks, one of the greatest threats to any population is a “super-spreader,” a host of a bacterium or virus who disproportionately passes contagions on to his or her contacts. In the early years of the AIDS epidemic, Gaëtan Dugas, an Air Canada flight attendant, was thought to be such a super-spreader. By his own estimate, Dugas had sex with over 2,500 men in North America. In a 1983 study by the Centers for Disease Control, 40 of the 248 North Americans who’d so far been diagnosed with AIDS had had intercourse with Dugas, or with another man who’d had intercourse with Dugas. Though later studies showed that Dugas’s role in the spread of the virus was likely overstated, his impact on the men he infected proved fatal in the early days of the epidemic. As a hypersexual gay man, he became an easy target for anger and fear, a place for worriers to throw their blame.

The idea of a super-spreader is troubling for many reasons. For one thing, sick individuals rarely know how contagious they are. Many people are non-spreaders; when they’re sick, they infect no one. Super-spreaders, in contrast, are thought to be responsible for 80 percent of new infections, which means a single person can seriously derail public health efforts. The reasons super-spreaders pass disease so effectively vary depending on the outbreak and the person. Dugas’s mobility and sexual activity made him an ideal vector for HIV transmission. Other super-spreaders simply shed more virus than most people, or don’t exhibit symptoms and therefore don’t know to take precautions. Super-spreaders can be difficult to locate. Public health officials often have to work backwards to find them, locating outbreaks and then tracing those outbreaks to a source.

During the 2003 SARS epidemic, most sick people infected no one. The survival of the virus relied on a few key people. A doctor from Guangdong Province went to a wedding in Hong Kong. Sixteen people at his hotel became ill. Another young man infected every doctor, nurse, and medical student who examined him, ultimately spreading SARS to over a hundred healthcare workers. An international flight attendant infected 160 people, some of whom were also super-spreaders. The disease spread from Hong Kong to Toronto, where 257 more people came down with the virus. A map of this progression would look a bit like a fireworks show, single lines of infection suddenly bursting.

The most famous, and most maligned, super-spreader is Mary Mallon, more commonly known as Typhoid Mary. Because typhoid is transmitted through infected food or water, Mallon, an Irish cook in New York City and a healthy carrier of the Salmonella typhi bacterium, was uniquely positioned to spread infection, which she did to an estimated 51 people, three of whom died. In 1907, when George Soper, a civil engineer, first told Mallon of her suspected role in the outbreak, she attacked him with a carving fork. Though Louis Pasteur had already established germ theory, many still believed that disease sprang from filth and miasmas, gases that rose from sewers into tenements. Immoral behavior, which social scientists linked to poverty, was another common explanation for illness. When the poor became sick, the assumption was often that they deserved it. Considering the prevalence of such disease theories, and Mallon’s own clean bill of health, it’s no surprise she found the accusation insulting.


A few months after her initial encounter with Soper, Mallon was forcibly removed to Riverside Hospital, on North Brother Island in the East River, where she lived in isolation from 1907 to 1910. In 1909 she sued the health department for holding her unfairly. “I am an innocent human being,” she argued. “I have committed no crime and I am treated like an outcast—a criminal. It is unjust, outrageous, uncivilized.” Though she lost her court case, a new health commissioner took pity and released her in 1910 on the condition that she no longer work as a professional cook. Her new job, as a laundress, was far less lucrative than what she was used to, and Mallon broke her promise to stay out of the food industry. In January 1915, when 25 people at Sloane Maternity Hospital in Manhattan contracted typhoid, she was discovered working in the kitchen under the pseudonym Mary Brown. Mallon was returned to North Brother Island, where she remained until her death in 1938.

It’s easy to empathize with both sides of Typhoid Mary’s story. On the one hand, she lived during a time when disease was poorly understood and discrimination against the poor, the Irish, and unmarried women was rampant. With no symptoms of typhoid, Mallon had trouble coming to terms with her role as a carrier. Judith Leavitt, author of Typhoid Mary, believes Mallon’s suspicion that she was targeted unfairly was well founded: no one else was quarantined for spreading the disease, though other healthy carriers of typhoid were identified. On the other hand, the health department’s job was to protect the public. Isolating her no doubt prevented infections, and probably deaths.

Mary Mallon’s story highlights the tension between civil liberties and disease control, a conflict central to public health. Health campaigns work best when individuals act voluntarily on behalf of a community, but making something voluntary means risking non-compliance. On the flip side, when public health officials mandate participation, civil liberties vanish, and though lives may be saved, quality of life diminishes. The eradication of smallpox, widely considered a medical triumph, was enabled in part by mandatory vaccinations. Healthcare workers trekked to remote communities to administer shots, and in some cases vaccinations were given at gunpoint. These measures were both coercive and successful. The disease, which is estimated to have killed over 300 million people in the 20th century, no longer exists outside of laboratories.

Eula Biss tells us that infectious diseases require us to think of the collective in addition to the individual. “The natural body meets the body politic in the act of vaccination,” she observes. “A single needle penetrates both.” If super-spreaders are the most dangerous vectors of disease, then vaccines are their opposites—a mostly low-risk solution that requires the participation of the masses. When enough people have immunity, an outbreak reaches a tipping point and declines so drastically that even the unvaccinated benefit. Today, some vaccinations are recommended while others are required; laws vary by state. Oftentimes, campaigns for optional vaccines focus not only on the health benefits for the person getting vaccinated but also on the benefits to neighbors. Advertisements like the flu-shot slogan telling consumers to “protect yourself and those around you” appeal both to our sense of self-protection and to our altruism.
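A rough sense of where that tipping point sits comes from a standard back-of-the-envelope formula (a textbook simplification, not anything Biss derives): if each sick person would, on average, infect some number of others in a fully susceptible population, call it the basic reproduction number, an outbreak stops growing once the immune fraction of the population exceeds about

\[ 1 - \frac{1}{R_0} \]

where \(R_0\) is that reproduction number. For measles, with an \(R_0\) commonly cited between 12 and 18, the threshold works out to roughly 92 to 94 percent, which is why even a modest dip in vaccination coverage matters.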


Much of the anger over this summer’s measles outbreak and its association with the decline in Measles, Mumps, and Rubella (MMR) vaccinations revolves around the assumption that anti-vaxxers are careless or selfish. If getting vaccinated is an act of altruism, then forgoing vaccinations, some assume, must be motivated by malice. In fact, studies show that fear of vaccinations is linked to distrust of the government and pharmaceutical companies. Disease presents a known threat. Vaccinations, on the other hand, are perceived as inscrutable, possessing unnamable and unknowable risks, or, in the case of the MMR vaccine and its rumored, now-disproven link to autism, the possibility of harm. Though the FDA, for example, states that the trace amounts of aluminum and formaldehyde in vaccines pose very little risk, the demographic most likely to forgo vaccinations—the white children of upper-middle-class, college-educated, married women—is unlikely to find comfort in government assurances. As Biss points out, “We resist vaccination in part because we want to rule ourselves.”

The anthropologist Emily Martin believes that our bodies function as metaphors and that these metaphors can reveal how we conceptualize the world. After surveying popular media during the peak of the AIDS crisis in the late 1980s, Martin noticed the proliferation of images of the body as a scene of institutional violence; the body was drawn either at war or as a highly efficient police state. White blood cells constituted the infantry while larger macrophages were compared to “armored units.” Killer T cells played the role of bombers, aiming lethal proteins at infected cells. Pathogens were portrayed as external enemies, “invaders” who were first detected then assaulted. Extermination was the goal. The immune system, she writes, “maintains a clear boundary between self and non self” and is “accompanied by a conception of the non self world as foreign and hostile.”

If a sick body is like a nation at war, then inoculations play a puzzling role. Depending on whom you ask, they can be either self or non self, invaders or members of the home team. For some, they are suspicious characters. Friends disguised as enemies. For others, they are enemies disguised as friends. They don’t neatly fit any category.

To carry the metaphor further, vaccinations are strangers. While most vaccinations, like most strangers, are benign, anti-vaxxers fear the potential violence they can unleash on the body. For many, this fear looms larger than the medical data supporting vaccination, larger than the risk of disease itself.

I got married a year ago at the height of the Ebola epidemic. At our wedding reception, my father gave a toast that was also a shout-out to my husband’s new occupation. He compared our love to a virus. “They’ve infected each other,” he said. “Look at their symptoms. Sweaty palms. Strange new behaviors. Shifts in their personalities. I’m afraid there’s no cure.” Later in the evening, he told several of our friends that he hoped they got sick, too.

Two days later, my husband’s hospital began preparing for the possibility of admitting Ebola patients. My husband is a fellow in a training program, not yet a doctor hired at full salary, and he and his colleagues were all asked whether they’d be willing to care for an Ebola patient if one arrived.

I told him I was surprised that doctors had the option not to treat certain patients.

“This is an unusual circumstance.”

“What do you want to do?” I asked.

“After all this training,” he said, “I think I should.”

My reaction was mixed. I was proud my husband had the ability and generosity to treat an Ebola patient, but it seemed unfair that two days after saying “I do,” I was supposed to consent to something that to me seemed quite dangerous. My concern for the public was in direct conflict with my desire to protect the man I loved.

“I don’t want you to say yes,” I told him, “but I don’t want you to say no, either.”

After Dr. Craig Spencer, the first New York City resident to contract Ebola, was released from Bellevue Hospital, he wrote an article detailing his own thoughts about his highly publicized illness. “I was labeled a fraud, a hipster, and a hero,” he observed. “I understand the fear that gripped the country after I fell ill, because I felt it on a personal level. People fear the unknown, and fear in measured doses can be therapeutic and inform rational responses, but in excess, it fosters poor decision making that can be harmful.”


The criticism Spencer received focused on the asymptomatic days after his return from Guinea, where he’d been treating Ebola patients, and before he developed a fever and submitted himself to medical care. “The whole country soon knew where I like to walk, eat, and unwind,” he wrote. “People excoriated me for going out in the city when I was symptomatic, but I hadn’t been symptomatic—just sad.” In fact, Spencer followed protocol meticulously. During his time in Guinea he tracked his risks in a journal and maintained minimal exposure to the disease. When he returned to the US, he checked his temperature twice daily. When his fever developed, he reported himself to the hospital.

Still, the media painted him neither as a humanitarian fighting Ebola overseas nor as a victim of disease. Instead, he was labeled a biological threat, a selfish and careless man who endangered an American city. Though Spencer didn’t pass Ebola to anyone—the virus, though deadly, is far less contagious than influenza or measles—the fact of his sickness was used to justify mandatory quarantines, policies that reflect fear more than science. “Instead of being welcomed as respected humanitarians,” Spencer scolds, “my US colleagues who have returned home from battling Ebola have been treated as pariahs. I believe we send the wrong message by imposing a 21-day waiting period before they can transition from public health hazard to hero.” Though the epidemic rages on in West Africa, “for many politicians,” he notes, “the current Ebola epidemic ended on November 4, 2014, the day of midterm elections.”

Fear isn’t useless. It’s essential that we maintain the tension between individual liberties and community health, and expressing fears, particularly competing fears, is one way that’s done. Public skepticism helps ensure decision-makers do enough to minimize disease without abusing power or diminishing civil liberties. Still, it’s vital to keep paranoia in check. The theorist Eve Sedgwick posits that, like typhoid and measles, paranoia is communicable. We pass it along to those we interact with. This happened when the media responded to Dr. Spencer’s Ebola infection and it continues to happen in the ongoing debate about childhood vaccinations and measles. Paranoia distorts decision-making, which is part of the reason that a hundred years after Mary Mallon’s isolation began on North Brother Island, we still struggle to conceptualize the relationship between community and individual health. When it comes to disease, whether we acknowledge it or not, we’re part of the public. The public isn’t an abstraction. It’s us.

Anya Groner

Anya Groner's essays, stories, and poems can be found in publications including The New York Times, The Atlantic, Ecotone, Meridian, VIDA, and The Oxford American. She teaches writing at the New Orleans Center for Creative Writing.
