Sunday, April 26, 2020

Anatomists of Melancholy in the Age of Coronavirus


Anne Case and Angus Deaton diagnose the deadly despair that arises from the lack of a college degree. The current crisis only exacerbates the problem.

By Spencer Lee-Lenfield 


Before 2015, few people would have thought of not finishing college as a public-health issue. That changed because of research done by Anne Case and Angus Deaton, economists at Princeton who are also married. For the past six years, they have been collaboratively researching an alarming long-term increase in what they call "deaths of despair" — suicides, drug overdoses, and alcoholism-related illnesses — among middle-aged white non-Hispanic Americans without a bachelor’s degree.

Change any one of those attributes (race, nationality, education), and the trend disappears. Mortality has not increased among white Americans with a bachelor’s degree, nor among American people of color, nor among non-Americans without a bachelor’s degree. (Indeed, all-cause mortality among those groups has continued to go down, as usual.) Something about not having a bachelor’s degree in America, especially when white, can be deadly.

The term "deaths of despair" has taken on a life of its own, becoming ubiquitous in newspapers, magazines, and op-eds. It has been the subject of think-tank panels, conferences, and even government inquiry. "America Will Struggle After Coronavirus. These Charts Show Why," proclaims a New York Times article that visualizes some of their research. This past fall, Congress’s Joint Economic Committee issued its own report on "Long-Term Trends in Deaths of Despair." 

Case and Deaton’s new book, Deaths of Despair and the Future of Capitalism (Princeton University Press), takes their message even further. Capitalism itself, they argue, needs serious reform if it is to make good on its potential to improve the lives of all Americans. In particular, as Case pointedly observed in a lecture last year at Stanford University, "We don’t think [American capitalism] is working for people without a four-year college degree — and that’s two-thirds of Americans between the ages of 25 and 64." The coronavirus outbreak, the dire economic forecast, the millions of newly unemployed — all of these recent events raise the stakes of their research. 

What difference does education make to a life? 

It was an accident of government record-keeping that first allowed Case and Deaton to recognize that change in mortality among white Americans in middle age could be accounted for almost entirely by education level. Since 1989, U.S. death certificates have collected information on the highest level of education attained by the deceased. Mortality from all causes among middle-aged white Americans appeared steady at first, but then, they realized, the fates of the more- and less-educated were actually diverging. When they separated the groups, they saw that white middle-aged college graduates’ mortality had dropped 40 percent from 1990 to 2017, while those without a college degree became 25 percent more likely to die in middle age. The result: By 2017, those who hadn’t finished or never went to a four-year college were four times as likely to die between age 45 and 54 as those who did. 

It’s too simple to say that college inoculates people against deaths of despair, and Case and Deaton do not say so. But their research on this mortality gap does raise the stakes on postsecondary education. If college helps produce not just creative and flexible thinking or higher lifetime earnings, but also greater average health and happiness for those who graduate, what should the role of college in American life look like? Should college enrollments grow ever higher, producing as many graduates as possible? Or should we instead focus on producing better job training and opportunities for people without a college degree? Now that the pandemic has once again plunged the world into an economic recession, these questions grow more urgent and less likely to be addressed anytime soon, while the gap between the experiences of those with and without undergraduate degrees continues to widen in the United States.

Consider the difference education has made in the lives of Case and Deaton, which they discussed recently in their tidy, spare offices at Princeton, just down the hall from each other. Case grew up in Binghamton, N.Y., where her mother (a teacher) and her father (an engineer) had moved, partly because of the quality of the area’s schools. They were strict Irish-American Catholics who supported the civil-rights movement and social justice in general. They also believed strongly in the importance of education, celebrated their four daughters’ good grades with spaghetti dinners out, and always presumed that their children would go to college. 

Only after leaving home for the State University of New York at Albany, where she graduated first in her class, did Case feel her life had truly begun. She realized it was campuses she loved, and so she stayed: first by going to Princeton for a public-policy degree and then a Ph.D. in economics. She did not, in her words, "want to make money for rich people"; she wanted to help ordinary people "keep body and soul together." Despite the boot camp-like, ego-flattening asperities of early-stage graduate training, as well as raw sexism from faculty members, her love for research and teaching led her to turn down a chance to work for the World Bank, and her career took her to Harvard and then back to Princeton. 

Deaton was born in a Scottish town south of Edinburgh just months after the end of the Second World War. His father came from a coal town where schools served the mines and only one child a year received much education beyond age 12. Deaton’s father never really learned how to write, which led him to covet a solid education for his children. By a small miracle, he persuaded local schoolteachers to coach Deaton, free of charge, in advance of an exam that won him a scholarship to Fettes College, an Edinburgh boarding school. From there, Deaton went to the University of Cambridge, where he studied economics and earned a Ph.D. He taught at the University of Bristol and then Princeton, eventually winning a Nobel Prize.

Deaton is very tall, and very large; Case is also tall, but seems almost wispy next to him. (At one point in the book, they describe themselves as being "beyond obese" and "on the cusp of being underweight," respectively.) His suit jackets billow about him like cloaks, and he has a fondness for bow ties, a number of which he inherited from his mentor, the Cambridge economist Richard Stone. Both love fly-fishing (Deaton taught Case and calls her the better angler); for 20 years, they have spent as much of the summer as they can in the Montana town of Varney Bridge, known for its excellent rivers, where the research leading to their current book unexpectedly began. 

Case keeps her posture very upright; like many of her subjects, she suffers from sciatica, a chronic back condition that causes intense pain in certain positions. In the summer of 2014, Case wondered about others’ experiences of chronic pain, so for vacation reading she brought along decades of data collected by the National Health Interview Survey, an annual study of medical problems in 35,000 households across the United States. It turned out that, year after year, Americans kept experiencing more and more pain in middle age. In fact, middle-aged Americans were reporting more pain than the elderly. 

Education is at once a potential cause of and partial remedy for deaths of despair. 

As Case pondered this mysterious increase in midlife pain at a table at the front of their cabin, Deaton sat in a comfortable armchair near the back, working on a presentation of their findings. The behavioral economist Daniel Kahneman had for some time encouraged them to think seriously about happiness as a topic of research, and they were now sifting through Gallup data to see if there were any connection between states’ average happiness and suicide rates. (Conclusion: no.) 

It has been well established that suicide rates are generally low among black and Latinx Americans in comparison with whites, and also decline steadily with age. As Deaton looked through the data, he found an odd parallel with Case’s pain statistics: In contrast to minorities, suicide rates among white Americans were peaking in middle age. They decided they should compare the suicide rate among middle-aged white Americans to mortality rates from all causes — cancer, heart disease, everything else — in the same age group, expecting to find a decrease in mortality over time. So they downloaded a huge data set from the Centers for Disease Control and Prevention, ran the numbers, and arrived at the anomaly that would shape the next chapter of their careers. 

Before their research, one of the best-established trends in the social sciences had been that in the modern world, mortality rates always go down. That trend held in any large population over time, regardless of country, gender, or any other variable, with rare, brief exceptions like the 1918 influenza pandemic, the AIDS crisis, and the immediate aftermath of the Soviet Union’s dissolution. Finding other counterexamples to this pattern was supposed to be impossible, so when Case and Deaton did, in 2015, they were at first convinced they had made a mistake: surely no one could have missed such a huge number of unexpected deaths. They estimated that since the late 1990s, some 600,000 people had died who would otherwise have lived longer, healthier lives. Five years on, the mortality trend shows no sign of reversing.

Shock and consternation greeted Case and Deaton’s first paper on these deaths, published in 2015 in the Proceedings of the National Academy of Sciences. When they presented it, Deaton recalls, "The reaction was always the same: People’s jaws just dropped. And no one could say, ‘Well, you made this mistake.’" The Dartmouth economists Ellen Meara and Jonathan Skinner wrote a dismayed comment alongside Case and Deaton’s article: "It is remarkable that it took more than a decade to bring this reversal to the attention of the scientific community." 

Fellow social scientists occasionally brought up issues of method or interpretation, but few disputed the basic finding. The Harvard economist David Cutler told The New York Times that deaths the profession had previously written off as blips due to the opioid epidemic seemed, in light of the article, "more like incoming missiles." At that point, the commentary about the political implications of their research often struck a tone of unhurried, if sincere, concern — the 2016 primaries were still months away. 

After Donald Trump’s election, Case and Deaton’s research took on a new significance. In March 2017 they published a much longer paper, "Mortality and Morbidity in the 21st Century," that looked for potential causes of deaths of despair. They rejected the idea that despair was simply due to poverty, as well as the idea that income inequality was to blame. Rather, they suspected, different kinds of setbacks — declining wages, the loss of jobs requiring only a high-school education, family breakdown, weak social bonds, frequent illness — had piled on top of one another as far back as the early 1980s, inflicting more pain on each successive generation of white Americans who didn’t finish college.

Political commentators seized anew upon "deaths of despair" — the very words struck a long, resounding chord. Their idea seemed to lend empirical support to a narrative of political backlash by less-educated white Americans against minorities and elites. "Americans are dying ‘deaths of despair.’ Will Trump help?" asked The Washington Post’s editorial board. "This may help explain Trump, according to economists studying mortality," posited The Huffington Post. (Neither Case nor Deaton claimed their work had anything to do with Trump.) Others projected racial motivations onto their work, contending they were peddling narratives about white decline tailored to draw media attention. "Why is the story more dramatic or attractive when it’s about white people?" asked the journalist Malcolm Harris in Pacific Standard. "In 2017 the narrative that sells involves white workers who are unemployed, suffering, ignored, dying." 

In an early 2018 working paper published by the National Bureau of Economic Research, Christopher J. Ruhm, a professor of economics and public policy at the University of Virginia, suggested that "the ‘deaths of despair’ framing, while provocative, is unlikely to explain the main sources of the fatal drug epidemic." He continued: "The fatal overdose epidemic is likely to primarily reflect drug problems rather than deaths of despair." Case and Deaton responded that they had explicitly tested and rejected his hypothesis, and that in their work "despair" was meant to be "a label, not an explanation." 

As the phrase "deaths of despair" became more popular, Case and Deaton found themselves frustrated by a number of recurrent misunderstandings. Critics often seemed to them to be replying to points entirely different from the ones they intended to make. Regarding race, they readily agree that the still-higher black mortality rate deserves every bit as much attention and concern as the rate among the whites in their study. (Mortality among Latinx and Asian Americans is lower than that among whites.) Others incorrectly said their work focused on rural areas or particular regions of the country, perhaps conflating it with popular books like J.D. Vance’s memoir Hillbilly Elegy, when in reality deaths of despair were hitting metropolitan areas just as hard.

Lastly, they did not think the underlying problem was simply poverty. Money alone, in the form of aid or wages, would not fix matters, although they did believe higher wages were a necessary step toward a remedy. More and more, they found themselves turning to sociology — Durkheim’s Suicide, Andrew Cherlin and Sara McLanahan’s work on family dynamics, Robert Putnam’s Bowling Alone — for insight. In this light, problems like a lack of social connection, a sense of purposelessness in one’s work, and doubt that the future would improve underlay the mortality trends. Unemployment, income inequality, and other ills might be part of that problem, but the center of their story had broadened into something like the slow unraveling of a way of life. 

Education is only one of the three central categories defining the main population of concern in Case and Deaton’s work. But education is at once a potential cause of and partial remedy for deaths of despair in ways that race and nationality, being fixed, are not. 

The record unemployment caused by the Covid-19 epidemic has only intensified anxieties about the economic vulnerability of American workers without B.A.s. "Less-educated Americans are either essential, which puts their lives at risk, or nonessential, which puts their livelihoods at risk," noted Deaton in a recent online talk, emphasizing that social distancing is harder to maintain the less money one has. Case said in a recent New York Times roundtable, "Eventually, when the time comes for people to go back to work, I worry that some large fraction of working-class people won’t have work to go back to." 

She and Deaton argue in Deaths of Despair and the Future of Capitalism that capitalism requires serious reform if it is to make good on its potential to improve the lives of all Americans rather than just a privileged few. Education, particularly postsecondary education, plays an important, recurring role: Neither savior nor villain, it offers all manner of protections to those who have it, yet leaves behind those who don’t. 

"The four-year college degree is increasingly dividing America," they write in the book. "A four-year degree has become the key marker of social status, as if there were a requirement for nongraduates to wear a circular scarlet badge bearing the letters BA crossed through by a diagonal red line." Mortality from deaths of despair since 1995 continues to decline among college graduates across race and gender, but nongraduates have not seen a similar decrease. Drinking is more common among graduates, but binge-drinking more common among nongraduates. The incidence of debilitating physical and mental pain has grown since the late 1990s among young and middle-aged white nongraduates, while graduates are one-third less likely than the general population to experience chronic pain. 

In the late 1970s, college graduates made about 40 percent more than those with just a high-school diploma; by 2000, that gap had doubled. It’s not awful that college graduates make more money; after all, people should have incentives to pursue education, Case and Deaton reason. And if the only trend were college graduates’ rising earnings, the situation would not be so bad. But on top of that, the earnings of people who did not graduate from college have been declining. Moreover, the pay gap has not proven an effective incentive to finish college: College-enrollment and college-completion rates have not followed the swelling difference in earnings. Roughly two-thirds of Americans over 25 do not have a bachelor’s degree: That is a lot of scarlet badges. 

Case and Deaton’s account of the employment outlook for those with a high-school education or less is grim. In the past, an employee might have worked from the mailroom or janitorial closet upward through on-the-job training. Today, companies offer fewer opportunities for internal advancement. Case and Deaton scathingly denounce the practice of domestic outsourcing — replacing roles like security guards by contracting with big security firms that can offer cheaper (because non-unionized) labor. They also lament stagnant wages and lay much blame upon the American health-care industry — the book’s principal villain. 

Health care soaks up money that could have gone to wages — nearly $11,000 per worker — and consumes 18 percent of the U.S. economy (Switzerland, at 12 percent, is a distant second), yet offers worse health outcomes per dollar than every other rich country’s health system. Deaton calls it "a very expensive system that’s not delivering much." Estimated waste in health care totals around $1 trillion per year. The American health-care system also conditions employees to feel grateful for securing health care at all. "Economists would say they’re ‘pinning [employees] to their participation constraint,’" Case notes with disapproving irony, using air quotes, "paying them as little as possible to get them to continue to show up." 

In the midst of the pandemic, with mass unemployment jeopardizing coverage for many as a disease ravages the country, this state of affairs seems particularly explosive. Deaton thinks that the pandemic will result in either a "hero scenario" or a "villain scenario" for the health-care industry. In the hero scenario, pharmaceutical companies find a vaccine quickly and make it available widely and cheaply while doctors and nurses work bravely; insurers waive copays and deductibles, and they or the government provide free testing. In the villain scenario, drugs are expensive and rationed by price, ordinary people with infections are crushed under massive unexpected medical debt, and insurance premiums skyrocket in coming years. 

"Now there are people who are going to face tens of thousands of dollars of medical bills for having been put on a ventilator," Case noted at the New York Times roundtable. "There are predictions that health-insurance premiums could rise by as much as 40 percent. I think something is going to break. And when it breaks, we may think about real reform." 

“We do not accept the basic premise that people are useless to the economy unless they have a bachelor's degree.” 

Among the difficult questions raised by Case and Deaton’s research are these: Should we try to send as many Americans to college as possible (potentially changing the point of higher education in the process)? Should we try to build more and better careers and futures that don’t require college? And do most of the jobs that require a degree do so out of necessity, or merely to winnow an applicant pool? 

Case and Deaton are not specialists in the economics of American higher education. They are wary of speculating, and quick to defer to the expertise of others. "I’ve written two books now in which I came out at the end of the book thinking I need to know more about education than I do know, and I don’t think either of us are really specialized in that … Your readers are going to know a lot more than we know!" Deaton admits. They didn’t start out working in education, but as they wrote the book, the topic of education became hard to avoid, even as, Case admits, "we’re on thin ice here." 

That said, they advance a few general beliefs. Case worries that, at present, most high schools focus too much on providing an education to students headed to college, with curricula for everyone else merely a watered-down form of college prep: "We need to be doing more up through age 18 for people who are not college-bound, so that we don’t end up with a bimodal society." She continues: "I’ve flip-flopped as we’ve written the book on whether I think that everyone who can should go to college."

She started out thinking that was ridiculous. "Not everyone has to go to college. We just need to rewrite the social contract between capital and labor in a way that gives people who haven’t been to college a say in their workplace and dignity in their workplace and a living wage." But, at the same time, she thinks, the education system needs to change in a serious way, and that "will be a really heavy lift." 

Case and Deaton write jointly in their book about the attractiveness of a German model that gives greater resources to more flexible forms of vocational training. Deaton finds it puzzling that the model hasn’t yet caught on in the United States: "People have been saying for years that the German system would be good here … but somehow it never happens. And I don’t know what the institutional blocks to that are." 

They note that at Montana State University at Bozeman, near where they spend their summers, the economics major includes agricultural aspects — "Students are learning economics, but they’re learning economics that’s going to help them when they’re out working," Case says. Deaton adds, "If everybody went to college, it would make sense that you’d have to have a much wider range of colleges than you have now. Many of us in Britain were upset when they made technical colleges into universities because they turned very good technical schools into bad liberal-arts universities." 

Throughout their book, they return with skepticism to the idea of meritocracy. Deaton notes that for much of the 20th century, people like himself, who came from humble backgrounds, had opportunities to do amazing things with their lives, things their parents might not have thought possible. "Equality of opportunity" rings hollow in his ears now. The people who use it as a rallying cry tend to do so as a justification: "‘Well, if there’s equality of opportunity, we don’t have to worry about the people who don’t succeed because they had their chance.’ And that’s not a satisfactory answer." 

Case and Deaton often cite the British sociologist Michael Young, best known for coining the term "meritocracy" in 1958 — which, they point out, was originally meant to describe a kind of dystopia. American life at present is, Deaton says, "the worst of Michael Young’s predictions in some sense. We’ve created an underclass, or what he called ‘the populists’ — and the rest of us he called ‘the hypocrisy.’" 

"We would like to see a world in which everyone who can benefit from going to college, and wants to go to college, should be able to do so," they write. "But we do not accept the basic premise that people are useless to the economy unless they have a bachelor’s degree. And we certainly do not think that those who do not get one should be somehow disrespected or treated as second-class citizens." 

"If people who get a B.A. think, ‘We’re the winners, and if we’re the winners, they’re the losers,’ that’s a really destructive setup," Case says. Deaton mentions the work of Kwame Anthony Appiah, who argues that too often "winners" are all on a single track, whereas different tracks would allow several ways to excel. "The person who decides to be an engraver," Deaton adds, "doesn’t have to be seen as a failed mathematician." 

In other words, people should not have to feel like losers in the great social lottery for choosing the level of education appropriate to their own sense of their abilities. Income is important, but only part of the social problem. It is not material envy, but rather a sense of being mocked or blamed for one’s own life outcomes that Case and Deaton think makes the mix of pain or illness, low pay, and lack of a college degree so dangerous for individuals and so corrosive to society. The salves for amorphous "despair" may be just as slippery as despair itself: belonging, dignity, meaning. 

Correction (4/19/2020, 11:25 a.m.): This article originally said that Angus Deaton did not receive a Ph.D. In fact, he did. The text has been corrected accordingly. 

Spencer Lee-Lenfield is an assistant editor at The Yale Review, contributing editor at Harvard Magazine, and doctoral student in comparative literature at Yale University.


Saturday, April 4, 2020

New Pathogen, Old Politics



We should be wary of simplistic uses of history, but we can learn from the logic of social responses. 

ALEX DE WAAL  Boston Review

There is a saying among epidemiologists: “If you’ve seen one pandemic, you’ve seen one pandemic.” Echoing this trade wisdom in an interview two weeks ago, Bruce Aylward, an assistant director-general of the World Health Organization (WHO), pointed out that each new pandemic follows its own logic, and those who rely on past experiences to draw conclusions for public health will make mistakes. With each new pandemic it is tempting to scour history books for parallels and lessons learned. But as many have stressed, the wisdom to be gained is often greatly exaggerated.

Still, it is possible to steer a course between the Scylla of historical blindness and the Charybdis of hasty generalization. In her book about the era of the Black Death of 1348, A Distant Mirror (1978), the historian Barbara Tuchman confines her remarks on the present to a few oblique lines in the preface. “If one insists upon a lesson from history,” she writes, it is, as the French medievalist Edouard Perroy contended, that “Certain ways of behavior, certain reactions against fate, throw mutual light upon each other.” My working premise is that although the pathogen may be new, the logic of social response is not, and it is here that we can see historical continuities. An especially telling case study—still an object of fascination and controversy among historians of health and disease—is the devastating outbreak of cholera in Hamburg at the end of the nineteenth century, the subject of Richard Evans’s superbly researched book, Death in Hamburg (1987). 

With each new pandemic it is tempting to scour history books for parallels and lessons learned. An especially telling case study is the devastating outbreak of cholera in Hamburg at the end of the nineteenth century. 

On the morning of August 24, 1892, Robert Koch arrived at Hamburg railway station from his laboratory in Berlin. Germany’s most famous medical scientist, he was already credited with discovering the anthrax disease cycle and the bacillus that causes tuberculosis. In the 1880s he had traveled to Egypt and India, where he succeeded in isolating the bacterium responsible for cholera, and on his return to Berlin he was fêted by Kaiser Wilhelm, invested with the Order of the Crown, and put in charge of protecting the empire from epidemics of infectious diseases. 

Nine days before Koch’s train arrived in Hamburg, a doctor in the neighboring town of Altona had been called to see a stricken construction worker, whose job included inspecting the sewage works. He was suffering from acute vomiting and diarrhea; the diagnosis was cholera. In the first sign of the lethal controversy that was just beginning to erupt, the physician’s superior medical officer refused to accept the diagnosis. From August 16 to 23 the daily count of cases grew exponentially to more than 300; over the following six weeks some 8,600 residents of Hamburg perished. Like a forest fire racing through dry tinder, the epidemic burned itself out in October, an ending helped by the efforts of Koch and his team. 

As we know now, those deaths were totally preventable. The immediate cause of death was Vibrio cholerae, but the city authorities were accomplices to mass mortality, having long resisted spending public money on public health and fearing that a declaration of cholera—with the quarantine and isolation sure to follow—would bring their trading city to a halt. In Altona, just outside Hamburg’s jurisdiction, there were few infections; in Hamburg’s sister port of Bremen, a self-administering former Hanseatic League city-state, there were just six cases, half of them recent arrivals from Hamburg. Hamburg suffered alone that year.

In their pitch and consequence these events have the narrative structure and moral tensions of a theatrical tragedy. Besides the cholera vibrio itself, which takes the shape of a comma (like its typographical counterpart, potentially catastrophic if inserted at a crucial juncture), the dramatis personae are Koch, the chemist and hygienist Max von Pettenkofer, the physician-anthropologist Rudolf Virchow, and a chorus of the afflicted themselves and some of their revolutionary spokesmen. There are five subplots. Science contends with superstition and fatalism; the new germ theory of disease disputes with so-called ecological or local conjunctural theories; militarized centralizing bureaucracy spars with liberal capitalism; the anthropocentric “epidemic narrative” that promises a return to the safety of life-as-normal wrestles with the logic of evolution operating on different timescales, from the microscopic to the macro-ecological; and last, an open, democratic society questions its limits. 

As we will see, some of what is old is new again.

Cholera: the nineteenth century’s most fearsome pandemic 

Until the early nineteenth century, cholera was endemic to the Ganges Delta in Bengal, but it appears not to have been found elsewhere. The causative bacillus lives in warm water and multiplies in the human intestine, transmitted by fecal contamination. That was cholera’s macro-ecology: all it needed was to survive in just a few shallow wells during each dry season, with every annual flood spreading the germ far and wide. 

Epidemics are inflection points in evolution across different scales, from the microbial to the planetary. The post-pandemic world is a changed ecosystem. 

Along with the great famine of the 1770s, one of the lethal gifts of the English East India Company was opening up routes whereby cholera could spread far more widely, colonizing new places as a kind of biological blowback. British investment in widespread irrigation to grow cotton created the perfect ecology in which the vibrio could find multiple local reserves—irrigation ditches and canals, reservoirs, wells, water tanks—and become endemic. In 1854, the English physician John Snow elegantly demonstrated that the infection was water borne. He showed this through an epidemiological study still heralded in textbooks today: after painstakingly plotting cases on a London street map, he asked each affected household where they obtained their drinking water, tracing the source to a single contaminated pump on Broad Street.

According to legend, Snow asked the local alderman to remove the handle on the pump, and new cases promptly ceased. In fact, as Snow himself admitted, the epidemic was already subsiding by that time, but he had made his point: the dominant “miasma” explanation—that the disease was caused by locally generated impure air—had a competitor theory that had the virtues of being simple and provable. In the same year that Snow was mapping the outbreak, the Florentine microbiologist Filippo Pacini described the bacillus, which he had extracted from the autopsies of victims. But Pacini had no powerful political apparatus behind him to endorse and broadcast his breakthrough, and medical studies were not sufficiently systematic for the correct conclusion to be drawn. Thus the paradigm shift was not automatic. Rather, advocates of the miasma theory refined their arguments, contending that complicated local interactions of soil, air, and personal characteristics accounted for the vagaries of the disease. Prominent among the exponents of these views was the indefatigable chemist, hygienist, and health reformer Max von Pettenkofer, whom we shall encounter in Hamburg shortly.

Cholera first reached Europe in 1830, causing mass mortality, panic, and unrest. The vibrio produces particularly nasty symptoms in its human host: once it enters the intestine, its ideal micro-ecology, it multiplies exponentially and drives out the resident microbiota within just a few hours. The stricken body loses control of its functions, lapses into fits of uncontrollable vomiting, diarrhea, and muscle spasms, and turns blue and bloated. Catastrophic dehydration then causes death in about half of those infected. 

Cholera first reached Europe in 1830, causing mass mortality, panic, and unrest. For the emergent bourgeoisie in Europe, the manner of cholera’s attack was no less terrifying than the prospect of mortality. 

For the emergent bourgeoisie in Europe, the manner of cholera’s attack was no less terrifying than the prospect of mortality: an individual could be stricken at dinner, or in a tramcar, causing revulsion and terror among his or her companions. Just as disturbing to the authorities were “cholera riots” in which peasants and the inhabitants of the newly expanding, grossly unsanitary industrial cities attacked landlords, city authorities, and in some cases physicians, accusing them of using the disease as a pretext for driving them out of their homes and seizing their property. Sometimes the poor even blamed the rich for having introduced the disease for that very purpose. 

Subsequent cholera pandemics coincided with the 1848 uprisings throughout Europe—with localized outbreaks for a decade, including the one that prompted Snow’s investigation—and the wars of the 1870s. In 1891 famine struck Russia, prompting a wave of westward migration by hundreds of thousands of people one or two economic steps up from the starving peasantry, and the vibrio traveled with them. Those tired, poor, huddled masses dreamed of America, and the Hamburg-America shipping company was the most-traveled route to the New World. The German health authorities registered the cases as the migrants moved; many were stopped at the border, but some passed through undetected. The epidemic warning lights were blinking red. 



Medical and ecological controversies, then and now 

Cholera is a pantomime villain in this drama: stealthy, sudden, and lethal. At the time of the Hamburg epidemic, there was still much controversy about its etiology. Was it a contaminant invader? Did it emerge when there was a special configuration of local conditions? Nearly forty years after Snow and Pacini, and eight years after Koch isolated the vibrio, there still wasn’t medical unanimity. Hamburg was to change that.

The cholera controversies of the 1880s and 1890s were the first conducted under the dawning light of the new microbiology. 

Scientific method was itself developing alongside medical discoveries, and Koch was in the vanguard of both. “Koch’s postulates,” as we now call them, were criteria for determining whether the agent of a disease had indeed been correctly identified. According to the postulates, the microbiologist had first to identify the suspected microbe in all infected individuals; second, to grow it in culture; third, to use the cultured microbe to infect an experimental animal and observe it sicken with similar symptoms; and finally, to isolate the same microbe from the sick or deceased animal. The experiment had to be repeatable. Ironically, Koch’s identification of the cholera vibrio did not fulfill his own criteria; despite his best efforts he could not induce cholera in an animal host. It only affects humans. There were also plenty of who, why, and where questions left unanswered about outbreaks—enough material for skeptics to make the case that the germ theory was, at minimum, incomplete.

The cholera controversies of the 1880s and 1890s were, nonetheless, the first conducted under the dawning light of the new microbiology. So-called “anti-contagionists” and “localists” argued that there surely had to be other conducive factors such as the weather, the soil, or the temperament of the individual patient. Radicals asked, why was it that the proletariat were always hardest hit? (Studies of disease patterns show that this wasn’t always the case, but it was true often enough to serve as grist for social reform agendas.) 

In the case of the COVID-19 coronavirus today, the mysteries are fewer, the scientific method is more robust, and the speed with which controversies are resolved is many times faster. The lapse between identifying a new disease and knowing its pathogen is closer to five days than five decades. The coronavirus was isolated within a few days of the first cases and its entire genome was sequenced and available online two weeks later. We have the benefits of testing and tracing and massive computational power in charting epidemiological scenarios. Still, much remains uncertain, and epidemiologists continue to revise their understanding of the case fatality rate and vulnerability factors. We do not know whether COVID-19 will infect 20 percent, 40 percent or 70 percent of the population. It is important to parse our ignorance, separating out what risk is calculable now, what risk will be calculated when we have better data, and what is profoundly uncertain because it cannot be captured by data gathering. 
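
To make that 20/40/70 percent range concrete, here is a minimal, hypothetical sketch — a toy textbook calculation, not the Imperial College model or anything drawn from the sources discussed here — of the standard SIR "final size" relation, which shows how sensitively the eventual attack rate depends on the assumed reproduction number:

```python
# Toy illustration only (assumed numbers, homogeneous mixing, no interventions):
# in the simplest SIR model, the fraction of the population eventually infected
# ("final size" z) depends on the basic reproduction number R0 through the
# implicit equation z = 1 - exp(-R0 * z).
import math

def final_size(r0: float, tol: float = 1e-10) -> float:
    """Solve z = 1 - exp(-r0 * z) by fixed-point iteration."""
    z = 0.5  # starting guess
    for _ in range(10_000):
        z_next = 1.0 - math.exp(-r0 * z)
        if abs(z_next - z) < tol:
            break
        z = z_next
    return z

# Modest changes in the assumed (or behaviorally reduced) reproduction number
# move the projected attack rate across a very wide range -- one reason that
# 20, 40, and 70 percent can all be defensible projections.
for r0 in (1.1, 1.3, 1.6, 2.0, 2.5):
    print(f"R0 = {r0:.1f} -> roughly {final_size(r0):.0%} of the population infected")
```

Even this toy calculation leaves out the behavioral responses discussed next, which is precisely the kind of uncertainty the essay distinguishes from calculable risk.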

Consider an example. In their influential modeling of possible trajectories and the impact of “non-pharmaceutical interventions” (NPIs, by which they mean policies such as quarantine or social distancing), Neil Ferguson and colleagues at Imperial College London include the following caveats:

It is important to note at the outset that given SARS-CoV-2 is a newly emergent virus, much remains to be understood about its transmission. In addition, the impact of many of the NPIs detailed here depends critically on how people respond to their introduction, which is highly likely to vary between countries and even communities. Last, it is highly likely that there would be significant spontaneous changes in population behaviour even in the absence of government-mandated interventions. 

There are two caveats here, and they should be treated differently. The first is that the basic data for sound epidemiology are not yet known, but better approximations are constantly becoming available. This is an exercise in better calculation of risk. The second caveat, which Ferguson divides into two, is that outcomes will depend upon how people respond, both to official policies and because of other changing beliefs. Health behavior is harder to measure than epidemiological constants. The point is that the social component of the trajectory of the epidemic is uncertain in a way the medical component is not: although the margins can be narrowed, the risk really cannot be quantified. In a series of blog posts examining the intersection of health, environment, and politics, the scholar of science and technology policy Andy Stirling explains “the crucial distinction between ‘uncertainty’ and ‘risk.’ A risk is what results from a structured calculation that must necessarily reflect a particular view. An ‘uncertainty’ is what these risk calculations might leave out.” Health behavior is just one part of this.

The lapse between identifying a new disease and knowing its pathogen is now closer to five days than five decades. The entire genome of the new coronavirus was sequenced and available online just a few weeks after the first cases. 

Another element of uncertainty is that epidemics are inflection points in evolution across different scales, from the microbial to the planetary. Pathogens evolve; microbes populate the microbiomes of animals and plants, the soil and the water; remnants of viruses are found in our DNA. For bacteria and viruses, the boundaries of the human self hold no meaning, and the more that we discover about the viral remnants in our DNA and the richness of our microbiomes, the more we are compelled to acknowledge that point of view. The vicious nineteenth-century strains of cholera retained their prior strategy of rapidity and lethality, killing about half of the humans they colonized. In the mid-twentieth century, the “El Tor” strain evolved a new strategy of lower virulence. This is a common adaptive trajectory for pathogens, which prosper by treating their hosts as symbionts instead of wantonly destroying them. The first pandemic of any new pathogen is, for the human population, usually the worst—so it was for bubonic plague in Asia and Europe, smallpox and measles in the Americas, and cholera. It is no solace to Homo sapiens facing COVID-19 today.

Ecosystems change too. Most of the new pathogens that infect humans are zoonotic: they jump the species barrier, from wild monkeys or bats, or from domesticated chickens or pigs. This has always been the case. But in the past, a zoonotic pathogen might infect a band of hunter-gatherers; today, thanks to a globalized, deeply interconnected world, a single local outbreak can become a pandemic in a few weeks. Another new factor is the proximity of humans to domestic animals and factory farms. The 90 percent of nonhuman terrestrial vertebrate biomass on the planet that is husbanded for our consumption lives—if we can call it living—in ecosystems such as feedlots that have no precedent. 

As Mike Davis observes in The Monster at Our Door: The Global Threat of Avian Flu (2005), these are perfect incubators for new zoonoses, especially for avian influenza, which can evolve first in chickens, then jump to pig populations that act as a kind of pathogenic evolutionary accelerator, and finally make the leap to humans. In turn, each new human-pathogen dyad alters the ecology of global public health and disease: our built environment changes (in the nineteenth century with the introduction of municipal water supplies, for example); our biochemical environment changes (supplementing animal feed with antibiotics, for example); and our health behaviors change. Meanwhile, climate change is altering the ecologies of infectious diseases in ways that we cannot predict. The post-pandemic world is a changed ecosystem. 

Though a great deal of headway has been made in the study of these complex environmental factors, the uncertainties they introduce are left out of epidemiological models narrowly focused on predicting numbers of cases and deaths. The standard “epidemic narrative” consists of a stable “normality” threatened by the intrusion of a novel, alien pathogenic threat, followed by an epidemic and an epidemic response (of variable proficiency), and ends with a return to the status quo ante. That neat storyline simply doesn’t hold. In turn, in Hamburg nearly 130 years ago and across the world today, what is “left out” depends on where you stand.

How liberals failed to prevent epidemics 

So much for the microbial protagonist. Let’s turn now to the three human characters in our retelling of the Hamburg tragedy. 

First on stage is the dominating and ultimately tragic figure of Max von Pettenkofer (1818–1901)—almost unknown today, but 130 years ago at the height of his professional fame and Germany’s most eminent chemist. He championed medical research, advocated clean air and urban sanitation, and mentored dozens of students. In the comic-book version of our Hamburg story, though, all these achievements count for naught: he is instead the villain whose obstinate pride failed the people of Hamburg twice over. His biggest shortcoming was failing to prepare for waterborne diseases and refusing to order the construction of filtration plants to treat the city’s drinking water supply, so that people were drinking water piped straight from the river Elbe to storage tanks and from there to their homes. As water levels dropped in the dry, hot summer of 1892, contaminants were washed downstream from riverside towns and from the barges that plied the waterway. Filtering through sand efficiently removes Vibrio cholerae. Other cities did it; Hamburg didn’t.

Hamburg’s citizens believed in small government, balancing the books, and individual responsibility for health and well-being. Spending their tax money on a water filtration plant looked to be an extravagance. 

Why did he take this stand? Despite the centralizing ambitions of the Prussian state, and the uniform color of its territories on the political map of Europe, the administration of Germany was not yet unified. Hamburg, the second-largest city and its richest port, still retained the legacy of self-government from its membership of the Hanseatic League. The city was run by its own senate and zealously guarded its powers to make independent policy decisions, especially in matters of trade. Indeed Hamburg was the most “English” city in Germany, governed by an assembly of its citizens—by its constitution, a small and privileged group of property owners; by its social history, an oligarchy of traders and lawyers. They disliked and distrusted the military-bureaucratic Prussian ways of state. 

Those citizens believed in small government, balancing the books, and individual responsibility for health and well-being. Spending their tax money on a filtration plant looked to be an extravagance that threatened both the fiscal health of the city-state and the ethic by which it had prospered. These laissez faire doctrines resembled those that had led Britain to the utmost parsimony in famine relief in Ireland and India—its colonial administrators holding fast to the belief that public debt was a more egregious sin than mass starvation, and that the hungry could only remedy their plight through learning the self-discipline of hard work and husbanding scarce resources. 

The next-biggest failing of von Pettenkofer and his loyal disciples in Hamburg’s medical office was their refusal to accept the cholera diagnoses and issue a cholera declaration during those crucial days in August when the rate of infection was doubling each day. As we have now learned once again, to our collective cost, bacilli and viruses can multiply at an exponential rate. The delay of a day can make the difference between containing an outbreak and facing an epidemic. 
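
As a rough illustration of why a single day matters, here is a minimal sketch with hypothetical numbers (not Hamburg’s actual case counts, which appear earlier in this essay): under daily doubling, each day of delay in declaring the outbreak roughly doubles the caseload that must eventually be contained, and a week’s delay multiplies it more than a hundredfold.

```python
# Back-of-the-envelope sketch with assumed numbers (not historical data):
# if infections double every day, each day of delayed declaration roughly
# doubles the size of the outbreak to be contained.
initial_cases = 10        # assumed cases when a declaration could first be made
doubling_time_days = 1.0  # the "doubling each day" described above

for delay_days in (0, 1, 3, 5, 7):
    cases = initial_cases * 2 ** (delay_days / doubling_time_days)
    print(f"declaration delayed {delay_days} day(s) -> about {cases:,.0f} cases to contain")
```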

Why did he not do this? Part of the explanation is the intellectual inflexibility of the man of high standing. The other part is material interest. Port that it is, in the 1890s Hamburg’s economy, and the prosperity of its plutocrats, depended on keeping the harbor open and the ships moving. Goods were coming in from England and the United States. The larger part of Germany’s exports were arriving by barge and train to be loaded onto ships destined for every continent, and the Hamburg-America line had regular sailings for New York, the decks packed with migrants seeking a better life on the far shore of the Atlantic. 

If we read von Pettenkofer’s calculation as a straightforward tradeoff between profit and human life, we do him an injustice. Most of the public health measures to deal with cholera began as hand-me-downs from the medieval plagues, revised over the previous sixty years during the successive visitations of cholera, adapted each time based on a rule-of-thumb empirical assessment of what had worked and what hadn’t. New viruses and bacilli had emerged; society’s responses stayed much the same.

The first draft of the plague control playbook was drawn up in Italian city-states in the years after the apocalyptic shock of the 1348 Black Death. Much like cholera half a millennium later, the plague arrived explosively and killed in a gruesome and rapid way. Its mortality was extraordinarily high: overall, perhaps a third of the population of Asia and Europe succumbed, and in most European cities, half of the residents perished, sometimes in just a few weeks. 

The calamity was widely attributed to divine wrath, to astronomical alignments, to witchcraft and sorcery. Italian princes, city elders, and merchants were more empirical. The first boards of health were set up in Venice and Florence in the same year that the plague appeared; these evolved into permanent magistracies over the next century, with authority to restrict travel and trade, and isolate infected individuals. Isolation hospitals, called lazzaretti, were set up to prevent contagion. Italian cities also issued certificates of health to important traders and diplomats, so that they could pass freely through checkpoints. The first passports were health cards. 

The first draft of the plague control playbook was drawn up in Italian city-states in the years after the apocalyptic shock of the 1348 Black Death. The first boards of health were set up in Venice and Florence, and the first passports were health cards. 

Observing that the plague tended to appear first on ships from the east and then spread when those ships arrived in port, they began comparing notes and drawing up advice. Quarantine was first trialed in the Adriatic port of Ragusa (now Dubrovnik) in 1377—its name refers to the forty days that suspected vessels were kept offshore to see if sailors and passengers fell sick. Within a few decades, the fundamentals of plague control had been worked out by trial and error: alongside quarantine, what we would now call notification of cases of infection, isolation of the sick, imposition of cordons sanitaires and travel restrictions, and disinfection (usually through burning the property of those infected). The main item missing from the list was carrier control: the role of rats—to be precise, rat fleas—as the reservoir of infection was not known, and systematic suppression of rat infestations was never contemplated, and presumably would have been considered impractical if it had been. Instead, people assumed that plague spread by human-to-human contagion.

The tools of plague containment were part of the scaffolding of the earliest administrative apparatus of the modern European state, and notably so in northern Italy. The science was somewhere between wrong and inexact, the motives mixed, the implementation quite often haphazard. Little wonder that critics condemned these measures as expensive, ineffective, and dangerous. The financial costs hardly need to be restated: the bureaucrats had to be paid, and interruptions to trade caused mercantile bankruptcies. Effectiveness could be questioned: the plague often managed to get through the defenses, and people would find ways of evading the restrictions or overwhelming the policemen dispatched to enforce them. The danger lay in the social unrest that followed unemployment, high food prices, and the intrusions of the police. 

It wasn’t until 1894 that the pathogen was identified, simultaneously by Alexandre Yersin (a former laboratory assistant at the Pasteur Institute in Paris) and the Japanese bacteriologist Shibasaburo Kitasato (who had trained under Koch in Berlin). They both isolated the microbial cause, a pathogen carried by rat fleas, called Pasteurella pestis or Yersinia pestis—a victory for European science over Asian, and for France over Germany. Plague remained endemic in India and China at that time with sporadic outbreaks, but had vanished from Europe (the last epidemic occurred in Marseille in 1720). Exactly why plague disappeared from Europe remains one of the enduring mysteries of microbial history: was it changes in the rat population, in the ecology of the transmission zones on the eastern borderlands of the continent, or the effectiveness of Europe’s quarantines and lazzaretti?

The tools of plague containment were part of the scaffolding of the earliest administrative apparatus of the modern European state. 

The best-documented cases of the plague response toolkit are naturally the most recent, and a good (or bad) example was Bombay in 1896, which is germane to the Hamburg drama for two reasons. First, it illustrates the standard epidemic containment policies as deployed in the same decade. Second, it occurred two years after the Yersin/Kitasato breakthrough had revealed that the main mode of transmission was fleas-to-human rather than human-to-human. 

Despite the scientific discovery, British officers of the Indian Civil Service remained convinced that the plague endured due chiefly to Indian backwardness. The historian Rajnarayan Chandavarkar observes that even though the medical and scientific experts were up-to-date on the most recent discoveries, their “policies, formulated on the assumption that the plague was a virulently infectious disease, proved at best oppressive and at worst fatal.” Among these, “stringent inspections” on the railroads turned up few cases, while “pumping the sewers with disinfectants” simply drove rats and the fleas they carried into houses, where they promptly spread the infection. The disorderly, distrustful, and sometimes violent response of the Bombay residents, dismissed as superstition by colonial officers, is perfectly understandable. The official cure—if indeed it can be counted as such—was arguably as bad as the disease. 

Bombay also shows that von Pettenkofer was not alone in disputing the latest medical claims. Indeed, Hamburg was following well-established British precedent in downplaying the modes of transmission of pathogens, when commerce was at stake. After the opening of the Suez Canal in 1869, the International Cholera Control Commission in Istanbul insisted that British ships with infected sailors or passengers be kept at sea for the requisite forty days, bringing the (French-led) commission into conflict with ministers in London who insisted that quarantine regulations were a gross violation of the 1846 Free Trade Act. Influential English doctors insisted that the germ theory of cholera was “a humbug got up for the restriction of our commerce.” Until March 2020 at least, British public health policy retained a laissez faire strand quite distinct from continental Europe. 

The science was somewhere between wrong and inexact, the motives mixed, the implementation quite often haphazard. Exactly why plague disappeared from Europe remains one of the enduring mysteries of microbial history. 

Von Pettenkofer’s doctrines are thus much more comprehensible in the context of these centuries of practice in epidemic control with limited outcomes, the oppressive aspects of quarantine and isolation, and the uncertainties of the medical science and epidemiology of the time. His medical and social beliefs were an odd mélange; he is hard to place in today’s political spectrum. He advocated “localism,” believing that cholera became virulent only in certain kinds of soil, and that it needed a human body with the requisite moral and psychological preconditions to develop into the full-blown disease. He held that health was a matter for individual family responsibility, not state diktat.

In our drama, von Pettenkofer’s first fatal error began as a relatively minor fault, amplified by his inflexibility. He steadfastly refused to purify Hamburg’s drinking water supply by the relatively straightforward method of filtering it through sand, which efficiently removes the cholera vibrio. He could readily have accommodated the cleansing powers of sand into his general promotion of cleanliness and his view that the bacillus also needed a receptive soil to become potentially lethal. But almost as if gripped by a death wish, von Pettenkofer took a stand that water filtration was a needless expense without benefit. We can imagine the audience silently urging him, “Just filter the water supply! Just do it!”

His second disastrous error was his refusal to declare epidemic cholera on August 18, 1892, or over the following few days. Only on August 23, the day before Koch arrived, did the Hamburg medical authorities admit that the disease was present in their city. By that time every part of the city was affected. 

The denouement of von Pettenkofer’s role occurs when he has been driven out of his post but still rancorously defends his “local configuration” theory. He lays down the ultimate challenge to Koch: he will drink a solution containing the cholera vibrio and see what happens. The old man did this on October 7, 1892, recording the grotesque symptoms in his diary. He recovered, concluding that his view was vindicated: cholera needed both an infectious agent and a conducive host. Ten days later his disciple Rudolf Emmerich repeated the experiment, performing it on a stage in front of an audience of over a hundred people. Emmerich also survived. (Evans, in Death in Hamburg, suggests that Koch’s laboratory assistants, who provided the samples, suspected the purpose of the request and mercifully diluted the solutions.) Von Pettenkofer finally fulfilled his death wish with a pistol to his temple in 1901. 

How the centralizers—and their science—prevailed 

The Hamburg epidemic occurred at the inflection point in the rise of scientific medicine. The protagonist of this paradigm shift, the hero of the tale, is Koch (1843–1910): he is the one remembered for having taken charge of Hamburg’s failing public health system and clearing out the charlatans. When Koch arrived on the train from Berlin on the morning of August 24, carrying the Kaiser’s writ, he already knew the diagnosis was cholera; a doctor from Altona had arrived at his laboratory a few days earlier with a sealed jar containing samples from patients. But he apparently had no idea of how rapidly the disease had taken hold nor how negligent was the municipal response. 

There was no official delegation at the railway station to meet the Empire’s highest-ranking scientist. Koch had to make his own plans: his first stop was the city medical office, where he arrived at 9 a.m. The chief medical officer was Dr. Johann Kraus, a faithful acolyte of von Pettenkofer: he turned up only thirty minutes later, and had little information to impart, for he had done nothing other than sneer at the “hyperactive behavior” of his counterparts in other towns (such as Altona). Koch’s next stop was the New General Hospital in Eppendorf, where the director, Dr. Theodor Rumpf, was ready to greet him at the door. Koch asked straightaway if there were cholera cases to report, and Rumpf promptly gave him the figures, whereupon Koch remarked to his companion, “The first man in Hamburg who’s telling us the truth!” 

After visiting the hospitals, disinfection centers, and barracks where the migrants from Russia were housed awaiting their ships, Koch toured the old, overcrowded, ramshackle “Alley Quarters” in the city center. By this time he was becoming aware that hundreds were already dead. “I felt as if I was walking across a battlefield,” he said. And amid these unsanitary streets, courtyards, and canals, he was shocked: “In no other city have I come across such unhealthy dwellings, such plague spots, such breeding places of infection.” This was a man who had scoured the Alexandria and Calcutta hospitals in his search for the bacterial culprit. In the alleys he made a remark that became an infamous condemnation of Germany’s most cosmopolitan city: “Gentlemen,” he said, “I forget that I am in Europe.” 

If any moment in our drama were to mark the shift in the paradigm for understanding epidemic disease, this would be it. This is the point at which the naysayers’ defenses rang hollow, and the unresolved medical and epidemiological controversies became only minor way stations on the iron railroad of progress, along which the express train of medical science could rush with only a blast of its whistle to warn the loiterers to get out of its way. In a more than metaphorical way, the emperor’s authority had arrived on that train. 

Recall that when Koch returned from Egypt and India proclaiming that he had discovered the cholera bacillus, there was indeed room for doubt. Koch had not satisfied his own postulates—no animal could be induced to fall sick with cholera—and epidemiological mysteries remained. (When Koch won the Nobel Prize in 1905, the citation was for his discovery of the tuberculosis bacillus, the more complete demonstration of his method.) Kaiser Wilhelm’s proclamation of Koch’s success was a gamble on science, in the service of imperial politics—he was strenuously seeking to catch up with the other European colonial powers. Seeking what he later called Germany’s “place in the sun,” he had convened the Berlin Conference that divided Africa among them; his rush to industrialization was gathering pace; his unification of the disparate administrations across the patchwork quilt that had been the principalities, city-states, feudal, and episcopal estates of the former Holy Roman Empire was still incomplete. 

At this time and in this context, tropical disease was a huge impediment to colonization, while medicine, especially in France, was justification for empire. The canonization of Koch was a triumph for German medical science, instantly making his laboratory a peer of France’s Pasteur Institute. Then as now, scientific competition was interwoven with geostrategic rivalry; it was a matter of both prestige and imperial capability. (France won a minor victory with the name of Pasteurella pestis.) Health administration, with its requirements of a unified census, border controls, and the machinery of case notification—issuing certificates of good health—required and justified a centralized bureaucracy. Infectious disease reporting and control was not a matter that could be left to the discretion of cities or baronies; unless all parts of the body politic conformed to the same central protocol, the health of the whole would be vulnerable to the deficiencies of its weakest part. 

We can see now that Koch’s achievement was both scientific and rhetorical. His first scientific breakthrough was identifying the lifecycle of anthrax, but, unable to specify the causal mechanism, he resorted to the persuasive metaphor of “host” and “parasite.” He went on to characterize the cholera vibrio as an “invader.” And—especially salient with respect to the rivalry between Berlin and Hamburg—the germ theory of infection was the charter for military centralism over laissez faire minimal government. On the train from Berlin arrived not just Koch but a freight of martial metaphor, mindset, and mobilizing capacity. 

Medicine and the military are indeed deeply entangled throughout recorded history. Armies were epidemics on the march; regiments were depleted more by infection than battle; sailors fell victim to nutritional deficiencies such as scurvy. The scandalous hospital conditions for soldiers during the Crimean War of the 1850s were the occasion for Florence Nightingale to establish British nursing. Biological warfare has long been attempted, though historic successes were due more to chance than design. One account of the plague’s entry into Europe was that the Mongol army besieging the Crimean town of Kaffa used catapults to fling infected corpses into the city. The story of the gruesome projectiles may be true, but that is not how plague is transmitted. The Spanish conquest of Mexico was incalculably aided by smallpox, which stowed away on the conquistadors’ ships and killed as many as half of the immunologically naïve indigenous population in its first and most deadly epidemic, while leaving the invaders—their faces pockmarked from earlier, immunity-inducing infections—untouched. 

None of the above, however, imposed a military model on medicine itself. That changed when modern industrial modes of organization were applied to warfare, beginning with the U.S. Civil War and the Franco-Prussian War. These were also the occasions on which modern medicine and the arsenal of epidemic control measures were applied to the same end. Tools of surveillance, standardization, and regimentation were applied equally to state-making, imperial expansion, industrial warfare, and population health. Just as war became about more than conquest, public health was never just about public health. 

So too with Koch’s visit to Hamburg. The elders of the city had reason to fear that cholera control would jeopardize not just their commerce but also their prized constitutional autonomy. Over the previous decades, while the British empire had tried to balance health-based controls with free trade, the French had been much more assertive in using the anti-infection arsenal in the service of expanding the writ of the colonial state. The historian Patrick Zylberman recounts how the French government portrayed the disease as an “invasion” from the Levant and India, which justified martial medical measures and the establishment of the outer ramparts of Europe’s sanitary frontier in the Middle East. 

The authorities in France’s Mediterranean ports didn’t want to require health inspections on arrival, so Paris assembled a coalition of European governments that imposed a regime of health inspection and oversight on the Ottoman Empire. So, even while the Ottomans were nominally independent, western European health officers were stationed in Cairo and Constantinople with the authority to control the westward departure of ships. Zylberman makes the point that the threat of cholera was sufficient justification for “pre-emptive intervention” in the eastern Mediterranean and even beyond: the Ottoman state was the “sick man of Europe” in two senses of the phrase, and the imperialists were already sinking their teeth into its weakening body. Whatever their geostrategic rivalries, Paris and Berlin saw the microbial threat from the east in a similar way: Germany imposed comparable measures along its long land frontiers. 

In Hamburg in August 1892, worries about Berlin’s militaristic rule and the loss of long-cherished liberties were, of course, less pressing than the terrifying scourge in the water supply. Koch did not declare “war” on the vibrio, and his comparison of the overwhelmed Hamburg hospitals to a “battlefield” was as far as his military metaphors went. Nowhere in the debates of the day do we read the political rhetoric of bodily integrity and decay, infection and purification, that the Nazis adopted a generation later. But as the militarized Prussian state took over the administration of Hamburg, starting with its hospitals and water supply, the corner was turned. 

In the end Koch triumphed over von Pettenkofer; the biomedical paradigm shifted. Centralizing, authoritarian Prussia absorbed liberal Hamburg; Germany’s governance system consolidated. Less noticed, the military model of public health became hegemonic. The United States, which watched Hamburg closely—it was, after all, the port where the largest number of immigrants embarked—has somehow sustained this paradox: the U.S. Army Corps of Engineers became the principal weapon against Yellow Fever, in Cuba, Louisiana, and Panama alike, while generations of voters have rejected social medicine as either a luxury the country cannot afford or one step short of totalitarianism. 

The metaphor of “fighting” a disease, apt for the body’s immune response to a pathogen, is incongruous for the social response to an epidemic. Nonetheless, the language has become so familiar today that it is adopted unreflectingly—a mark of true hegemony. The traffic in metaphors runs both ways. When mobilizing for war or authoritarian measures, political leaders inveigh against “infestation” by invaders or infiltrators that are akin to pathogens. In times of health crisis, they like to “declare war” on a microbial “invisible enemy.” 

The U.S. Army medical and engineering corps earned their place in the annals of public health with meticulous research into Yellow Fever transmission, followed by rigorous enforcement of programs to drain, cap, or oil standing water in wells, cisterns, tanks, and pools, and to apply insecticides to eradicate mosquito-breeding sites. In modern times, and especially since the post-9/11 anthrax scare raised the specter of bioterrorism, so deeply has the U.S. Department of Defense bored into all aspects of U.S. foreign policy that the instrument of choice in responding to diverse crises around the world, including epidemic disease, is the military. The army was the first international provider of relief to Indonesia after the tsunami of 2004 and to Haiti after the earthquake of 2010. President Barack Obama dispatched the 101st Airborne to “fight” Ebola in West Africa in 2014. Today, the National Guard and the U.S. Navy are at the forefront of the COVID-19 emergency response. Military logistics appear to be indispensable in filling the gap left by an under-provisioned emergency public health service. 

However, recognizing the important operational role of the military in epidemic response shouldn’t seduce us into thinking that security officers and generals should be running the show. The American Civil Liberties Union was alive to this danger, warning in a 2008 report that coercive, law-enforcement approaches would be counterproductive as well as dangerous to rights. Unsurprisingly, the potential for assuming wartime emergency powers and deploying security technologies is attractive to many political leaders precisely because of their dual usage. President Donald Trump has decided to call himself a “wartime president.” He is following French President Emmanuel Macron, who declared “war” on the virus. In Italy, it is more of a police operation. Hungary’s Viktor Orbán has passed a law allowing him to rule by decree indefinitely and is blaming the pandemic on immigrants and refugees. In China, the lockdown is enforced by a combination of high-tech surveillance and old-fashioned Communist Party neighborhood mobilization—a “grid reaction.” In Israel, the government is proposing to turn tracking technologies designed to follow terrorists against people believed to be infected with the coronavirus. The Economist has coined the word “coronopticon” for such all-pervasive surveillance. 

Activist reformers and silent revolutionaries 

The wave of repressive measures enacted in response to COVID-19 would come as no surprise to the cast of our Hamburg drama. Europe’s nineteenth-century cholera epidemics marched in synchrony with its revolutions, notably in 1830–32 and 1848. “Cholera riots” were widespread. In 1892 mobs rampaged through the Russian cities of Astrakhan, Tashkent, Saratov, and Donetsk. The cries of the afflicted provide the chorus to the protagonists, with a handful of spokesmen’s voices audible above the din. But we would listen in vain for socialist revolutionaries. 

Karl Marx was lodging at 28 Dean Street in Soho in 1854, five minutes’ walk from the famed Broad Street pump (according to Google Maps) and a minute from the nearest black (infected) dot on John Snow’s map, in Meard Street. But he made only passing reference to the outbreak in his correspondence with Friedrich Engels, blaming it on poor housing. Engels had done the same in his 1845 book The Condition of the Working Class in England, and in his preface to the 1892 edition added the line: 

Again, the repeated visitations of cholera, typhus, small-pox, and other epidemics have shown the British bourgeois the urgent necessity of sanitation in his towns and cities, if he wishes to save himself and family from falling victims to such diseases. Accordingly, the most crying abuses described in this book have either disappeared or have been made less conspicuous. 

Engels, it seems, quietly concedes that public health is a bourgeois science, and an effective one. For the communists, war and class war were the locomotives of history, and microbes had merely hitched a ride. As the historian Samuel Cohn observes, this is a baffling surrender of a political battlefield where they could have outflanked their class enemies. “An analysis of cholera and its social consequences did not enter any of Marx’s works published in his lifetime,” he notes, “and he appears to have been oblivious to any manifestations of its social protest and class struggle.” He continues: 

Still more surprising is an absence of attention to cholera’s social violence by more recent historians of the New Left who have studied nineteenth- and twentieth-century class struggle meticulously—E.P. Thompson, Eric Hobsbawm, John Foster, John Calhoun, and others—despite these events sparking crowds estimated as high as 30,000, taking control of cities (even if only briefly), murdering governors, mayors, judges, physicians, pharmacists, and nurses, destroying factories and towns. 

The oversight has begun to be remedied. Postcolonial historians and medical anthropologists have explored local resistance to colonial health policies and the suspicions that surround, among other things, polio vaccination programs. But still there is relatively little research on resistance to public health measures during epidemic emergencies. This is a gap, because each historical visitation of epidemic disease and its corresponding government measures was met with innumerable acts of everyday evasion and noncompliance. 

In Italy in the sixteenth and seventeenth centuries, and most memorably in London in 1665, chroniclers of the plague wrote about the reckless indifference of poor people to the dangers of contagion, and their subversion of whatever sanitary measures were imposed upon them. Daniel Defoe, like others who wrote on this subject, attributed this behavior to illiteracy, obstinacy, and fatalism. It may also have been a preference for accepting uncertainty (the lottery of the microbe) over predictable hardship (destitution through unemployment). There are also intriguing echoes in colonized peoples’ resistance to imperial health and environmental diktat, which was usually arbitrary and unscientific, and often achieved nothing beyond a display of state power. 

The nineteenth-century socialists’ silence on public health is doubly puzzling because their rivals on the left, the radical democrats, were vocal. In the 1848 “springtime of the peoples,” while Marx and Engels were writing the Communist Manifesto, a young physician named Rudolf Virchow (1821–1902) was compiling a report on an outbreak of typhus in Silesia. Virchow came to be known both as the father of pathology and the founder of social medicine, but he was also a pioneering physical anthropologist; his studies of the sizes and shapes of the crania of different peoples led him to conclude that there was no scientific basis for claims of racial superiority or inferiority. His medical practice radicalized his politics: his report on Silesia argued that medical interventions alone had little value, and that what mattered was social advancement through education, democracy, and prosperity. Virchow joined the 1848 uprisings as a democrat with the slogan, “Medicine is a social science, and politics is nothing but medicine at scale.” 

Like those of many erudite men of science of the era, Virchow’s medical views are hard to classify today. He was generally sympathetic to von Pettenkofer, though he disagreed with him on cholera, which he considered a contagion; he admired Koch, though, oddly enough, he disputed the role of the tuberculosis bacillus. Fundamentally, Virchow was a libertarian who believed that democracy, education, and progress would eliminate disease. Evans, in Death in Hamburg, credits him with the crucial insight: “What Virchow’s theories made explicit was the indissoluble connection between medical science, economic interest, and political ideology.” 

Virchow’s voice in the Hamburg chorus poses questions that resonate today. Contemporary liberals (in the U.S. usage of the word) are discomfited by the politics of pandemic. They lean toward social health on the grounds of equity but shudder when epidemiological risk management through quarantine and travel restrictions aligns with racist exclusionary policies. American liberals are reassured by the civil servants dedicated to science—Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, is today’s paragon of the virtuous deep state—but disturbed by the totalitarian implications of disease surveillance and control. The infection-control state is Max Weber’s military-bureaucratic state on steroids, requiring uniform sanitary habits throughout the population. 

In his book Disease and Democracy (2005), Peter Baldwin describes how, in the later twentieth century, as chronic, non-infectious, and “lifestyle” diseases took over from infectious diseases as the main threats to health in industrialized countries, responsibility for health shifted from states to citizens: “every man his own quarantine officer.” Baldwin poses the key question: “Can there be a democratic public health?” He doesn’t think so: “In the era of governmentality, public health remains one clear area of statutory control where the average law-abiding citizen might expect to feel the iron fist through the velvet glove.” 

Baldwin’s skepticism was a riposte to AIDS activists who believed that their own mobilization against the “gay plague” had not only accelerated the science but bent the arc of political history towards emancipation. President Ronald Reagan initially ignored the AIDS outbreaks among gay men, Haitians, and hemophiliacs, and was deaf to the demands of the AIDS Coalition to Unleash Power, whose acronym, ACT UP, reflected its methods. Finally, his surgeon general, C. Everett Koop, a fully credentialed conservative, and Anthony Fauci, newly appointed to the job that he holds today, convinced Reagan to act. 

Image: In May 1990, ACT UP mounted a protest at the NIH to raise public awareness of biomedical research on HIV and AIDS. (NIH History Office / Flickr) 

The country’s HIV and AIDS policies, and subsequently global policies too, were unprecedented in the history of public health responses to an incurable, sexually transmitted disease afflicting stigmatized groups—the words “innocent victims,” applied to hemophiliacs and children born with HIV, were the exception that proved the initial rule. People living with HIV and AIDS became involved in medical trials and policymaking. Activists offered to trial new drugs, arguing that they had nothing to lose by shortcutting the usual safety testing. They insisted on voluntary and confidential testing to protect their rights. In Africa, government responses often ceded the agenda to civil society organizations and invariably included them in planning, and the international agencies set up to respond—UNAIDS and the Global Fund to Fight AIDS, Tuberculosis and Malaria—pioneered a model of global health governance based on human rights and inclusion. 

Why epidemiologists should think like communities 

I had a walk-on role late in the HIV and AIDS drama, in collaborative research on the politics, security, and social aspects of the pandemic. And, writing in the spirit of Virchow, I insisted that there can only be a democratic public health. Like many others, I was inspired by the physician and medical anthropologist Paul Farmer; his book Infections and Inequalities (1999) is a manifesto for a partnership between social medicine and radical politics. The associations among poverty, inequality, ill health, and exposure to epidemics are well established and need no further emphasis here. Conscious of Aylward’s warning not to apply the “lesson” of the last epidemic to this one, I will make just one cautious, epidemiological point: there is in fact intriguing evidence that “people’s science” can play a crucial role in blunting epidemics and ensuring they don’t recur. 

This is not to say we should romanticize folk medical wisdom: we shouldn’t. People’s epidemiology has more than its share of superstitions: numerous practices that are at best harmless and at worst dangerous or even fatal. Still, decades of people’s experimentation and observation have generated some real scientific breakthroughs, the emblematic example of which is smallpox variolation. An enslaved African American, familiar with the widespread African practice of injecting tissue from the pustules of smallpox patients into healthy individuals to produce a much less virulent version of the disease, introduced variolation to Massachusetts settlers in the early eighteenth century. Its efficacy was so convincing that George Washington inoculated his soldiers en masse. In 1798 the procedure was adapted by the English physician Edward Jenner in the form of inoculation with cowpox, which he named vaccination. 

There is also a demonstrable recent record of people’s science hastening the end of an epidemic. The case in point is Ebola in West Africa in 2014. Epidemiological models, which accurately charted the early, exponential growth phase of the epidemic, failed to predict its rapid decline. The models projected only burnout or a long taper as public health responses slowly reduced transmission, and not the remarkably rapid decrease that actually occurred. In his book Ebola: How a People’s Science Helped End an Epidemic (2016), the social anthropologist Paul Richards argues that the deficiency in the modeling is best explained by changes in intimate social behavior that could neither be captured by models nor even fully explained by people who were themselves altering the critical risk behaviors. 

Anthropologists themselves didn’t connect the dots at the early stage of the outbreak. They had researched funerals and funeral rituals, but not the real danger point for contagion, which was the preparation of the body for burial. Family care for the sick was the other main context of transmission. Community health workers, social anthropologists, and epidemiologists had to speak to one another, understand each other’s knowledge, and find ways of communicating it. As Richards shows, the communities quickly learned to think like epidemiologists and adopted new, safer body-handling practices, and the official top-down policies followed afterward. Post hoc modeling of the epidemic trajectory confirms that the best simulation of the decline is based on the widespread adoption of a community-based strategy of screening and travel restriction, which has the advantage of needing only a 50 percent compliance rate to be effective. The author of the review concludes, “We know of no other similarly validated explanation for the end of the outbreak.” 
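Richards’s point about the models can be made concrete with a toy calculation. The sketch below is a minimal discrete-time SIR (susceptible–infected–recovered) simulation written in Python; it is not the validated model cited above, and every number in it (a transmission rate of 0.25 per day, a recovery rate of 0.1, a behavior change on an assumed day 60 that cuts transmission to 0.08) is an illustrative assumption. It simply shows why a model with a fixed transmission rate can predict only burnout or a long taper, whereas one that allows communities to change the critical behaviors mid-epidemic produces the kind of rapid decline that was actually observed.

    # Toy SIR model: compares a fixed transmission rate with one that drops
    # partway through the outbreak (standing in for community-driven changes in
    # burial and caregiving practices). All parameters are illustrative assumptions.

    def simulate_sir(beta_fn, gamma=0.1, i0=1e-5, days=300):
        """Discrete-time SIR; beta_fn(day) gives the transmission rate on that day."""
        s, i, r = 1.0 - i0, i0, 0.0
        prevalence = []
        for day in range(days):
            beta = beta_fn(day)
            new_infections = beta * s * i
            new_recoveries = gamma * i
            s = s - new_infections
            i = i + new_infections - new_recoveries
            r = r + new_recoveries
            prevalence.append(i)
        return prevalence, r  # daily prevalence and final attack rate

    # Fixed-parameter model: transmission never changes, so the epidemic can only burn out.
    fixed_prev, fixed_attack = simulate_sir(lambda day: 0.25)

    # Behavior-change model: after an assumed day 60, safer practices cut transmission
    # below the epidemic threshold, and prevalence collapses rapidly.
    adaptive_prev, adaptive_attack = simulate_sir(lambda day: 0.25 if day < 60 else 0.08)

    print(f"Fixed beta:    peak prevalence {max(fixed_prev):.3f}, final attack rate {fixed_attack:.2f}")
    print(f"Adaptive beta: peak prevalence {max(adaptive_prev):.3f}, final attack rate {adaptive_attack:.2f}")

The point of the toy is not the numbers but the structure: the effective transmission rate lives partly in people’s hands, which is why a model that treats it as fixed will miss a decline driven by changed behavior.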

Each pandemic is different, but the logic of political action is much the same. Where political interests align with scientific advice, that advice becomes policy. This is where we can legitimately learn lessons. In the case of 1918, the lesson learned by the world leaders gathering to found the League of Nations was that international health is a problem that demands international cooperation. Smallpox was eradicated by exactly this sort of multilateral initiative in the 1970s. Measles, like smallpox, is caused by a virus that has only human hosts and so too could be eradicated, but to the wealthy nations that funded international health programs, it was regarded as a harmless rite of passage for young children, even though it killed millions of them in poor countries every year. Because funds and political backing were available, the UN targeted polio instead: also a devastating disease but one far harder to eradicate, because the virus can exist in the wild. 

In the case of HIV and AIDS, what “worked” was that pressure compelled governments to acknowledge their epidemics and respond, and public clamor forced pharmaceutical companies to bring down the cost of antiretroviral drugs so that treatment regimes at scale did not bankrupt African governments. Less unequivocally helpful was the pressure for strictly voluntary testing, which was codified as international best practice even though compulsory or routine testing—the latter requiring a patient specifically to ask not to be tested—could have helped prevent infections. And, as with any complex institution, political incentives came to align with the interests of the institution, sometimes at the expense of the problem to be solved. The result was that the UN pushed for similar policies and metrics for every country, even though each individual country’s epidemic was different: some were centered on homosexual sex, others on heterosexual sex, yet others on intravenous drug use, and all had different social mores and networks. But it was simpler to standardize the package. In my book AIDS and Power (2006) I concluded that in African democracies, political incentives were structured to manage AIDS rather than to contain it: to provide treatment maximally and prevention barely sufficiently. 

The 2003 SARS outbreak is the immediate precursor to COVID-19 in both pathogenesis and politics. It holds two political lessons. First, it shamed China’s government, but not enough. The government concealed the initial outbreak and reacted too late—the coronavirus spread across the world, sparking local outbreaks, notably in Canada. Commentators speculated that SARS could be the crisis that cracked the Communist Party’s authoritarian control. In a volume reviewing the outbreak, Tony Saich asked whether SARS was “China’s Chernobyl or Much Ado About Nothing?” Saich reserved judgment, concluding that the Chinese authorities ought to learn from it. And seventeen years on, Saich’s assessment of the response to COVID-19 is that “no, they didn't learn from the SARS epidemic.” 

Implicit in this conclusion is that President Hu Jintao paid no political price for his flawed response to the disease. China developed its medical laboratories but did not create incentives for health workers to become whistleblowers: the default option for low-level officials receiving bad news was to please their superiors by insisting that all was still well. Second, the speedy suppression of SARS removed the market for pharmacological products that could treat coronaviruses or inoculate against them. Capitalism provides no incentive to respond pre-emptively to a future global public bad (as the economists like to say). 

From this we distill the elementary and wholly unsurprising lesson that well-articulated political demands shape the politics of public health. Democracies can demand public health. 

In the next-to-last act of the Hamburg drama, Virchow ends the scene. He has already posed the key question—whether it was material interest, political ideology, or medical science that determined the outcome. And although none of the protagonists answered it—neither the cholera vibrio, nor von Pettenkofer and the city merchants and lawyers, nor Koch and his emperor, nor even the dissonant chorus—Virchow makes the claim that social emancipation and democracy will finally overcome cholera. 

Thinking critically in a pandemic 

What does all this mean for COVID-19? We face a new virus with uncertain epidemiology that threatens illness, death, and disruption on an enormous scale. Precisely because every commentator sees the pandemic through the lens of his or her preoccupations, it is exactly the right time to think critically, to place the pandemic in context, to pose questions. 

The clearest questions are political. What should the public demand of their governments? Through hard-learned experience, AIDS policymakers developed a mantra: “know your epidemic, act on its politics.” The motives for—and consequences of—public health measures have always gone far beyond controlling disease. Political interest trumps science—or, to be more precise, political interest legitimizes some scientific readings and not others. Pandemics are the occasion for political contests, and history suggests that facts and logic are tools for combat, not arbiters of the outcome. 

While public health officials urge the public to suspend normal activities to flatten the curve of viral transmission, political leaders also urge us to suspend our critique so that they can be one step ahead of the outcry when it comes. Rarely in recent history has the bureaucratic, obedience-inducing mode of governance of the “deep state” become so widely esteemed across the political spectrum. It is precisely at such a moment, when scientific rationality is honored, that we need to be most astutely aware of the political uses to which such expertise is put. Looking back to Hamburg in 1892, we can readily discern what was science and what was superstition. We need our critical faculties on high alert to make those distinctions today. 

At the same time, COVID-19 has reminded a jaded and distrustful public how much our well-being—indeed our survival—depends upon astonishing advances in medical science and public health over the last 140 years. In an unmatched exercise of international collaboration, scientists are working across borders and setting aside professional rivalries and financial interests in pursuit of treatment and a vaccine. People are also learning to value epidemiologists whose models are proving uncannily prescient. 

But epidemiologists don’t know everything. In the end it is mundane, intimate, and unmeasured human activities such as hand-washing and social distancing that can make the difference between an epidemic curve that overwhelms the hospital capacity of an industrialized nation and one that doesn’t. Richards reminds us of the hopeful lesson from Ebola: “It is striking how rapidly communities learnt to think like epidemiologists, and epidemiologists to think like communities.” It is this joint learning—mutual trust between experts and common people—that holds out the best hope for controlling COVID-19. We shouldn’t assume a too simple trade-off between security and liberty, but rather subject the response to vigorous democratic scrutiny and oversight—not just because we believe in justice, transparency and accountability, but also because that demonstrably works for public health. 

As we do so, it is imperative we attend to the language and metaphors that shape our thinking. Scientists absorb (fundamental) uncertainty within (measurable) risk; public discourse runs along channels carved by more than a century of military models for infectious disease control. By a kind of zoonosis from metaphor to policy, “fighting” coronavirus may in the worst case bring troops onto our streets, security surveillance into our personal lives. Minor acts of corporate charity, trumpeted at a White House bully pulpit, may falsely appear more significant than the solidarities of underpaid, overworked health workers who knowingly run risks every day. Other, democratic, responses are necessary and possible: we need to think and talk them into being. 

Perhaps the most difficult paradigm shift will be to consider infectious agents not as aliens but as part of us—our DNA, microbiomes, and the ecologies that we are transforming in the Anthropocene. Our public discourses fail to appreciate how deeply pathogenic evolution is entangled in our disruption of the planet’s ecosystems. We have known for decades that a single zoonotic infection could easily become pandemic, and that social institutions for epidemic control are essential to provide breathing space for medical science to play catch-up. Our political-economic system failed to create the material incentives and the popular narrative for this kind of global safety net—the same failure that has generated the climate crisis. 

This is the final, unfinished act of the drama. Can human beings find a way to treat the pathogen not as an aberration, but as a reminder that we are fated to co-exist in an unstable Anthropocene? To expand on the words of Margaret Chan, Hong Kong’s director of health during SARS and later WHO director-general, “The virus writes the rules”—and there is no singular set of rules. We have collectively changed the rules of our ecosystems, and pathogens have surprised us with their nimble adaptations to a world that we believed was ours.