By Heather John on July 29, 2020

A World Without Vaccines: A History of Smallpox

A science historian gives us a glimpse into the severity of smallpox and the road to its eradication, depicting what life in America was like before vaccination existed to help protect populations from disease.

smallpox virus under microscope

In the summer of 2020, as we vigorously debate public health measures to combat the COVID-19 pandemic, many hope that tools in use since at least the Middle Ages, such as quarantine and isolation, can give way to more modern ones, such as vaccination. Today, we have a large number of vaccinations available to us, but what was it like to live in a world without vaccines?

Vaccination is a relatively new means of controlling disease; the first vaccine was developed a little over two hundred years ago, to prevent smallpox, and the vast majority of vaccines were developed only in the last hundred years. Change over time hasn’t affected only approaches to disease management—understandings of the nature and causes of disease have also changed.

We take for granted the “germ theory” of disease, which is the idea that infectious diseases are caused by microorganisms, such as bacteria and viruses. This idea was not widely accepted until a little over a hundred years ago, in the late 1800s. Other explanations for disease causation, which continue to influence people’s thinking today, included explanations focused on individual bodies, such as humoral theory (focused on fluid imbalances within the body) and constitutional factors unique to each person; explanations focused on the environment, such as miasmatic theory (focused on foul air, such as from dead bodies and other decaying organic matter) and filth theory (focused on dirt and hygiene); and explanations which posited mismatches between an individual’s constitution and atmospheric conditions.1

The potential for diseases to be contagious (i.e., passed between people) had been discussed for centuries, but this did not imply an understanding of germs as a source of “contagion.” As with other prophylactics and therapeutics in use today, the development of vaccination predated an understanding of why or how it worked.

A Disease Without a Cure

Smallpox vaccination was so successful, with smallpox declared eradicated in 1979, that we have to remind ourselves of how devastating the disease was.2 Smallpox survivors, after enduring three to four weeks of illness with flu-like symptoms and pustules covering the body, faced potential blindness or scarring—lifelong reminders which affected at least 20 percent of the population of early modern Europe.3 In parts of Western Europe into the early 1800s, smallpox was so ubiquitous and deadly that children’s lives were not counted until they had survived it.4 Estimated death rates among those who contracted smallpox ranged from 7% to 30% in general populations with prior smallpox exposure, and approached 50% or greater among pregnant women, fetuses, and populations with no prior exposure.5

a man lays on a bed covered in smallpox sores

Measures to counter smallpox were developed and adopted in times and places where it was perceived as a tangible problem. Prior to the development of vaccination, there was a difference in responses to smallpox in places where the disease was endemic, versus those in which it was epidemic.6 In populations where smallpox was endemic—where it was ever-present—the disease took its toll on the youngest members of the population (in a time when the mortality of infants and young children was already high), and it was understood that those who survived the disease did not acquire it again. In populations where smallpox was epidemic—where it appeared and disappeared periodically—the disease affected those who were older and had already survived the more vulnerable period of early childhood. The drive to combat the disease in populations where the disease was epidemic was thus stronger than that in populations where the disease was endemic.

In the pre-vaccination era, there were two options for countering smallpox.7 The first was to separate the sick from the well: isolating the sick, quarantining those suspected of having been exposed, or, for the presumed well, fleeing areas of sickness.

The Rise of Inoculation

In the case of smallpox, a second, more specific method was also available: inoculation. Inoculation with smallpox (variola) involved deliberately exposing oneself to smallpox pus or scabs, via a skin incision or nasal inhalation, in the hope that the resulting disease (expected to last days to weeks) would be milder than naturally acquired smallpox and would confer protection against future smallpox illness. Inoculation carried significant risks—death from smallpox and/or spreading smallpox to others—but it was known to be less risky than naturally acquired smallpox, and it had been practiced for centuries in Asia and Africa before it was introduced to Europeans in the early 1700s.8

Why did the long-standing practice of inoculation gain currency among Europeans (particularly the British) at that point in history? Two smallpox deaths within the royal houses of Europe in the early 1700s impacted succession plans and led Britain to formally study inoculation as a method for combating smallpox.9 This study was followed by the use of inoculation in England, including by the children of the royal family.10

Around the same time, in the American colonies, smallpox was an epidemic rather than an endemic disease, which meant that when the disease appeared, the impact was dramatic.11 Death rates among Native Americans who were first exposed to the disease after European colonization of the Americas were extremely high, potentially over 50%.12 There were outbreaks in North America in the late 1750s and early 1760s during the Seven Years’ War and Pontiac’s War, which included discussion by the British of deliberately spreading smallpox to Native Americans.13 In the face of epidemics and increased chance of natural infection, inoculation became more compelling as a method of disease control, despite its risks.

American Theologian, Jonathan Edwards

This was the case for the great theologian and minister Jonathan Edwards. Living in the midst of epidemic smallpox in 1752, Edwards encouraged his son-in-law, Aaron Burr, Sr. (father of the future vice president) to get a smallpox inoculation on a trip to Great Britain.14 When Edwards had an opportunity to be inoculated himself in 1758, he opted to do so, and unfortunately died from the acquired smallpox.

Smallpox Inoculation in America

Notwithstanding its inherent dangers, in the pre-vaccination era inoculation was a calculated risk that some saw as worth taking and others saw as problematic. This can be seen in the choices of prominent Revolutionary-era Americans. Abigail Adams (whose husband, John Adams, had been inoculated several years prior) was inoculated in July 1776.15 While in the midst of an inoculation period which was known to be contagious, rather than secluding herself, she joined the crowd listening to a public reading of the newly completed Declaration of Independence. It was not unusual for the inoculated to expose others to smallpox, and many objected to inoculation out of justified fear that it would spread the disease.

Inoculation was not widely practiced in colonial America because of legal restrictions (due to concerns about its dangers), as well as the time and expense required (three to four weeks of potential illness, at a cost of hundreds of dollars in today’s money for the inoculation itself).16 Combined with the fact that smallpox was a disease with epidemic outbreaks in colonial America, this meant that George Washington’s colonial troops were particularly susceptible to smallpox.

George Washington had personal familiarity with smallpox; when he was 19, he had contracted it after reluctantly accepting a dinner invitation from a family where smallpox was present, and survived the disease with lifelong smallpox scars on his nose.17 As the head of the American revolutionary forces, he feared that the British would use smallpox to their advantage. British troops were more likely than the Americans to have already had smallpox, leaving the Americans more vulnerable to the disease: his troops might acquire smallpox from the outbreaks that were occurring in the colonies, or the British might even use smallpox as a weapon of biological warfare, intentionally spreading it to the Americans.18 Despite his concerns that inoculation would temporarily incapacitate his forces or spread smallpox, he calculated that the potential gains were greater than the risks, and ordered the inoculation of the American troops—thus, the first mass immunization effort in American history was spearheaded by one of the nation’s founding fathers.19

The Emergence of Vaccination

The safety of immunization significantly improved with the development of vaccination. Edward Jenner, a physician in England, published in 1798 the results of experiments he had done with cowpox, showing that inoculation with cowpox (vaccinia) conferred protection from smallpox at significantly lower risk than inoculation with smallpox (variola).20 Vaccination for smallpox did not become widely available, or systematically used, until the 20th century.21 Smallpox was the first, and remains the only, human disease to have been eradicated through vaccination.

Edward Jenner administers first vaccine

Decision-making around approaches to combating disease has always been complex. In the case of smallpox, we may be tempted to assume that the choices were easier or more straightforward than they actually were.

We see smallpox as a vanquished foe; this perspective was unimaginable in the past, but beginning to see eradication as a possibility was a precondition for the eventual conquering of the disease through concerted effort. Efforts are currently underway to eliminate or control other diseases, such as polio, and vaccination is a key part of those efforts. Polio, which is highly contagious and can cause lifelong paralysis (as it did to Franklin D. Roosevelt), has been eliminated from some parts of the world but has not been entirely eradicated, so ongoing vaccination is vital for maintaining disease control.

We see smallpox as uniquely deadly; this viewpoint had long been widely accepted, but it did not lead to immediate or widespread adoption of effective preventive measures. Simple methods such as isolation didn’t keep Abigail Adams from a historic public gathering. The safe and effective method of vaccination, for smallpox and a myriad of other diseases, was not broadly adopted until the 20th century, during a period in which public faith in institutions, including science and medicine, was at a high. We are in a different historical moment now, but that faith in those institutions helped make the misery and death caused by many infectious diseases an almost-forgotten memory in some parts of the world. And perhaps we can agree on questioning George Washington’s decision to go to a dinner party where smallpox was on the menu.


About the author

Heather John

Heather John is a physician with a psychiatry practice focusing on psychotherapy in the Philadelphia area. She trained as a historian of science and medicine, with doctoral work on the history of medical education, and research interests in the histories of disease, religion, and activism, as seen in the controversy over the use of the term “leprosy.” She has degrees from Stanford, the University of Texas Health Sciences Center at San Antonio, and Yale, with clinical training at Northwestern and UCLA.