As we finally exit the long slog that was 2020 and enter the new year, the topic on everyone’s mind right now is vaccination. With the recent FDA emergency authorization of both the Moderna and Pfizer/BioNTech vaccines, many of our long-standing questions and concerns about vaccination have come to the forefront of public consciousness, especially since these new vaccines pioneer a novel method of vaccination. Vaccination itself may seem like a relatively recent endeavor, a product of 20th-century science and public health initiatives. And while the first laboratory-concocted vaccine wasn’t created until 1879, the legacy of vaccines and inoculation against deadly pathogens stretches back to as early as the year 1000.
Historians believe that the practice of inoculation against infectious diseases dates all the way back to China in the year 1000. A technique known as variolation, which involved rubbing weakened viral material into an open wound, was used by healers in China, India, Africa, and the Ottoman Empire to treat cases of smallpox. The healers would collect samples of infectious material from relatively mild cases of the disease, dry the samples out, and carry them around in their pockets for a couple of weeks before using the material for variolation. The result was a relatively weak form of the smallpox virus that would infect the patient. It would typically cause a few days of mild sickness, but patients would rarely die. And following the initial infection, they would be permanently immune to smallpox.
Variolation remained a technique primarily practiced in the East until around the 1700s. In the late 1710s, Lady Mary Wortley Montagu was living in Constantinople with her husband, a British diplomat, and their children. Shortly before the move to Turkey, she had recovered from a severe case of smallpox that left her disfigured. When she heard about the practice of variolation from a local practitioner, she leapt at the chance to have her children inoculated against the deadly disease. Montagu’s experience spurred interest in smallpox inoculation back in Britain. Shortly after, James Jurin, Secretary of Britain’s Royal Society, conducted an expansive survey of the efficacy of variolation. Perrot Williams and Richard Wright, doctors practicing in Wales at the time, soon reported that variolation had been in widespread use among Welsh commoners as far back as the early 1600s. Similar practices, known as “buying the pocks,” were found in Scotland and across mainland Europe.
In the following century, variolation gained popularity across Europe and into the American colonies. It was not without its risks, though, with a mortality rate around 2 percent, which was still a marked improvement on the average 14 percent mortality associated with the disease itself. To put that in perspective, the mortality of smallpox variolation was about on par with the current average mortality of Covid-19. Modern vaccines, by contrast, are far safer: severe adverse reactions occur in well under 0.1 percent of recipients, typically among elderly and immunocompromised patients. Despite the risks, variolation was still the most effective protection against smallpox for many years, and it was widely used. Eventually, a safer, more effective form of smallpox inoculation was developed by physician Edward Jenner, using the brand-new technique known as vaccination.
In 1796, Edward Jenner, who was himself inoculated against smallpox as a child, made the observation that several dairymaids who had been infected with cowpox never contracted smallpox. Cowpox is much milder than smallpox, so many of these dairymaids were saved from the disfigurement and mortality associated with the smallpox infection. Seeing this apparent immunity, Jenner hypothesized that the two diseases must be related in some way such that exposure to one provides inoculation against the other.
To test his hypothesis, Jenner recruited an 8-year-old boy to serve as his test subject (certainly not up to modern child-protection standards, but it was hardly a contentious issue at the time). He used infectious cowpox material acquired from a dairymaid to inoculate the boy, who subsequently developed a mild sickness. A couple of months later, Jenner exposed the boy to material from a fresh smallpox lesion. The boy experienced no symptoms of smallpox, indicating that the cowpox inoculation had successfully provided immunity. Jenner ran several subsequent tests on the efficacy of cowpox using other young test subjects, including his 11-month-old son. The results supported his original hypothesis, and he published a paper on the study in 1798. In this paper, he coined the term “vaccine” from the Latin root vacca, meaning cow.
While there was initially pushback against this new technique of inoculation, it eventually gained popularity, replacing variolation completely by the mid-1800s. Through vaccination and strategic surveillance, smallpox was declared eradicated worldwide in 1980. Jenner’s discoveries are considered foundational to the modern understanding of immunology, the study of how the body fights disease. Jenner’s initial smallpox vaccine also introduced the concept of viral attenuation. Attenuation is the weakening of a pathogen’s virulence so that it is viable enough to produce immunity, but not viable enough to cause major harm or transmission. Jenner’s smallpox vaccine was attenuated in the sense that it contained a milder, animal-borne relative of the virus. Later, a live attenuated rabies vaccine was created by cultivating the virus in an incompatible host (chicken embryos or mice, in this case). This method was inefficient and could not be done under sterile conditions. But in the 1940s and 50s, the introduction of in vitro cell cultures made efficient laboratory virus attenuation possible. The viruses in the MMR vaccine were developed this way: each was isolated from an infected patient and attenuated through successive rounds of cell culturing.
Vaccination with live attenuated viruses still has its downsides, though: a greater risk of causing infection symptoms and a difficult, lengthy development process. Another strategy for vaccination is to use pathogens that have been inactivated (essentially “dead”). This strategy had a lot of early success for bacterial diseases like cholera and typhoid, but it proved harder to develop effective inactivated vaccines for viral diseases. One of the first widely successful inactivated viral vaccines was the inactivated polio vaccine (IPV) developed by Jonas Salk. Salk grew the poliovirus in monkey kidney cells and inactivated it with formalin. While this vaccine saved countless lives, it came at a large cost: nearly 1,500 monkeys had to be sacrificed for every one million doses. Later on, enhanced cell-culturing techniques made it possible to manufacture the vaccine in vitro in immortal human cell lines.
In recent years, vaccination research has turned to the study of subunit vaccines—vaccines containing a specific protein, saccharide, or genetic component of a virus. These subunits engender a relatively mild immune response, so they often require boosters and chemical adjuvants, which stimulate the immune system. These vaccines can still create lasting immunity, and they can often be cheaper and easier to produce than whole inactivated vaccines.
Next week, we’ll dive deeper into the world of immunity and explore how it guards you against infection. Check out my earlier post on vaccines from last spring and last month’s series on influential women in science! Comment on this post or email me at contact@anyonecanscience.com to let me know what you think about this week’s blog post, and tell me what sorts of topics you want me to cover in the future. And subscribe below for weekly science posts sent straight to your email!