This summer has seen a surge in discussion over biosafety. Should
we still be storing smallpox? Is the risk of bioterrorism greater now in the
post-genomic era? Should we artificially increase virulence in the lab to prepare
for the possibility that such strains could arise in nature?
On Tuesday the Penn Science Policy Group discussed the issue of biosafety as it relates to potential uses of biological weapons and risks of
accidental release of pathogens from research labs.
The idea of using biological weapons existed long before
cells, viruses, and bacteria were discovered. Around 1500 B.C.E., the Hittites
in Asia Minor deliberately sent diseased victims into enemy lands. In 1972, an international treaty known as the
Biological Weapons Convention officially banned the possession and development
of biological weapons, but that has not ended bioterrorist attacks. In 1984, a
cult in Oregon tried to rig a local election by poisoning voters with salmonella.
Anthrax has been released multiple times: in Tokyo in 1993 by a religious group,
and in 2001 via letters mailed to US congressional members.
And recently, an Italian police investigation alleged the existence of a major
criminal organization run by scientists, veterinarians, and government
officials that attempted to spread avian influenza to create a market for a
vaccine they illegally produced and sold.
Possibilities for bioterrorism are now being compounded by advances
in biological knowledge and the ease of digital information sharing. This raises the question: should
we regulate dual-use research, defined as research that could be used for
beneficial or malicious ends? Only in the last few years have funding agencies officially screened proposals for potential dual-use research. After two
research groups reported studies in 2012 that enhanced the transmissibility of
H5N1 viruses, the US Department of Health and Human Services created a policy
for screening dual-use research proposals. These proposals have special
requirements, including that researchers submit manuscripts for review prior to
publication.
We debated whether censorship of publication is the
appropriate measure for dual-use research. Some wondered how the
inability to publish research findings would affect researchers’ careers. Others
proposed that restrictions on dual-use research be settled well in advance of
publication to avoid wasting time and resources. For instance,
scientists wishing to pursue research deemed too dangerous to publish could be
asked to consent to censorship before the study is funded.
In addition to concerns about bioterrorism, public warnings have
been issued over the accidental escape of pathogens from research labs, fueled by incidents this past summer involving smallpox, anthrax, and influenza.
Some caution that the risks of laboratory escape outweigh the benefits
gained from the research. Scientists
Marc Lipsitch and Alison Galvani calculated that ten US labs working with
dangerous pathogens for ten years run a 20% chance of a laboratory-acquired
infection, a situation that could seed an outbreak. On July 14, a group of academics called the Cambridge Working Group
released a consensus statement that “experiments involving the creation of
potential pandemic pathogens should be curtailed until there has been a
quantitative, objective and credible assessment of the risks, potential
benefits, and opportunities for risk mitigation, as well as comparison against
safer experimental approaches.”
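The 20% figure above comes from compounding a small per-lab-year risk across many lab-years. As a rough sketch of that arithmetic (the 0.2% per-lab-year infection probability used here is an illustrative assumption, not a number quoted in the discussion):

```python
# Cumulative probability of at least one laboratory-acquired infection
# across n independent lab-years: P = 1 - (1 - p)^n.

def cumulative_risk(p_per_lab_year: float, labs: int, years: int) -> float:
    """Probability of at least one infection over labs * years lab-years."""
    n = labs * years
    return 1 - (1 - p_per_lab_year) ** n

# Illustrative per-lab-year probability (assumed for this sketch):
p = 0.002  # 0.2% chance of a lab-acquired infection per lab per year

risk = cumulative_risk(p, labs=10, years=10)
print(f"{risk:.1%}")  # about 18%, in line with the ~20% figure cited above
```

The point of the exercise is that even a very small annual probability becomes substantial once it is multiplied across many labs and many years.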
In defense of the research, the group Scientists for Science contends that “biomedical research on potentially dangerous
pathogens can be performed safely and is essential for a comprehensive
understanding of microbial disease pathogenesis, prevention and treatment.”
Our discussion of this research reflected the same divide. Some pointed out that science is inherently unpredictable, so estimating
the possible benefits of a given study is difficult; on this view, the best way to
learn about highly pathogenic viruses is to study them directly. Another person
mentioned they heard the argument that studying viruses in ferrets (as was
done with the controversial influenza experiments in 2012) is safe because those
virus strains don’t infect humans. However, Nicholas Evans, a bioethicist at Penn
and a member of the Cambridge Working Group, said he argued in a recent paper
that claiming ferret research is safer because the strains don’t infect
humans also implies that the work has limited scientific merit, since it
is not relevant to human disease.
There seems to be agreement that research on potentially
dangerous and dual-use agents should be looked at more closely than it has
been. The debate really centers on how much oversight and what restrictions are
placed on research and publications. With both Scientists for Science and the
Cambridge Working Group accruing signatures on their statements, it is clear
that the middle ground has yet to be found.