Walking the Tightrope Between Risk and Reward in Studying Deadly Viruses
The lifting of a three-year ban on gain-of-function research revives the critical goal of averting disaster — but may also increase the risk of accidental or deliberate pandemics.
Whether we know it or not, the virus that could cause the next major deadly disease epidemic might be circulating in nature right now.
It might be spreading among birds, or monkeys, or pigs — and all it may take to infect humans is a small random mutation in one of its genes.
If we want to prevent such a threat, or even respond quickly when it emerges, there’s one way to do it: Study potentially dangerous viruses in secure research labs and make genetic tweaks to determine what changes might pose a risk.
Besides teaching important lessons about such viruses and other microbes, this could boost readiness to respond to an outbreak and to develop new treatments or vaccines without having to start from scratch when a disease is already spreading.
But then again, those tweaks could prompt a human-caused pandemic if some of the altered microbes escape the lab or are obtained or grown by terrorists or rogue governments.
A renewed effort
For the past three years, research on viruses with pandemic potential, such as avian influenza, has all but stopped as scientists, ethicists and public health experts examined both sides of the coin.
This week, however, the green light went back on — but with a “caution” sign and specific requirements put in place via a National Institutes of Health notice to researchers.
University of Michigan virus scientist Michael Imperiale, Ph.D., has been in the thick of the debate since it began more than five years ago. At that time, European and U.S. scientists showed that they could slightly change the virus that causes bird flu, allowing it to gain the ability to spread between mammals through the air.
The ensuing debate over this kind of research, called “gain of function,” led to a ban on NIH funding for it in 2014.
At the time, Imperiale served on the National Science Advisory Board for Biosecurity, which advises the NIH’s policy officials on issues related to the security of biological materials. More recently, he advised the National Academy of Sciences about options for managing the information that such research might produce.
He welcomes the new NIH policy on funding for work on what the agency calls “enhanced potential pandemic pathogens” — just as he welcomed the NSABB report that recommended much of what NIH has decided to do.
“It’s good to lift the moratorium on gain-of-function research,” explains Imperiale, who is the Arthur F. Thurnau Professor of Microbiology and Immunology in the U-M Medical School as well as U-M’s associate vice president for research policy and compliance. “I’ve been really worried that something will pop up in nature, and we’ll be behind the eight ball.”
When he announced the new policy this month, NIH director Francis Collins noted that some gain-of-function research projects had in fact been allowed to proceed after careful review during the last three years.
That kind of review for proposed research seeking NIH funding now has the force of policy. Imperiale notes that this meshes with what the NSABB recommended last year — and reflects a position that he and others have long supported.
“The bottom line is: If it’s an important scientific question, let’s figure out a way to do the work safely,” Imperiale says. “We have to look at each project case by case.”
The grant application process is the perfect time to weigh the scientific and public health justification for the work against the potential risks if the altered microbe leaves the lab, he says. That way, researchers seeking to do the work can make their case and the reviewers can raise concerns or set limits.
Boosting security measures
But on the other end of the scientific process, after the research has produced results, another thorny issue arises: the release and distribution of such sensitive information.
Imperiale and Stanford University professor David Relman wrote a paper for the National Academy of Sciences earlier this year about what options might exist. They looked not just at gain-of-function research but at any scientific work that produces results that could be put to both good and nefarious uses.
Their work was prompted in part by a debate over whether and how the bird flu research results could safely be shared with scientists without tipping off potential terrorists or those who might seek to weaponize the altered virus.
In the end, the NAS consensus report on such “dual use” research concluded, based in part on Imperiale’s input, that the federal government’s ability to control the dissemination of information from such research is limited — and nonexistent when it comes to research not funded by federal agencies.
Instead, it called for a review process similar to what the NIH has just announced: to examine the potential course of action before research is funded. It also called for more international policymaking, and more education of researchers about biosafety and biosecurity when working with infectious organisms.
Striking a balance
In the end, Imperiale says, the new NIH rule will allow researchers at well-equipped facilities with appropriate safety infrastructure to move forward.
As for the potential use by those with evil intent? He notes that it requires a high level of resources to do this kind of work without endangering the researchers themselves. But vigilance against such threats will always be important — which reinforces the need to study these organisms through established research institutions.
“This has been a good example of the effort to balance scientific and societal needs, but it dragged out for a long time,” says Imperiale, who worries that young scientists are likely deterred from pursuing this research due to the ongoing debate. “We need to be more nimble as a scientific policy community.”