The origins of the SARS-CoV-2 virus matter a great deal because the two conjectured possibilities call for widely different responses.
If the virus came from nature, virologists can carry on bringing wild viruses back into their laboratories and continue to manipulate them in the hope of preparing for new epidemics. China can assert the Covid pandemic was a natural phenomenon for which its government bears no responsibility. The national media can say that it was right all along to dismiss the lab leak as a conspiracy theory and maintain that no self-scrutiny is needed.
If the virus leaked from a laboratory, on the other hand, the Chinese authorities should be held accountable for the pain they have inflicted on the world. Enhancing a virus’s properties—so-called gain-of-function research—should probably be halted immediately until a functioning regulatory system has been devised, different from the one now in place. Journalists and editors would doubtless wish to ask how they let the wool be pulled over their eyes for so long and so effectively.
The evidence that the Covid virus escaped from a lab presently falls short of absolute proof. But it’s compelling enough to raise the question of how hazardous new scientific techniques should be regulated. The gold standard for handling such research risks is the Asilomar conference of 1975, convened to consider the new ability to transfer genes from one species to another via recombinant DNA. The leading scientists who organized the meeting wanted the possible hazards to be publicly discussed, along with rules under which the novel research might safely proceed. They then decided, over the protests of many researchers, to impose stiff initial safety standards, with the idea that these strictures could be relaxed if the hazards proved less serious than feared. And this is what happened. By regulating themselves, the molecular biologists gained public trust and made outside regulation unnecessary.
Gain of function—the enhancement of a virus’s natural ability to infect people or cause disease—is another novel technique of obvious possible hazard. Anthony Fauci, former director of the National Institute of Allergy and Infectious Diseases, and Francis Collins, former director of the National Institutes of Health, have long been proponents of such research. But no Asilomar-type public discussion has guided its regulation.
Instead, control of the technique was kept inside the NIH, though since 2017 a committee in the office of the Secretary of Health and Human Services has been required to review projects. However, this committee—known as the P3CO committee, for Potential Pandemic Pathogen Care and Oversight—can review only those projects that the NIH identifies and submits to it. Most of the projects that should have been submitted, in the view of Richard Ebright, a long-time critic of gain-of-function research, were not flagged by the NIH, including the manipulation of SARS- and MERS-related coronaviruses by the Wuhan Institute of Virology. Indeed, the P3CO committee has reviewed only three projects in the last five years. In addition, Ebright notes, “the HHS P3CO Committee has operated with complete non-transparency and complete unaccountability. The names and agency affiliations of its members have not been disclosed, its proceedings have not been disclosed, and even its decisions have not been disclosed.”
Could better regulation have helped avoid the Covid pandemic? The research that the NIH funded in Wuhan does not seem to have received adequate scrutiny. Nor is it clear how Fauci’s office, based in Bethesda, could properly supervise the safety of the hazardous research his agency funded in Wuhan, especially when the funding was channeled through an intermediary, the EcoHealth Alliance of New York.
An obvious question is why the NIH, before letting any of its grantees initiate gain-of-function research, didn’t hold an Asilomar-style conference to get the best possible scientific input into how the research should be conducted.
The Cambridge Working Group, composed of biologists critical of gain-of-function experiments, warned of the dangers of enhancing viruses in 2014, saying that laboratory creation of new viruses “could trigger outbreaks that would be difficult or impossible to control.” The group recommended an Asilomar process to assess the risk and assure the highest level of safety. According to Ebright, a member of the group, Fauci and Collins said it was a great idea but that it would have to be organized through NIH to be effective. The Cambridge Working Group accepted the plan, but then nothing happened.
“In retrospect it seems clear that Fauci and Collins, from the start, wanted only to sideline the Cambridge Working Group and never intended to move forward with the process,” Ebright says. He now believes that virologists’ intense opposition to extra safety rules makes any form of self-regulation unworkable. A policy of “unenforceable frameworks and sham simulacra of self-regulation” has not worked, in his view, and should be replaced with reviewers independent of the NIH and backed by force of law.
Virologists are well aware of the threat of new regulation bearing down on their field. A recent article in the Journal of Virology, with 156 signatories, poured cold water over the lab-leak hypothesis and its “paucity of evidence,” complained about the “ill-informed condemnation of virology” that has resulted, and sang the praises of gain-of-function research, which has been “an extremely valuable tool in the development of vaccines and antivirals.” The article then listed the extensive regulations to which virologists are already subject and urged that no more be added. “Regulations that are redundant with current practice or overly cumbersome will lead to unwarranted constraints on pandemic preparation and response and could leave humanity more vulnerable to future disease outbreaks.”
Regulation is indeed costly and inefficient, and it’s much better for groups to regulate themselves than to struggle under the heavy hand of outside overseers. The problem is that virologists have missed their chance. However bothersome the regulations already in place, they failed to prevent the NIH grantees at the Wuhan Institute of Virology from manipulating SARS-related coronaviruses in BSL-2 safety conditions, which should have been against the rules but was not. The evidence for a lab leak is not sparse, as the 156 virologists assert, but pretty substantial. The benefits of gain-of-function research are disputable; critics say they have so far been zero, with limitless attendant risk. Ebright is surely correct: the time has passed for virologists to be allowed to regulate themselves.
Other scientific communities are better led. One can hear active public discussions about practical uses of the CRISPR technique of gene editing, and of gene drives, another fraught technology. These scientific groups are acting openly and ethically; outsiders need not intervene. Gain-of-function research, however, got off on the wrong foot. A better consensus must be developed about the terms on which it can proceed.