Some of the hardest questions in the process of scientific discovery aren't about science, but philosophy.
A good illustration of this truism is the unanimous recommendation by the National Science Advisory Board for Biosecurity (NSABB) that two leading journals not publish certain details about experiments with a version of the H5N1 virus, also known as "bird flu."
The board's concern is that the information about the experiments, which involved genetic mutations that made the virus much more virulent than the versions seen in nature, could lead to a bioterror weapon. But the reaction by the editors of the journals Nature and Science to the proposed de facto censorship of research results was, as a Washington Post report described it, "chilly." One expert not on the NSABB was quoted as calling the recommendation "ridiculous" because the risks the results present to humans are remote.
The board's recommendation doesn't go directly to the journal publishers, the scientists, or their institutions (one is the University of Wisconsin; the other is in the Netherlands), but to the Department of Health and Human Services (DHHS), which in turn can only urge the journals to withhold information. One sticking point that gives the U.S. government leverage is that the research at both institutions was apparently done using federal funds. But the U.S. surely doesn't want to look like it's trying to keep the data to itself.
Beyond these short-term calculations, the incident reveals a deep philosophical divide about biological research that could threaten national security and public health, one that I have observed for years as an unpaid advisor to several government agencies, including the NSABB (though not in connection with the current studies): How risk-averse should the life sciences community be in an era of asymmetric warfare? In the twentieth century nation-states found biological weapons to be pretty useless and unmanageable, but non-state actors and rogue states might still find them of interest.
Unlike physics, whose modern discipline grew up during World War II in an atmosphere of the deepest possible relevance to national security, the culture of the life sciences is not so interwoven with security concerns. An interesting exception was scientists' self-imposed mid-1970s moratorium on recombinant DNA research, but that didn't last very long, and it was driven by environmental risks rather than terrorism fears. Obviously that has changed somewhat since the October 2001 anthrax attacks, but international treaties have successfully prohibited the novel development of biological weapons (BW) since the 1970s. The most extensive effort to develop innovative BW was the secret and illegal Soviet program that continued right up to the end of the Cold War. Further complicating the picture is the fact that quite a bit of funding for biology has resulted from post-9/11 worries, particularly in the form of secured laboratories for research on potentially dangerous pathogens.
Still, the default position of biologists is usually that more publicity is protective rather than threatening, and that in the long run secrecy works against security rather than for it. Published results, it is generally thought, will help the scientific community and responsible authorities prepare, well in advance of any actual attack, for threats that might be posed by new knowledge.
Aiming at a compromise, before accepting the NSABB recommendations the two journals have asked the government for detailed plans that would enable "responsible scientists" to have access to the experiments' details. How such a program would operate, especially in the hotbed of shared information created by the Internet, is not at all clear. What is clear is that the biological science community is very far from adopting anything like a precautionary principle that would put the burden of proof on those who claim a new source of knowledge poses no risk.