Making a superbug that can infect thousands of people is easier than ever. Is there anything governments can do to prevent terrorists from learning how to make a devastating bioweapon?
When flu scientist Ron Fouchier of Erasmus University in Rotterdam announced in September that he had made a highly contagious, supervirulent form of the bird-flu virus, a long chain of political events unfolded, mostly out of the public eye. Fouchier told European virologists at a meeting in Malta that he had created a form of the H5N1 avian flu — which is naturally extremely dangerous to both birds and mammals, but only contagious via birds — that was both 60 percent fatal to infected animals and readily transmitted through the air between ferrets, which are used as experimental stand-ins for human beings. The University of Wisconsin’s Yoshihiro Kawaoka, one of the world’s top influenza experts, then announced hours later that his lab had achieved a similar feat. Given that in some settings H5N1 has killed more than 80 percent of the people that it has infected, presumably as a result of their contact with an ailing bird, Fouchier’s announcement set the scientific community and governments worldwide into conniption fits, with visions of pandemics dancing in their heads.
Within government circles around the world, the announcement has highlighted a dilemma: How do you balance the universal mandate for scientific openness against the fear that terrorists or rogue states might follow the researchers’ work — using it as a catastrophic cookbook for global influenza contagion? Concern reached such heights that U.S. Secretary of State Hillary Clinton made a surprise visit to Geneva on Dec. 7, addressing the review summit on biological weapons. No American official of her stature had attended the bioweapons summits in decades, and Clinton’s presence stunned observers.
Clinton told the Palais des Nations audience that the threat of biological weapons could no longer be ignored because "there are warning signs," including "evidence in Afghanistan that … al Qaeda in the Arabian Peninsula made a call to arms for — and I quote — ‘brothers with degrees in microbiology or chemistry to develop a weapon of mass destruction.’" (Al Qaeda in the Arabian Peninsula is the terrorist group’s Yemeni-based affiliate and perhaps its most aggressive arm today, with connections to a number of ambitious plots.)
Then, in what has widely been interpreted as an allusion to the superflu experiments, Clinton added, "The nature of the problem is evolving. The advances in science and technology make it possible to both prevent and cure more diseases, but also easier for states and nonstate actors to develop biological weapons. A crude, but effective, terrorist weapon can be made by using a small sample of any number of widely available pathogens, inexpensive equipment, and college-level chemistry and biology. Even as it becomes easier to develop these weapons, it remains extremely difficult … to detect them, because almost any biological research can serve dual purposes. The same equipment and technical knowledge used for legitimate research to save lives can also be used to manufacture deadly diseases."
By the end of 2011, few governments or scientific committees were satisfied with the actions that had been taken to date to limit publication of the methods Fouchier and Kawaoka deployed, and most were frankly frightened. The Fouchier episode laid bare the emptiness of biological-weapons prevention programs on the global, national, and local levels. Along with several older studies that are now garnering fresh attention, it has revealed that the political world is completely unprepared for the synthetic-biology revolution.
So far, the rules — weak and inconsistent as they may be — have never been broken. Neither the Dutch virologist who created the roughly 90 percent mammalian-transmissible form of the superkiller H5N1 bird flu, nor the researchers who published a botulism cookbook — and not even the scientists who re-created the horrible 1918 flu virus or the fellows who constructed a polio virus from scratch — broke any existing rules. In every case, the researchers consulted with approval committees, sent their papers off when asked for review to various government committees, and then published their work openly in major scientific journals.
The problem is that there are no consistent, internationally agreed-upon regulations governing synthetic biology, the extraordinarily popular and fruitful 21st-century field of genetic manipulation of microorganisms. The chief agreement governing bioweapons work is the Biological Weapons Convention (BWC), which was created in the early 1970s as a bilateral accord between U.S. President Richard Nixon and Soviet Premier Leonid Brezhnev. Entered into force in 1975, the BWC now has 165 states party to it. Clinton’s now-famous Dec. 7 speech in Switzerland was delivered to a BWC gathering. The institution’s current president is Paul van den IJssel of the Netherlands, Fouchier’s home country. Van den IJssel advocates "ambitious realism" in pursuit of policies that can make the BWC an effective instrument for control of dangerous science, terrorism, and biological weapons. His ambition is to modernize the BWC, giving it long-sought teeth for verifying weapons violations and monitoring compliance. Currently the BWC is toothless.
When the BWC was drafted in the early 1970s, biologists were just beginning to figure out so-called genetic engineering, moving genes from one bacterial species to another, typically using viruses as the vehicles on which the targeted gene hitchhiked from cell to cell. It was tedious work that was prone to contamination, and few political leaders had even a vague comprehension of what the scientists were up to. As a result, in its conception the BWC framed the bioweapons question in classic nation-state conflict terms. In many ways the original BWC bore more resemblance to nuclear weapons treaties than anything else, imagining stockpiles of vats full of dangerous microbes under the possession of national armies and "weaponized" to be hurled at enemy territories in some vague concept of biological warfare. The conceit was so crude and nightmarish that most political leaders and their intelligence advisors for decades dismissed the entire biowarfare notion as a ridiculous fantasy. The most common cry from skeptics was that no country would use biological weapons because they might kill more of their own people than the toll the microbes would take among the enemy. The microbes, it was thought, were uncontrollable and therefore unusable.
As I describe in detail in my new book, I Heard the Sirens Scream, the 9/11 attacks and 2001 anthrax mailings shook political establishments worldwide out of their complacency. The United States, in particular, has spent tens of billions of dollars over the last decade in anticipation of bioterrorism, buying vaccines, treatments, alleged detection devices, and protective gear for civilian and military first-responders; staging drills and war-games scenarios; and practicing mass-casualty care in hospitals all over the country. On the civilian side alone, 2010 spending topped $5 billion, most directed to the National Institutes of Health (NIH) and Centers for Disease Control and Prevention (CDC) for research on specific microbes.
At the behest of President George W. Bush’s administration, the CDC created a list of organisms and biotoxins considered possible weapons and encouraged a vast research-and-development effort. The number of biodefense centers, featuring high-security laboratories and stockpiles of the world’s deadliest microbes, mushroomed over the last decade from an easily named handful to hundreds around the world, many of them meeting Biosafety Level (BSL) 3 or 4 standards. (Most biology research is conducted in lower-security facilities, but many universities and governments now have BSL-3 setups.) The flu experiments at Erasmus and the University of Wisconsin were executed in such settings. Researchers wear basic protective gear, and the actual microbes are held behind a glass barrier in a specially vented negative-pressure space that sucks errant germs into a filter system. As added protection, the researchers breathe air that is pumped into their masks from a separate, safe source.
BSL-4 facilities are far more difficult to work in, more costly, and theoretically more secure. The scientists wear spacesuits and toil inside a facility that is itself nested inside at least one other secure layer. All air, food, water, and products are hygienically processed going into the lab and cleaned or destroyed rather than exiting the facility. Only the humans may freely leave the laboratory’s confines. Yet despite all the security and protections provided by BSL-3 and -4 facilities, leaks and accidents have happened.
Remarkably, influenza research of all kinds — including creation of superbugs — is classified as BSL-3, and the Erasmus and Wisconsin facilities did their work in basic vented labs located on campuses. Fouchier did not blithely wade into his flu experiments, as some news reports have claimed, but followed all rules governing biosecurity in the Netherlands. In 2008, the Royal Netherlands Academy of Arts and Sciences released its Code of Conduct for Biosecurity, stipulating what types of science, under what conditions, can be executed and published by Dutch researchers. Fouchier very strictly adhered to the Dutch code.
Because Fouchier and Kawaoka are funded by the U.S. NIH, their research also had to meet American biosecurity guidelines. And it did — at least, as those codes are currently conceived.
The rules governing such American research were largely created after the 2001 anthrax scare. Following the attacks, then-Secretary of Health and Human Services Tommy Thompson ordered creation of a cross-government committee to address the dual-use conundrum, finding a way to deter terrorist or other malicious use of scientific discoveries without impeding the pace of basic discovery and invention. The National Science Advisory Board on Biosecurity (NSABB) was the outcome, formally created in 2004. In its original charter, signed by Thompson, the NSABB was supposed to review all questionable research — every so-called dual-use study — before experiments were executed. The NSABB was supposed to recommend special precautions, including prohibiting some experiments, and referee decisions regarding ultimate publication of discoveries. In the post-anthrax political environment, Thompson wanted a very tough NSABB, even if it meant some scientists would believe their work was constricted or censored.
By the time the NSABB convened in late 2011 to review the Fouchier and Kawaoka cases, however, the board’s mandate had been pared down considerably. In a new charter signed by Secretary of Health and Human Services Kathleen Sebelius in 2010, the board functioned in a strictly advisory role, offering no review of experiments themselves. Its primary clout was over publication of the results once the experiments were performed. The scientists who served on the NSABB were themselves opposed to any pre-experimental regulation and had only modest faith in the powers of publication restriction. In 2007, the NSABB advised weakening its own authority, arguing that "a code of conduct can make good people better, but probably has negligible impact on intentionally malicious behavior."
Britain’s Research Councils advised a similar policy in 2007, admonishing the government of then-Prime Minister Gordon Brown that, "systems should be based on self governance within the academic community." Similar advisories flowed from scientific expert bodies to governments across Europe, Japan, India, China, South Korea, and several Latin American countries. It seemed scientists wanted no additional oversight over dual-use research and no limits on publication of their discoveries.
"The rules governing the publication of research results follow from the rules for the performance of research," states the Dutch code. "Here too, publication is the rule and non-publication the rare exception."
Following Fouchier’s dramatic September speech in Malta, both he and Kawaoka submitted their studies for publication to the American journal Science and Britain’s Nature. The NSABB intervened, asking the journals to refrain from publishing pending the board’s review. Shortly before Christmas, the NSABB advised that publication of the papers was OK so long as the actual methods used to create the superbugs were excised or so obscured as to be useless guidance for would-be terrorists. That put the entire burden of ethics and global dual-use biosecurity on the shoulders of the editors of these journals. Government punted, instructing publishers to please use their heads.
Bruce Alberts, the current editor of Science, faced similar instruction from the U.S. government in 2005, when Lawrence Wein and Yifan Liu, both then of Stanford University, submitted a paper titled "Analyzing a bioterror attack on the food supply: The case of botulinum toxin in milk." The authors carefully analyzed the expected human kill rates produced by inserting botulinum toxin at various stages of milk production in the United States, from the dairy farm all the way to supermarket shelves lined with cartons. "We have a reasonably accurate estimate of the number of people who could be poisoned," the authors wrote — as many as 568,000 victims, with death rates unknown but undoubtedly frighteningly high.
Bush administration officials were appalled and pleaded with the editor of the Proceedings of the National Academy of Sciences — Alberts, at the time — to decline the paper. As Bush security experts scrambled to find a legal way to force classification of the paper, Alberts noted that the then-new NSABB was not yet ready to offer advice. He was on his own. Alberts opted to publish, concluding, "If the types of calculations and analyses in the Wein and Liu article are carried out only by government contractors in secrecy, not only are the many actors in the U.S. system who need to be alerted unlikely to be well informed, but also the federal government itself may become misled."
The Fouchier and Kawaoka papers have yet to be published. While Alberts and his Nature counterpart mull their options, policymakers ought to consider what a bizarre predicament we are in. Why should such weighty decisions rest on the shoulders of editors? Every time serious dual-use conundrums have reached government, political leaders have demurred and ultimate decisions have similarly fallen to publishers. In every known case, publishers have predictably opted to publish. This happened in 2001 when Australian scientists accidentally made a 100 percent lethal form of mousepox, the rodent equivalent of smallpox. It also happened when an American team used that same method to make superdeadly cowpox and other pox viruses. Similarly, publication was the choice for a lab-modified version of the 1918 flu virus, ultralethal forms of SARS, a man-made polio (published with a detailed how-to section), and dozens more potentially dual-use discoveries.
In their defense, the relevant scientists and editors argued that there was no evidence that evildoers made use of any of this information. In response to this view, Stuart Nightingale, a biosecurity consultant to the U.S. Department of Health and Human Services, recently wrote in the Journal of the American Medical Association, "this does not mean, however, that such articles have not been or will not be used to do so. Well-organized, valid information with the imprimatur of respected peer-reviewed journals could be especially valued by a malevolent actor over any information that might be available on the Internet."
Outside of police states, though, censorship is impossible to enforce and ultimately useless within scientific circles. No professional group is as cybersavvy as scientists, save the computer-industry coders themselves. Indeed, the Internet was originally created decades ago to encourage the exchange of information among scientists. Most researchers have tight collegial relationships with their peers, among whom discoveries are shared almost instantly. Methods, samples, reagents, and the basic intellectual tools of science are freely exchanged, and scientists who opt out of this fluid process are shunned, even condemned, by their peers. This is true at all tiers of the scientific process, from the senior-investigator level all the way down to undergraduates toiling inside campus laboratories for school credit. Electronic information leaks, gets hacked, or "disappears" all the time. It is profound folly to imagine that global biosecurity can be attained through censorship. Even the NSABB decision to allow publication with methods omitted misses the point: Most of us (I include myself) already know how, in broad terms, Fouchier made his supervirus, and dozens of leading scientists all over the world know the work in sufficient detail to replicate it.
Still, recognizing the limitations of current codes and the BWC, some members of the European Union now advocate policing of science. A movement is afoot to allow police authorities to examine lab notebooks and scour laboratories across the continent on a routine, proactive basis. In a controversial editorial in the December edition of the European Molecular Biology Organization’s journal, editor Howy Jacobs argued, "Some might argue that the state has no place in an academic laboratory, but I believe the threat is real enough that this blanket appeal for trust and virtue is insufficient as a response.… No security system can be perfect. But democratic societies and responsible scientists need to be vigilant and proactive."
Jacobs’s plea is not likely to find many adherents among biologists, who as a group strongly believe in sharing information. The social norm of sharing is at its most extreme among self-described "life hackers" and "DIY (do-it-yourself) synthetic biologists." These biologists believe that, in the Internet era, science ought to be a vast collective enterprise for the good of humanity, wherein thousands of researchers toiling inside home pseudo-labs, colleges, or enormous professional facilities work together to solve pressing problems. They are trying to turn algae into genetically modified solar collectors, use viruses as switch signals in tiny biocomputers, make vital food crops drought- or pest-resistant, create living art from genetically modified assemblages of organisms, and cure diseases by growing genetically altered cell colonies that can be surgically implanted or injected into ailing people. Some adherents of the DIY biology movement insist that their collective amateur laboratories are akin to the garage days of Apple and the other personal-computing pioneers of Northern California. From a scientific viewpoint, it would be hard to name any time in the history of biology as exciting as this.
Even in traditional pharmaceutical, biotechnology, and academic environs, the synthetic-biology movement, coupled with extraordinary advances in genetic sequencing, has upped the ante on both what is possible and what constitutes "dual-use" potential. A decade ago, sequencing the human genetic blueprint was a monumental feat costing millions of dollars and engaging hundreds of labs around the world. Today an individual’s genetic blueprint can be fully sequenced in a couple of days for a few thousand dollars; biotech company Illumina advertises the service at $4,000. New technology coming out of the pipeline will bring that time and cost down more than 90 percent this year. Sequencing the far smaller genomes of microbes is now so cheap and easy that deciphering the deadly details of plague or AIDS can be performed by, as Clinton phrased it, anybody with "college-level chemistry and biology." A perfectly functional DIY synthetic-biology lab, complete with gene sequencer, costs about $25,000 today; it will soon go for $5,000.
Industry is moving full-bore on synthetic biology as well. Maverick biologist J. Craig Venter, famous for setting up a private company that raced the NIH to be the first to sequence the human genome more than a decade ago, violated no rules in 2010 when his private research institute published detailed how-to steps for inserting the genome of one species of bacteria into a different species, creating the ultimate Trojan horse that could sneak by human immune system defenses to deliver a lethal cargo. The experiments were so bold and dramatic that in 2010 the U.S. Presidential Commission for the Study of Bioethical Issues was tasked with finding guidelines to control private and industrial synthetic-biology experiments. It opted instead for free, unfettered science, except for first out-of-laboratory uses of man-made organisms.
In October, Nature published the genetic blueprint of Yersinia pestis, the bacterium that caused Europe’s 14th-century Black Death. This followed a long list of microbial blueprint publications, complete with detailed analysis of what, genetically speaking, makes the bacteria or viruses tick: Here is the virulence sequence that kills human cells, scientists point out; these nucleotides control transmission from one human cell to another; thanks to these genes the microbe can evade the human immune system; and so on. The whole point of the work is to demonstrate how microbes infect and destroy cells, including those of human beings.
Political leaders can no longer relegate questions about bioterrorism, biological accidents, bioweapons, or bio-homicide to scientific review panels or, worse, journal editors. It is time to rethink both the BWC and the various biosecurity codes countries have created, without resorting to doomed calls for censorship.
In a 2007 speech to the NSABB, C. Kameswara Rao of India’s Foundation for Biotechnology Awareness and Education almost pleaded with his American and European counterparts. India’s burgeoning pharmaceutical industry is now taking in $2 billion annually, and enterprises akin to DIY biology have sprouted up from Bangalore to Mumbai. What might have once been considered unthinkable to Indians became ugly reality with the 2008 terrorist attacks in Mumbai, in which more than 300 people were injured and 172 died — the worst terrorist attack in modern Indian history. Developing countries like India, Rao argued, are in desperate need of international guidelines, global governance of dual-use research, and basic know-how. Wealthy countries must, he stated, "share and provide state-of-the-art technical know-how" on biosecurity and surveillance of violations. There is a desperate need to globally "coordinate and monitor diagnostic, preventive, and remedial action." And an international funding agency must be created to support such preventive action in developing countries in order, Rao concluded, "to prevent human tragedy for want of technical know-how and financial resources." Rao was calling for nothing less than a massive global effort to train government institutions in poor and middle-income countries in the legal, biological, and public health tools necessary to control and respond to release of dangerous man-made contagions, whether deliberate or accidental.
Yanzhong Huang, senior fellow at the Council on Foreign Relations, finds a similar state incapacity to limit and surveil biothreats in China. In his estimation, Beijing has no capacity to prevent "biocrimes" or limit the synthetic-biology activities of its mushrooming biotech industry and academic science. Between them, India and China account for much of the world’s population and economic growth, have the lion’s share of the new biotechnology and drug industry that could potentially execute dual-use research, and lack the regulatory capacity to monitor such developments. That ought to worry all of us, whether we are in Beijing or Bangalore, Boston or Bangkok.
Developing countries’ concerns put the World Health Organization (WHO) in a particularly difficult position on the H5N1 experiments and larger biosecurity issues. As H5N1 spread throughout Asia from 2003 to 2009, Indonesia experienced the majority of human cases and deaths, and virologists were eager to obtain samples of the flu viruses circulating around that vast island nation in order to comprehend why. The government declined to share viral samples, citing several concerns chiefly related to vaccine development, patents, and profits. After years of difficult negotiations, Indonesia and the World Health Assembly, the WHO’s governing body, last year agreed to guidelines permitting sharing of both viruses and the profits derived from them. The resulting Pandemic Influenza Preparedness Framework is a fragile agreement governing the WHO’s emerging-diseases and flu activities.
In an unusually harsh statement on Dec. 30, the WHO condemned the H5N1 experiments and demanded that the methods used to obtain superbugs remain secret, but also cited concern that any further restrictions on the flow of scientific information might undermine the fragile flu framework. Noting the extreme dangers posed by H5N1, which since 1997 has killed 60 percent of infected human beings, the WHO said, "Research which can improve the understanding of these viruses and can reduce the public health risk is a scientific and public health imperative" that requires open sharing of all viruses and information.
Meanwhile, bird flu is back, causing human and bird infections and deaths in Hong Kong, mainland China, India, Bangladesh, and Egypt. A Shenzhen bus driver died of H5N1 on Dec. 31; the source of his infection has not been determined. Nature carries out its own mutations. Indeed, all five of the mutations that proved key in Fouchier’s experiments to transforming garden-variety bird flu into a supercontagious mammalian killer have already occurred separately in nature. Yes, the birds and viruses have already done it — but never with all five mutations in a single viral strain. The biological clock is ticking. In late December, the U.S. CDC issued a warning that yet another flu threat looms, combining the 2009 H1N1 "swine flu" with an H3N2 influenza now circulating on American commercial pig farms. The naturally occurring recombinant flu had infected a dozen Americans by Christmas.