Thirty years ago, Nancy Connell was working to get thousands of scientists, including Nobel laureates, to sign a pledge, presented at the United Nations, committing them not to knowingly engage in research that would lead to the development of biological weapons.
Connell was a graduate student at Harvard University at the time, and the campaign marked the start of a long career advocating for greater scientific responsibility in research on Global Catastrophic Biological Risks.
In August, Connell was appointed to the National Institutes of Health’s National Science Advisory Board for Biosecurity (NSABB), where she hopes to reframe the debate surrounding issues of dual use in research and the role of information hazards.
Connell also serves as a senior scholar at the Johns Hopkins Center for Health Security and a visiting professor in the Department of Environmental Health and Engineering at the Johns Hopkins Bloomberg School of Public Health.
Connell is a member of the Centers for Disease Control and Prevention’s Biological Agent Containment Working Group in the Office of Public Health Preparedness and Response. She has also served on more than 15 committees at the U.S. National Academies of Sciences, Engineering, and Medicine, including Trends in Science and Technology Relevant to the Biological and Toxin Weapons Convention (2010) and Review of the Scientific Approaches Used During the FBI’s Investigation of the 2001 Bacillus anthracis Mailings (2011). She currently chairs the National Academies’ component of a series of international science and technology workshops, supported by the EU and the UN, designed to explore regional advances and activities related to implementation of the Biological Weapons Convention (BWC). Many of these regional activities focus on young scientists’ roles in fostering biosecurity in their laboratories.
Homeland Preparedness News recently interviewed Connell about her goals for the NSABB and her current research into Global Catastrophic Biological Risks (GCBRs).
Your research projects analyze novel biotechnologies related to GCBRs in ecosystems. Could you explain some of the current research projects you’re working on?
So, my first big project at the Center is thinking about manufacturing capability: what kinds of capacity we have to respond to a truly catastrophic pandemic or other biological event. We know a fair amount about seasonal flu, and every year manufacturers turn out a vaccine against that year’s combination of strains. But what if one of those strains were to actually become pandemic and spread around the world? The last time that happened was in 2009, and what we’re trying to do is evaluate current capacity at a global level and figure out how to increase manufacturing capacity to develop new vaccines against emerging pathogens.
So, we’re thinking about flu as a model, but we’re also thinking about what the world would do with an unknown virus or some other kind of infection that suddenly swept the globe.
Are you looking at simulation models? How do you do this kind of research?
For this project, we really want to think about manufacturing at the level of bioreactors. There are so many steps along the way, and many of them create serious bottlenecks in the manufacturing process. We spent a long time talking to people in the field of vaccine manufacturing.
There are also the regulatory issues we examine. Of course, the FDA is the world’s gold standard for safety and regulatory review, but the question is how we will deal with the months and months it takes to prove that a vaccine is safe and effective if hundreds of millions of people are dying during a pandemic.
Certainly for highly contagious, pandemic-level viruses, and in some cases bacteria, we have a lot of work to do.
It’s hard to develop a safe and effective vaccine in these situations, and we could probably count on the fingers of two hands the vaccines we have that are truly effective.
There’s a huge push to organize vaccine platforms: systems in which, once you isolated the organism causing an infectious outbreak, you could very quickly figure out what parts of its DNA could be used to create a DNA- or RNA-based vaccine. This could reduce development time by months.
You’ve worked with young scientists globally. Is there a region you’re particularly excited about?
I think young scientists across the world are starting to wake up and think about biosafety and biosecurity early on in their careers. It’s really a universal phenomenon, and it’s so encouraging for someone like me who’s been in the business for a long time. I spent the first 20 years of my basic science career trying to find ways to connect with scientists, to make people aware of the impact of their work and their responsibility as scientists in the area of biological weapons and biosecurity. And because of the outpouring of interest in every region I’ve travelled to, one really has the sense now that biosafety and biosecurity, including dual-use research, have become part of the curriculum around the world.
You have a long-standing interest in the policies regarding biocontainment work. What are some of the primary concerns regarding dual use?
There are millions of examples, and technically I think the term ‘dual use’ is imprecise. Almost anything anyone does in science has the potential to cause harm. So, in my view, it’s an unfortunate term that was applied to a real ethical quandary: if you’re working on a project that has the potential to lead to the development of a biological weapon, should you not do the work? One of the things I’ve been thinking about a lot lately is that term and how it gets in the way of our thinking about these things. It’s binary. Either something is dual use or it’s not, and if that’s how we’re going to think about governance, then it just becomes a massive problem. In my view, the term ‘dual use’ should be replaced by a risk-based analysis, because all the experiments we do carry a certain level of risk.
Could you explain your work on the anthrax attacks of 2001?
One National Academies committee I was on reviewed the FBI’s investigation of the anthrax attacks, which the Bureau closed in 2010. We were analyzing the FBI’s forensic capabilities and the research that went into the case. It was the longest and most expensive investigation in FBI history, and we reviewed more than 6,000 pages of data.
Here was the question: these spores showed up in the mail, so where did they come from?
People had been thinking about microbial forensics, but there hadn’t really been a case yet. In 2001, DNA sequencing was in its infancy. With the advances in technology now, if this investigation happened today it would be an entirely different affair. So, it took them several months to get the sequence of the attack strain and then to go back and figure out what lab it came from. It was fascinating.
The FBI based much of its claim on the DNA sequencing results, but there were other things they tried to look at, too. One, for instance, was carbon dating. They used that to determine that the spores were newly grown, only about two years old at the time of the crime. That tells you something very important: the spores didn’t come from a Soviet stockpile bought on the black market, or something like that. It was a new event. Someone had made this stuff in a lab, dried it, processed it, put it into envelopes, and sent them out.
I have to say that although I’ve been thinking about biological weapons my whole career, I couldn’t believe the anthrax attacks were happening.
What kinds of situations do you hope to examine on the NSABB committee?
The last big issue the NSABB dealt with was a series of experiments supported by the NIH that were highly controversial. A lab in Wisconsin, and one in the Netherlands, transferred bird flu, H5N1, to ferrets, which express the disease much as humans do; they’re the closest thing we have to a good animal model for humans. It was a very important experiment from a public health standpoint.
H5N1 is 60 percent lethal in humans. The other thing they did was, basically, convert it into a transmissible human disease in the lab. This blew up the world of science.
The scientific world is still divided into two camps. One says the experiment shouldn’t have been done. The other says everything has to be done: anything you can think of to help science and help us combat disease should be on the table.
And that’s why I want to be on the NSABB. I want to explore the possibility that there’s an experiment that shouldn’t be done. I’m agnostic here; I have no preconceived ideas. The implication that there are experiments that should be forbidden is enormous. What does that mean for science? Who decides? Is it international? What is the U.S.’s role in thinking about this?
So, this whole international area of governance is new and it’s burgeoning.
If something horrific happens, and I’m not even talking about intentional misuse, but a side effect of some brilliant thing someone is doing in nature, then what will be the response? Will the control of scientific exploration become top-down rather than bottom-up?