By Georgia Ray
This was a panel discussion with four speakers on biotechnologies, their potential for misuse, and the challenges of regulatory oversight.
Jesse Kirkpatrick – assistant director of GMU’s Institute for Philosophy and Public Policy.
Chris Oehmen – PNNL cybersecurity expert.
Gregory Koblentz – the GMU Biodefense program’s very own director.
Megan Palmer – a senior research scholar at Stanford’s Center for International Security and Cooperation.
Oehmen kicked off the panel by drawing parallels between the fields of cybersecurity and biosecurity. First he noted how we tend to misinterpret both: we draw on classic metaphors for defense and security, and imagine building a castle or fortress with physical walls to guard what we want to keep safe. But information is not a physical material. Its transfer is not cleanly constrained by energy, time, or physical space the way matter is, and neither are the defenses we build. Oehmen suggested replacing this mental model with a resilience-based one, which looks for other ways to think about security in inherently flawed systems. What assumptions are we relying on when we think about biosecurity? Are they true?
Koblentz discussed his work on Editing Biosecurity, a multidisciplinary study of gene editing technology and biosecurity issues. He criticized reliance on “agent-specific” models – security measures that ban specific agents, like smallpox or Clostridium botulinum. In the age of genetic engineering, those boundaries fall apart: what does such a system do with a disease that is a genetic combination of smallpox and an unlisted agent? Or with a normally harmless E. coli carrying an inserted plasmid that codes for botulinum toxin?
The problems within biodefense are growing, because advances in biology broaden the “attack surface” – the number of possible avenues through which an attack might come, each of them requiring its own defense. Biodefense isn’t simple anymore.
What do we do about this? Koblentz suggests we:
- Talk and work with labs to develop safety procedures
- Measure tacit knowledge and other factors, to understand how easy technologies are to use
- Engage stakeholders
- Improve guidelines to account for changing uses of technology
- Prepare for flexible, decentralized platform technologies and the risks they carry
Megan Palmer talked about using the International Genetically Engineered Machine (iGEM) competition as a workshop for biodefense policy. iGEM is a competition in which students (from high school through graduate school) genetically engineer organisms with specific properties for prizes.
Her studies include biosecurity interventions, since iGEM is wildly popular and today’s iGEM participants will become tomorrow’s lab scientists. But iGEM teams are also sample groups for testing policies: each team is a research group in microcosm. And iGEM is low-stakes compared to professional groups, which have more invested in their work and handle more dangerous organisms. Her team can also easily create incentives within iGEM, such as special prizes for teams that innovate in biosafety. Anecdotal evidence suggests that students in the program are meaningfully engaging with biosecurity, and Palmer and her team are currently developing metrics to measure that engagement.
Lastly, Koblentz pointed out the absence of any permanent institution examining the risks and promise of new technologies, suggesting that such a body would be a useful tool for keeping biosecurity up to speed with biotechnology developments.