Governing Biotechnology: From Avian Flu to Genetically-Modified Babies With Catherine Rhodes
Future of Life Institute Podcast - A podcast by Future of Life Institute
A Chinese researcher recently made international news with claims that he had created the first gene-edited human babies using CRISPR. In doing so, he violated international ethics standards, and he appears to have acted without the knowledge of his funders or his university. But this is only the latest example of biological research triggering ethical concerns. Gain-of-function research a few years ago, which made the H5N1 avian flu virus more transmissible, also sparked controversy when scientists tried to publish their work. And there has been extensive debate globally about the ethics of human cloning.

As biotechnology and other emerging technologies become more powerful, the dual-use nature of research (that is, research that can have both beneficial and risky outcomes) is increasingly important to address. How can scientists and policymakers work together to ensure that regulation and governance of technological development enable researchers to do good with their work while decreasing the threats?

On this month's podcast, Ariel spoke with Catherine Rhodes about these issues and more. Catherine is a senior research associate and deputy director of the Centre for the Study of Existential Risk. Her work has broadly focused on understanding the intersection and combination of risks stemming from technologies and risks stemming from governance. She has particular expertise in the international governance of biotechnology, including biosecurity and broader risk management issues.

Topics discussed in this episode include:
~ Gain-of-function research, the H5N1 virus (avian flu), and the risks of publishing dangerous information
~ The roles of scientists, policymakers, and the public in ensuring that technology is developed safely and ethically
~ The controversial Chinese researcher who claims to have used CRISPR to edit the genomes of twins
~ How scientists can anticipate whether the results of their research could be misused by someone else
~ To what extent does risk stem from technology, and to what extent does it stem from how we govern it?