JOHN DANKOSKY, HOST:
This is SCIENCE FRIDAY. I'm John Dankosky. Ira Flatow is away. Our gadgets are getting pretty fancy these days: the smartphones, the smartwatches, the tablets and phablets. You might say we've reached the pinnacle of the computing revolution, now that we all have computers in our pockets. But the next big thing, the next revolution, is happening in biology.
Synthetic biologists like Craig Venter can already build life from scratch, using a few bottles of chemicals to synthesize an organism's DNA. But it's not just big shots like Craig Venter. High school and college students are already learning these tricks, too - how to tinker with life, how to redesign bacteria to do new things. They debuted some of their creations last weekend at the International Genetically Engineered Machine competition in Cambridge, Massachusetts.
Of course, not everyone is entirely comfortable with this idea. Should just anyone be able to reinvent life in his basement? Is there any way to monitor these experiments to make sure they're done responsibly? That's what we're going to talk about this hour, biosecurity in the age of synthetic biology. So call us. Our number is 1-800-989-8255. That's 1-800-989-TALK.
If you're on Twitter, tweet us your questions by writing the @ sign followed by SciFri. If you want more information about what we're going to be talking about, go to ScienceFriday.com. You'll find some links to our topic. Let me introduce our guests. Peter Carr is senior staff and synthetic biology lead at the MIT Lincoln Lab at the Massachusetts Institute of Technology. Welcome back to SCIENCE FRIDAY, Dr. Carr.
PETER CARR: Thank you, John. Thanks for having me back on the show.
DANKOSKY: Kenneth Oye is with us, director of MIT's program on emerging technologies and an associate professor of political science and engineering systems there. Welcome to SCIENCE FRIDAY, Dr. Oye.
KENNETH OYE: Thank you very much, John.
DANKOSKY: And Laurie Garrett is a senior fellow on global health at the Council on Foreign Relations here in New York. She's also a winner of the Pulitzer Prize for her coverage of the Ebola epidemic in Zaire. Welcome to SCIENCE FRIDAY, Ms. Garrett.
LAURIE GARRETT: Great to be back.
DANKOSKY: Well, let's start with you, Peter Carr. You're the director of judging at the iGEM competition that we were mentioning in the opening here. So give us a taste of what sorts of projects the students are presenting.
CARR: Oh, thanks. Appreciate the chance to talk about that. Last weekend, we had the grand final competition for the collegiate level division of iGEM, which was the first and historically biggest division within iGEM. There is a separate younger, high school division that had its championship earlier in the year. We had so many great projects at iGEM this year.
So iGEM students look at how they can use synthetic biology to make a difference with real problems the world is facing. And a crucial part of iGEM and the process the students go through is the training of the next generation of responsible synthetic biologists who think about how the world affects their work and about how their work affects the world.
So for just a few quick examples: the team from Imperial College London was programming bacteria to use trash as a food source, turning it into useful plastics - plastics which can then themselves later be broken down again and recycled through the same kind of process. Students at UC Berkeley looked at all the chemicals and petroleum products that are used to make our blue jeans blue.
They came up with a way to get cells to make those same dyes instead, trying to replace harsh chemicals and nonrenewable resources. And they did that with the hopes of having a more eco-friendly, sustainable process. The team from Paris Bettencourt explored multiple ways to combat drug-resistant tuberculosis, and the MIT team engineered new ways for cells to communicate between themselves, passing communication packets, if you will, back and forth.
And one application for that kind of work is to produce useful mini tissues for research, such as liver or pancreas tissue. Just a few more: the team from the University of Illinois, Urbana-Champaign was working on engineered probiotics that would work in the gut, consuming a compound that comes from red meat and increases the risk of heart disease.
DANKOSKY: Well, and these are really interesting and so I just want to turn to Ken Oye quickly and ask him. Now you vet the students' projects at iGEM to make sure that there are no safety issues, because they're dealing with some pretty complex stuff here.
OYE: There are safety issues and what we're trying to do here, if we step back just a little bit, iGEM is a great mechanism for diffusing technology, for educating kids all over the world on how to do biological engineering. At the same time, of course, as capabilities diffuse, you want to be able to package with those capabilities, responsibility, attention to safety concerns, attention to security concerns.
So what we've been doing for the last several years is looking at every project, every project of all of these teams scattered all over the world. We've been looking at their description of what they're doing on little Wikis. We've been looking at their declarations and we've been trying to first push them to be in contact with local biosafety officers at their universities for review.
Many folks don't even know what their duties or responsibilities are, and local authority comes first. But we're also trying to make our own judgments about what to worry about, and what we've done is set up a process where graduate students and more senior authorities look at each of the projects. If there's any question, they pass the issues on to a higher-level set of folks who are actually part of the formal professional vetting processes - people at the UN, or people on governmental regulatory bodies in Canada and the U.S.
And so, again, part of this is trying to build a culture of safety while screening. First concern is always safety and you want to make sure that the kids are safe. The second concern, though, is building off of what we find as we're talking to the students and looking at what they're doing and trying to then figure out what other actions should be taken.
So, for example, we had a very clear sense that a number of the students doing advanced biological engineering didn't really understand what it meant to be doing safe or unsafe work. It wasn't a question of maliciousness but knowledge. So public regulatory authorities, people in Canada and the U.S. and members of our little group, worked together to develop training materials to help them better understand what they should be worried about and what maybe is safe.
Then Ed You from the FBI, Jessica Tucker from HHS and Piers Millett from the UN have also been coming in for the past couple of years, educating on biosecurity. Now, John, you've never seen anything weirder than a group of 200 to 300 nerds - and that's a fair way to characterize these guys - sitting around in an auditorium on a Saturday night - in fact, one of those Saturday nights was Halloween - with nothing better to do than listen to the FBI, HHS and the UN give briefings on biosecurity, and sitting there enthralled for two to three hours.
Another part of what we've been trying to do is much more boring. iGEM and synthetic biology have, at their core, the idea of creating registries of parts, fragments of DNA that could be repurposed and reused. So if you have 20,000 biological parts created and contributed by the teams, how do you know what's in there? And so for a couple years, we really wanted to know a little bit more about what those sequences were.
And, again, public-private partnerships: Craig Venter's company, SGI DNA, donated services to screen every one of those 20,000 parts - to figure out what organisms they came from, confirm what we'd been told and also check on what the functions of the little bits and pieces of DNA were. This was actually better than the industry standard, and to have a company donating those services in the context of a student competition was really quite exceptional.
DANKOSKY: And we may be talking a bit more about that registry of parts coming up a little later on. I want to bring in Laurie Garrett, though. You wrote the cover story for this month's issue of Foreign Affairs dealing with this very idea, the promise and some of the perils of the synthetic biology revolution. This whole idea, it excites people, but it also makes people pretty uncomfortable, Laurie.
GARRETT: Well, first of all, I want to say: I wish, Peter, that I was young enough to be a competitor in iGEM. It's very, very exciting. And, Ken, I know your work very well. And I wish that everybody involved in the synthetic biology world, in the gain-of-function science world and in the metagenomics world was following as arduous and specific a set of security and safety standards, and thinking about the ethics as carefully, as what you're trying to do with the iGEM kids.
Unfortunately, that's not the case, and it's certainly not the case overseas. And we put a lot of time and effort into trying to understand how the rest of the world views this whole effort to do what many people call directed evolution. We're really actually pushing the boundaries of natural selection, making it not natural selection but human selection of nature or manipulation of nature.
And though there's been conversation about genetic engineering since the mid-1970s, the revolution really started about five or six years ago: the moment that Craig Venter made the phi X 174 phage and then, in 2010, made the first actually completely synthesized life form and did what I call 4D printing - meaning he got it to actually self-replicate. And now we see the marriage of 3D printing and genomic sequencing emerging.
And the idea that someone can create a sequence for a pathogen in Los Angeles and send it via email to a 3D printer in Islamabad, where someone may print it out and turn it into something quite dangerous - well, this changes the whole picture. We have a giant construct around safety: special pathogens, the monitoring for bioterrorism, and the sale and distribution of the sort of precursor compounds that people would use in the lab to make beneficial or otherwise harmful compounds. But it all kind of assumes, or is built on the notion, that we have known items to monitor.
But now, what we're really talking about is that biology equals information. And once biology equals information, you're into all the problems we have trying to regulate anything in cyberspace, which is essentially close to impossible. It reminds me of the debate we had a year and a half ago regarding the human-manipulated H5N1 bird flu viruses. Three different laboratories - one in Wisconsin, one in Rotterdam and, some months later, one in Harbin, China - deliberately altered H5N1 bird flu to give it the capacity to be airborne transmissible between mammals.
That would mean that what was once just a bird flu virus could be very, very dangerous to you and me, at least in theory. And when all of this came to light, it suddenly became clear the problem was the information. How did you do it, and what were the sequences?
DANKOSKY: Well, let's talk more about that information when we come back from our break. We'll also meet an FBI special agent who's paid his dues in science labs, too. Stay with us.
(SOUNDBITE OF MUSIC)
DANKOSKY: This is SCIENCE FRIDAY. I'm John Dankosky, and we're talking this hour about biosecurity with my guests Peter Carr, senior staff and synthetic biology lead at the MIT Lincoln Lab; Kenneth Oye, director of MIT's program on emerging technologies; and Laurie Garrett, senior fellow on global health at the Council on Foreign Relations here in New York.
I want to bring in another guest. He's trained in biochemistry and molecular biology, but he's also an FBI special agent. It's not your typical resume for an FBI agent. Ed You is a supervisory special agent at FBI's Weapons of Mass Destruction Directorate, Biological Countermeasures Unit in Washington, D.C. And he joins us from NPR in Washington. Welcome to SCIENCE FRIDAY.
ED YOU: Thank you. It's a privilege to be here.
DANKOSKY: You have a background in biochemistry, molecular biology. How did you end up at the FBI doing this work, first?
YOU: Well, I can honestly say that I probably speak for a lot of agents who joined the FBI: it was a post-9/11 effect. It was looking not only at contributing to society from the science point of view, as I did when I first entered the lab, but at what else I could do. And a few years after I completed my work and spent some time in the biotechnology sector, I decided to hang up the lab coat and put on a badge.
DANKOSKY: You heard Laurie Garrett, just before our break, talking about some of her concerns about this new world of synthetic biology. Give us a sense of how you look at this from the standpoint of the FBI. What is it you're looking for? What are some red flags that might be raised?
YOU: So, the point was already made that we're dealing with a field that is moving incredibly quickly and evolving. One of the things we wanted to keep in mind is that the mission of the FBI, in particular the WMD Directorate, is one of proactive measures. So how do we prevent something from being misused or exploited, especially in the realm of synthetic biology?
So rather than trying to derive a finite list of things to be on the lookout for, we decided to go out and engage the scientific community, because we realized: who better to be in a position to potentially identify something that might be of concern than the experts themselves? So one of the activities that we're really proud of is becoming an actual sponsor of iGEM. And the FBI has been a sponsor since 2009.
And as a result, we have been able to be there at the world championship and provide the biosecurity workshop - which Ken Oye graciously mentioned - and really directly engage the students and charge them not only to have fun, but also to be guardians of science: make sure that they continue to do the great work that they're involved in, maintain the integrity of science, but also expand their role and responsibility to make sure that science is never exploited, abused or misused.
And so we know we're dealing with a generation - we're getting them now - of future scientists, future CEOs of companies, perhaps future policymakers. And so we're instilling in them not only a sense of greater responsibility but a better understanding of security, so that, as they evolve and grow in their careers, they'll have this base understanding to serve them well.
DANKOSKY: And Peter Carr, do you think that this is the idea - to get these students to be responsible practitioners and essentially police themselves?
CARR: I think that's a major component of it. I don't think any of us would say that it ends there but, like Ed was saying, it's a huge way of reaching much further than you could with, say, strictly technological monitoring measures, or anything else that you might think of conventionally.
DANKOSKY: Now, Ken, you were mentioning before these teams coming from around the world, and Laurie mentioned that, you know, there are different safety and security standards in different parts of the world. Do you notice that - different cultures of safety surrounding this?
OYE: There is variation. Some areas are more attentive than others. One of the things that we've discovered, however, is that when we find variation in behavior and performance and adherence, pushing and poking the teams in areas that may not be known for their attention to biosafety and biosecurity can have interesting side effects.
So, to give one little example, there was an iGEM team a couple of years ago in Beijing - I'll just say it. They had their technical work, but they also did a little project on the side to check whether local regulations and local standards for monitoring the ordering of potentially sensitive materials were or were not adequate.
So these undergraduates set about ordering sensitive materials for delivery to their home addresses with personal credit cards. And they discovered that too many of the orders would have been filled. Now, OK, note caveat: They did not take delivery. They canceled the orders, but the information that they gleaned through this experiment, they fed back in.
And we don't have a clear indication of whether local authorities responded or not. It seems likely that, frankly, they did. But it's an example here of how raising consciousness, even among undergraduates, may have trickle-down effects. And I hate that expression in economic contexts, but here it makes some sense, trickle-down effects that can potentially have some effect on policy. It's building a community worldwide that can have at least some potential for improving practices.
DANKOSKY: Of course, there's another way to read that experiment, and a similar experiment that the Guardian newspaper did about eight years ago, when they tried ordering smallpox DNA sequences. It actually got the attention of a lot of people. Laurie Garrett, that's the other side of this. Yes, indeed, these may be examples of students learning responsible practices. It may also, again, send up some red flags for you and others.
GARRETT: Well, I think we need to step away from the iGEM context on this, because in many ways it's the best-case scenario. If you step away from it and really look at what's going on in the world, you find out that, first of all, there is no internationally agreed-upon set of definitions for any of the key things we're talking about. What's a high-security biosafety level four laboratory? What's a biosafety level three laboratory? What sorts of things are allowable, or make sense to do, in one level of lab security versus another?
What sorts of compounds should or should not be available on the Internet, or should or should not be shared between laboratories without the knowledge of local law enforcement or safety officers? There are no standards from country to country that are the same in the world today. And, in fact, we figured out that even inside the European Union, there are no shared definitions for standards from one European country to another.
Moreover, again, as we've emphasized over and over again, most of the regulatory apparatuses that do exist focus on known organisms and pathogens. In other words, you're looking for the nasty microbe, as opposed to the sharing of information that is simply DNA codes swapped from one place to another, and can be instantly turned into a potentially dangerous microbe.
And I think the standards of safety - I mean, China's been brought up. There's a classic example. We had the first SARS epidemic in 2003 that emerged, essentially, from bats through civets to humans. We had a second SARS outbreak in Beijing in 2004 that came right out of sloppiness in a local laboratory. And we have seen that samples of the bird flu viruses disappeared from Cairo during the Arab Spring, when looters raided that laboratory, probably to steal the centrifuges. But there's no accountability for what happened to the viruses that were in that lab.
So we have to be really careful not to assume that the extraordinary and wonderful standards and genius and creativity that is iGEM can be translated to the realities of what's going on in adult-run laboratories around the world right now.
DANKOSKY: Well, Ed You, maybe you can pick up on this and talk about why these security lapses happen, and maybe flesh out a bit more how different countries deal with these issues - what you know about it, and how we can, you know, perhaps secure things a little better.
YOU: Right. So, as law enforcement, we actually have been engaging our international partners and also some international companies that deal with synthetic biology. One of the things the FBI has been able to do is share internationally, as best practices, some of the things that we've implemented here in the U.S. It may not fall in the realm of standards or regulations - that's not within our purview - but in the interest of safety and security, we are able to share some of our experiences and some of the things that we've been able to put into place here in the U.S.
And just to kind of highlight a couple of examples: you know, it was mentioned about passing DNA sequence information to companies. Since the reporter's stunt a few years ago, where he ordered a short sequence of smallpox through the mail, companies have been screening incoming orders and the customers placing them, to determine whether or not they're legitimate orders. And here in the U.S., the FBI has actually established liaisons with synthetic DNA companies.
And so now, if they do encounter something suspect, they can contact their local FBI WMD coordinator - who is a local expert - so we can help assess and vet those suspect orders. And we've now been engaging other international law enforcement agencies, again, to share this type of best practice. But what we've actually found - and this is where there's a crossover from the real world to iGEM - is that not only are we pushing out this message of security and responsibility, but the really surprising thing - and, honestly, now that I think about it, it shouldn't be surprising - is that we're now seeing teams actually addressing and coming up with security solutions.
So, for example, a team from Virginia Tech and Ensimag in France jointly developed a software tool to help companies do a better job of screening incoming DNA orders. It was actually quite effective, so much so that we had the team come down and give their presentation at FBI headquarters. Here was a group of teenagers showing the U.S. government: you have this policy in place for screening guidelines; we've developed a tool to show that we can address that; here are the gaps in the policy; and, oh, by the way, here's how you can improve it. So it was a remarkable event, and we're starting to see that happen more and more. As they get this understanding and ownership of the technologies, and the ability to assess what some of the concerns might be, they're now also in a position to come up with some very effective solutions that might be translated.
DANKOSKY: All that said, I mean I think a lot of people might suggest that there should be more Ed You's out there, that the FBI might need more people like you that are trained in the field policing this world.
YOU: Right, so again, that's a really good question. We've also been engaging, as I mentioned, our partners internationally. The FBI WMD Directorate has personnel seconded to Interpol, so that also helps with our international outreach. And those WMD coordinators that I mentioned - here in the U.S., we have one of those local experts in each of our 56 field offices, and we also have one stationed in Singapore and one in Tbilisi, Georgia.
So we provide international assistance.
DANKOSKY: Let's go to the phones quick. Let's go to Kevin, who's calling from Belmont, Massachusetts. Hi, Kevin, go ahead. You're on SCIENCE FRIDAY.
KEVIN: Hi, thank you.
DANKOSKY: Go ahead, you're on the air.
KEVIN: I'm calling kind of a little bit off the subject but I've had several discussions about biological engineering and things of the same nature you're talking about, and something that always comes up regardless of the person's religious nature is that this kind of science is playing God. And I was curious as to, like, everybody's view on what that means in their field and does that actually play a part in their research or their decision making, because I personally believe that religion is a separate (unintelligible) should not be involved in religion. And maybe considered but not change decisions, so some people believe that we shouldn't be doing this at all because playing God is something we shouldn't do.
DANKOSKY: Well, let's put this playing God question - thank you very much for the question - to Peter Carr, because I'm sure this is something you get all the time. This is less about the security and more about the morality, but it's a big question in your field.
CARR: Oh, it certainly is. You know, I'm involved in a collaboration that has been working to reprogram the genetic code of a simple organism, something that arguably falls into that - in fact, there was a great Jimmy Kimmel spoof on our research that came out a couple weeks ago. I say great; not everybody loved it, you might imagine.
But it was very tongue-in-cheek, and the theme was - well, the scientist playing my good friend and collaborator said we have definitely not sinned against God here. This is a great time, by the way, for me to point out that my opinions are my own and not those of my employer. In particular, whether you're making the playing-God argument or not, regardless of your religious background, I believe it is absolutely the case that informed ethics have to play a part in the research decisions that we make.
If you look back over the history of human beings tinkering with living creatures, we go way back throughout agriculture to selective breeding that has had a dramatic effect on the resulting organisms. And I believe that traditional genetic engineering, if you will, starting in the '70s, and the new increased acceleration of that, which is what I would call synthetic biology, are all building on that framework, that it has, I think, less to do with moving a particular function around from one organism to another than it does with asking what's the consequence of that.
DANKOSKY: And hold on for one second. I'm John Dankosky, and this is SCIENCE FRIDAY from NPR. We're talking with Peter Carr and Kenneth Oye from MIT; Laurie Garrett, senior fellow on global health at the Council on Foreign Relations here in New York; and Ed You, a special agent with the FBI. And we're talking about biosecurity. Peter Carr, I wanted to put this to you. Let's say we destroyed the last known samples of the smallpox virus, but the genetic sequence is still out there.
Is anything ever truly dead or extinct if you have the DNA sequence? Can you make it again?
CARR: Based on first principles, you can make a good argument that if we have an accurate DNA sequence, along with some additional information, that nothing simple will ever be truly dead. So even before the Venter synthesis that Laurie was mentioning, there was the polio synthesis and before that an HCV synthesis, I believe; and part of the point of the polio experiment was to demonstrate that even if polio is eradicated worldwide, that it's still possible to resurrect some of those things.
It gets much harder when you go to complicated organisms, though.
DANKOSKY: And Laurie, and that's it. It's an information problem is one of the things you write about in your article. If the information is out there to truly recreate these things, it's never gone; it would be very hard to stop.
GARRETT: Well, especially since lots of people are now talking about trying to dig up DNA information from extinct species and resurrect those species. And with climate change, we're seeing the revealing of long-frozen species - essentially extinct, from our point of view - in the permafrost, under the glacial fronts and so on. We don't really know what we're doing with all those.
I would like to quickly say something about the playing God question from a foreign policy point of view, if you wouldn't mind, and that is that we need to always keep in mind in this country that anything to do with religion and science has ramifications overseas that are not intuitively obvious to Americans because we don't necessarily think about the problems the same way.
All you have to do is consider that 33 people have been shot or targeted for assassination in Pakistan alone because they were trying to do the terrible task of vaccinating children for polio. This is a large distortion of a set of realities and principles, making these children and the polio vaccinators tools in an ongoing dispute between extreme Islamists and those who would try to do public health and reasonable good.
So when you start doing anything that smacks of directing evolution, altering the course of nature, you will step into religious conflict. You've got to be aware of that.
DANKOSKY: And Laurie Garrett, we have to leave it there and I know you have to leave us. Laurie Garrett, senior fellow on global health at the Council on Foreign Relations here in New York. Thanks so much for joining us.
GARRETT: Thank you.
DANKOSKY: When we come back, lots more on synthetic biology and biosecurity. Stay with us.
(SOUNDBITE OF MUSIC)
DANKOSKY: This is SCIENCE FRIDAY. I'm John Dankosky, and we're talking about building life from scratch and what that might mean for biosecurity, with my guests Peter Carr, senior staff and synthetic biology lead at the MIT Lincoln Lab; Kenneth Oye, director of MIT's program on emerging technologies; and Ed You, supervisory special agent in the FBI's Weapons of Mass Destruction Directorate, Biological Countermeasures Unit in Washington, D.C.
And we're taking your calls at 1-800-989-8255. Gabrielle is in Cleveland. Hi, Gabrielle. Go ahead. You're on SCIENCE FRIDAY.
GABRIELLE: Thank you. I am thinking - I am a medical doctor and in order for you to practice, you have to go through very stringent standards, and I think that any lab anywhere in the world should be in international treaties in which they - because this is extremely dangerous, as dangerous as dynamite that we have right now. This is much more dangerous than nuclear missiles because in order to make nuclear missiles you have to have a huge amount of money and you have to be able to launch them and things like that.
But this anybody can do in their little lab. So the labs have to be, the ones who are in business, they should be licensed and the standards should be international because this is a very deadly threat to the world. And they should pass exams and they should be checked periodically and they ought to have bookkeeping in which they tell exactly who they sent the material to and this person who signs up to buy the material has to show kind of a valid identification.
That way you can track them as they track the money. If there was an outbreak - something in the neighborhood, in the country, they could figure out this is something that came from that lab, they can track it to them.
DANKOSKY: And Ken Oye, what do you think about that? Is Gabrielle on the right path here?
OYE: So Gabrielle, and Laurie earlier, were drawing attention to great variation in national standards, practices and customs. And I agree with both the caller and with Laurie that there is a need for harmonization of standards, a need for working towards maybe what you could call a conception of shared security. You can't have biosecurity without shared security.
The question is how do you go about doing it. So if you look right now, talking first maybe about the shipment of parts, you have something called the Australia Group guidelines and convention that some, but not all, countries subscribe to. And you want to have broader coverage. Sure would be nice if countries that are becoming more important in this area, like China, were parties to the Australia Group.
There is also ambiguity in what people mean by key things. What is a part associated with infectivity or pathogenicity? There is some ambiguity on that. And there's a real need to bring people together - and this is both politics and technology and science; they all come together. I would argue, with Laurie, that there is a great need for countries to come together, probably under UN auspices, to investigate, analyze and address these issues.
As to the licensing of individual practitioners, I'm going to go one step further and one step back, okay? So we were talking about how new printer technologies could result in the translation of information into stuff. And if you look at those printers that may be synthesizing or creating DNA outside the consortia that Peter spoke about, maybe it would be a good idea if those printers were licensed, as we license automobiles, or aircraft, or guns in this country.
You turn to individuals, and as skills diffuse, it's pretty hard to imagine taking the certification - the board certification standards of the AMA - and really having them work even within this country, let alone worldwide. But even working towards, let's call it, common education or engagement, I think would be helpful. Now, this may sound crazy, so I want to apologize in advance, and I'm going to put Peter and Ed and others on the spot: but look at a world where you have the technology diffusing.
Do you want to have those countries and the students and the firms in those countries part of an international community doing this work when they have the technology anyway? So for example, if there were, hypothetically speaking, an iGEM team in Iran and it wanted to participate in the international competition, to be bundled in with others, to be mixing it up with others in good ways, to be sitting there with Ed on Halloween evening, is that good or bad?
Because right now, the way our laws operate, they could not do so in any practical way without running afoul of export control regulations.
DANKOSKY: Well, and to be...
CARR: So Gabrielle's point is a good one.
DANKOSKY: But to pick up from Laurie, who had to leave us but might very well ask Ed this same question: Ed, is that enough? If we had an iGEM team in Iran, it may be self-policing, it may change things in some ways, but is that truly enough to deal with all the things our caller Gabrielle was asking about - the licensing, the stricter standards?
YOU: Right. So, again, coming at it from a law enforcement and proactive aspect - diffusion is the key term here. Things are moving so rapidly, and spreading so quickly, that the genie is already sort of out of the proverbial bottle. So that's one of the reasons why we go out and conduct engagement, not only within the U.S. but internationally now. And the cyber-world, and security in that world, was mentioned earlier.
So to build upon that analogy, our approach is: how do we go about building a world that is filled predominantly with the white hats - the ones who would use and leverage these technologies and capabilities for beneficial purposes - so that that same community overwhelmingly outnumbers the black hats, those who would hack or abuse or misuse those very same technologies?
But to go further than that, it almost requires a paradigm shift in how security treats information - going beyond just containment and laboratory space. Maybe the next generation of synthetic biologists will have to work out what an anti-virus program looks like in the future, something that could detect such nefarious activity. We're going to be depending on this next generation to come up with some of those ideas.
DANKOSKY: Ed You is a supervisory special agent in the FBI's Weapons of Mass Destruction Directorate and Biological Countermeasures Unit in Washington, D.C. He joined us from D.C. today. Ed, thank you so much.
YOU: Thank you.
DANKOSKY: Thanks also to Kenneth Oye, director of MIT's program on emerging technologies. Thank you, Kenneth.
OYE: Oh, thank you, John. Great pleasure.
DANKOSKY: And thanks to Peter Carr, senior staff and synthetic biology lead at the MIT Lincoln Lab at the Massachusetts Institute of Technology. Thank you, Peter.
CARR: Thank you, too.
(SOUNDBITE OF MUSIC) Transcript provided by NPR, Copyright NPR.