Pandora’s Gamble with Alison Young | Crooked Media
June 20, 2023
America Dissected
Pandora’s Gamble with Alison Young

In This Episode

We don’t have definitive proof of how COVID-19 emerged. But several agencies in the US intelligence community have concluded that a leak from the Chinese government-run virology research institute just miles from where the virus was first discovered is the likely source. And lab leaks happen more often than you probably think. That last part is something we don’t talk about enough. Abdul reflects on scientific safety, and the governance and transparency we need to protect it. Then he interviews Alison Young, a veteran journalist and author of the new book “Pandora’s Gamble,” about the history and risks of laboratory accidents.

 

TRANSCRIPT

 

[Sponsor note] [music break]

 

Dr. Abdul El-Sayed, narrating: An appeals court reinstated a provision of the Affordable Care Act that requires health insurers to cover preventive care services. After the departure of the White House COVID-19 coordinator, the administration is struggling to fill a permanent pandemic preparedness role. Over 60 laws targeting LGBTQ+ Americans have passed in statehouses across the country. This is America Dissected. I’m your host, Dr. Abdul El-Sayed. [music break] I want you to think back to your science class in high school. You know, clunky safety glasses, Bunsen burners, frog dissections. Basically our cover art. Chances are your science teacher spent a good bit of time talking about technique. Either that or, well, some kid in your class probably lit their sweater on fire. We like to think of science as being sterile, precise, perfect. But as your own experience in the science lab probably showed you, accidents happen, particularly when you’re dealing with dangerous chemicals, biological agents and fire. Professional science labs are a long way away from your science class in high school. But even there, accidents happen a lot more than we care to admit. But it’s one thing to start a little fire. It’s another to accidentally start a global pandemic.

 

[clips of different news reporters] There is new intelligence that’s likely to rekindle the unsettled debate over the origins of COVID. The US Department of Energy now says COVID 19 most likely came from a laboratory leak in China. New report tonight on the origins of COVID 19. The US Energy Department, now differing from some other government agencies, now saying that COVID may have originated from a Chinese lab leak. 

 

Dr. Abdul El-Sayed, narrating: To be clear, the Department of Energy and other agencies in the intelligence community have concluded that COVID, quote, “most likely emerged from a lab leak.” While we don’t have definitive proof of how in fact it emerged, what we do know is that it happened in Wuhan, a city that houses the Wuhan Institute of Virology, a Chinese government laboratory that was doing research about coronaviruses. And we know that the Chinese government very quickly destroyed the evidence we would have needed to identify where, in fact, it emerged and hasn’t been very forthcoming with its own data at all. We also know that some of the research being performed in the laboratory had the unintended consequence of making the coronavirus strain more transmissible in mice. That report came from the National Institutes of Health, meaning our federal government, which funded an organization called EcoHealth Alliance that was doing research on coronaviruses in Wuhan. The NIH holds that that wasn’t the aim of the research, and that the US-based nonprofit failed in its responsibility to report its findings to the NIH as required by the terms of its funding. But this raises two broad issues. The first is whether or not, in fact, we should be doing research that has the chance of creating viruses that are more transmissible or deadly than what occurs in nature. Proponents of so-called gain of function research hold that this kind of research is critical because it allows us to test prevention or treatment against these kinds of viruses. But the risks should be, well, obvious. In some respects, the issue we’re talking about today parallels the topic we covered last week. If so many AI researchers believe that AI has some non-zero probability of destroying humanity, why in God’s name do they keep doing it? The thirst to know what’s just beyond our grasp, it’s fundamentally human. Couple that with the knowledge that you could be the first to know something. Well, curiosity and hubris can be irresistible. That gets us to the second broad issue and the subject of our show today. Accidents happen even in the best labs with the most highly trained personnel. And gain of function research simply ups the ante on the risks inherent in those accidents. There’s a long history of deadly microbes escaping from labs and making people sick. But you probably haven’t heard about them. In fact, it goes all the way back to World War Two, which we’ll get to in our conversation with our guest today. But more recently, in 2013, a scientist at the University of Wisconsin’s Influenza Research Institute had a fingerstick exposure to a lab-engineered strain of H5N1, quote, “bird flu,” in the course of an experiment. Rather than a secured quarantine facility, though, the scientist was sent home to quarantine in their apartment. What? The incident set off alarm bells in Washington, ultimately leading to a pause in gain of function research. But then, right after the pause was lifted, in 2019 a similar incident occurred in the same exact lab. Rather than learn the lesson and get it right the second time, the scientist who was exposed to this potentially pandemic-causing virus was prematurely released from the required quarantine. All of this raises questions about the lax oversight of seriously risky research. And that goes back to an implicit conflict of interest in the way we fund, govern and enforce scientific safety standards.
See, scientific research is inherently competitive, as research labs, usually housed at universities, race to be the first to discover new information. That competition is usually a good thing because it incentivizes leading-edge research. But it’s not just the prestige that drives this research. It’s the race for funding. Most biomedical research is funded by the NIH, which, beyond the funding to cover the cost of research labs themselves, also funds the departments and schools that house them. Those, quote, “indirects,” as they’re called, are critical to the well-being of major medical centers around the country. And guess who’s in charge of guaranteeing the safety and security of the labs? The medical centers themselves. But like we talked about, they have no incentive to slow the research that literally keeps the roof over their heads. Leaving that kind of oversight to the folks with that big of a conflict of interest should force us all to ask some really big questions. Doesn’t the public have a stake in all this? After all, we’re the folks who pay taxes to fund that research, and we’re the folks who will suffer a pandemic that could arise from faulty safety practices. Those are some of the big questions our guest today is asking in her new book, Pandora’s Gamble. Alison Young is a veteran journalist. She’s been covering lab accidents for more than a decade. In her new book, she explores the ethics, possibilities, and concerns behind gain of function research. She also thinks about the broader questions of who should have oversight responsibility for scientific safety and how that oversight should happen. Here’s my conversation with Alison Young.

 

Dr. Abdul El-Sayed: Can you introduce yourself for the tape? 

 

Alison Young: My name is Alison Young. I am an investigative journalist in Washington, D.C. I am the author of Pandora’s Gamble and I am also a professor at the Missouri School of Journalism.

 

Dr. Abdul El-Sayed: So you wrote a book about lab leaks. It’s, you know, a really harrowing, frustrating, eye opening look at what happens when scientists make mistakes. And it’s not necessarily mistakes in the strategy or or interpretation of their science. It’s mistakes in just the tactics of their science. Um. And I, you know, as a as a as a budding young scientist, we were always taught that lab procedure is critical to keep you safe and to reproduce great science. Um. But people do make mistakes. And I wanted I wanted to ask you to start with a description of Camp Detrick, because I think it is an important moment where the risk of mistakes becomes a lot more dire. What happened at Camp Detrick? What was it? 

 

Alison Young: So Camp Detrick um back in, in the time of World War Two, was where the United States had at that time an offensive biological weapons program. There was essentially a biological arms race going on around the world, and the United States felt that it needed to develop these kinds of weapons as a way of protecting themselves. And so in the 1940s, up until the program was ended by Richard Nixon, there were these labs that were developed at what is now known as Fort Detrick, just north of Washington, D.C. And one of the big challenges was when you’re working with biological weapons, biological agents, you run the risk of the people who are working with them becoming infected. And so Pandora’s Gamble profiles the man who is considered to be the father of modern biosafety, a man by the name of Dr. Arnold Wedum. And his job was to try to figure out a way to keep the people who were working in these biological research labs from becoming infected. 

 

Dr. Abdul El-Sayed: So just step back, because I think that’s going to be astounding to a lot of listeners. So the U.S. basically ran a biological weapons laboratory during World War Two. And so much of what we now understand about basic biological safety comes out of making sure that our bioweapons laboratory was a safe place for people to work? 

 

Alison Young: That’s very true. And and I mean, that is one of the things that’s pretty stunning is all of these decades later, many of the fundamentals and and even the technology, obviously, it has advanced in some ways, but the basic principles were developed during this sort of post-World War two era from the U.S. biological weapons program. 

 

Dr. Abdul El-Sayed: Can you walk us through what some of those look like? I mean, for most folks, they’ve never worked in a lab. What does it mean to keep folks safe? I think people know about white coats, which are actually super dirty, but people know about white coats, they know about gloves and goggles. But what does it mean uh to work in a safe lab? What are the kinds of things that investigators are expected to do to stay lab safe? 

 

Alison Young: Sure. And all of these were new concepts back then and and what and they’re used this way today. So part of it is remembering that you’re working with organisms that can be aerosolized. So many activities that go on in the laboratory create these infectious plumes that you can’t see. That’s everything from maybe pulling um pulling a stopper out of a vial that can perhaps put microorganisms into the air. Surfaces can become contaminated. And so there are a set of um procedures and equipment that are designed to keep people safe. So that’s everything from the idea of wearing dedicated clothing into a lab, showering in and out, not wearing your same footwear when you’re leaving the lab, it’s restricting who’s going into the lab. It’s working with microorganisms in what are called biosafety cabinets. And so these are, you know, essentially like boxes with with directional airflow that that the idea is to have a curtain of air that keeps organisms inside the safety cabinet. But they studied things at at Camp Detrick at that time, which were fascinating, even sort of looking at whether beards put men at risk of transmitting bacteria outside of labs. The importance of hand-washing, the desire to keep people from just delivering messages into labs. Back in the early days, they didn’t have necessarily windows that you could or or a telephone system into the lab where you could get a message to someone without somebody without training tracking in and out of the facility. Um. And over time they developed larger concepts of things for the entire room to have directional airflow so that when you’re opening and closing doors, you’re not having the potential for these aerosols to move about buildings. And and in those very early days of of microbiological research, there were some considerable outbreaks that occurred simply because of airflow moving around buildings, people being sickened with Q fever on different floors of a building than where the research was actually occurring.

 

Dr. Abdul El-Sayed: Hmm. So two things I want to ask a little bit more about some of these outbreaks. But one of the things we take for granted these days is the degree to which the materials we use can be disposable. You know, back before the advent of plastic. I know that sounds kind of crazy, but before the advent of plastic, everything that you worked on was glass. So the expectation was you couldn’t just throw it away. For the most part, you needed to clean it. And at that time, right, what we understood first about about these biological organisms was somewhat new. I mean, just the distance between us and that time is longer than the distance between that time and the advent uh and the acceptance of the germ theory of disease. Just want to put that out there for folks, right? You’re talking about 60 years from the general understanding and appreciation of germ theory. And right now it’s been 70 years since World War Two. So we’ve come a long way. And they didn’t really have the means of being able to throw things away and do it securely uh as we do now. So it really was a much more difficult uh scenario. Can you tell us about some of the the lab leaks that that happened out of America’s biological weapons laboratory in World War Two? 

 

Alison Young: Sure. One of the things that surprised me the most is when you think about laboratory accidents and people becoming infected um in labs. You think about scientists. But the reality is, is that there are all kinds of people who are working in these labs who are at risk. And so during that biological weapons program era, you had people who were doing what you’re talking about. They were dishwashers who were handling the glassware, the beakers and and the pipettes and all of the different kinds of petri dishes. They had to be washed. And so you would have these these dishwashers who would be working with these materials, um and they would become infected with the various agents that were under study. You had construction workers in these labs who were working maybe on the air handling system up above and because these were not sealed labs in those times, they became infected. There were people who were opening the mail that was delivering specimens from various labs, and they didn’t have protections, nor was there necessarily the kinds of protective materials used to to transport um these specimens. And they would also become infected. Um. But it wasn’t just the people in the labs. And of course, the microbiologists also became infected with these various organisms. But you also had sometimes these people becoming infected at home. There are there are I reviewed a bunch of archival materials from Camp Detrick, and they showed instances where in one case, a woman who whose husband worked with anthrax, um she ended up with this lesion on her face, and they initially thought it was a pimple and didn’t think much of it. And over time, it kept getting worse and worse and worse. And it turned out that that somehow or another, he had brought anthrax spores home. Um. He didn’t become infected, but she did. So there were a variety of ways. There were also even people who were infected working um in in the waste disposal plant um at Camp Detrick. Um. So there were a variety of people. And and one of the things that I wasn’t really prepared for is there were a handful of deaths um during that era that had received publicity over the years. But what had not been reported before my book was the number of people, 400 or so people who were very seriously sickened, some of them with absolutely debilitating recurring illnesses that affected the rest of their lives. Um. And the other thing that was pretty interesting as well is that during World War Two, a number of those who were being sickened were women who had come to work in the biological research labs because of the opportunity it afforded to them during that era. 

 

Dr. Abdul El-Sayed: Wow. So you wrote this book not just to tell us about our history, but to speak to a present that a lot of folks don’t really know much about. If you’re watching the public discussion about science, particularly in the post-COVID era. This should hearken to a conversation right now we’re having about gain of function research. What is gain of function research? Why is it done? And and what is the what is the what is the consequence of our doing it? 

 

Alison Young: Sure. So gain of function research is a term that actually now is is debated as to what it should actually be called. Part of the problem is how things are defined. Um. But gain of function research or gain of function research of concern is the act of scientists manipulating pathogens in ways that makes them more dangerous than what’s found in nature. And that can be making them more transmissible. It can make them more deadly or causing more severe disease or perhaps being able to infect species that they don’t otherwise infect in nature. And all of those things are potentially more dangerous. And the reason for doing it, um the proponents say that this kind of scientific research will help understand how pathogens, particularly viruses, may mutate on their own in nature and perhaps give us a head start on creating various vaccines or treatments or tests uh that without this information, we couldn’t wouldn’t have. That said, those who are the opponents of this kind of research question whether there has been any meaningful benefit, I mean, concrete benefit from this research beyond understanding general understanding and science. And so it’s been incredibly controversial going back to when some of the first experiments really raised public awareness of this issue back in late 2011. To this day and so most people may not have ever paid attention in the general public to the idea of gain of function research of concern until the whole debate erupted with Anthony Fauci and uh Rand Paul. 

 

Dr. Abdul El-Sayed: So this is the of course, the implicit risk is that you do gain of function research. You create something that didn’t otherwise exist in nature and where it was supposed to inform you about what you otherwise might do if you found something similar in nature. It becomes the thing that hops into nature that creates the problem in the first place. You know you, the premise of your book is that this this kind of thing isn’t just theoretical. It happens a lot more than we would care to talk about. And I want to ask, in the course of the gain of function research that is done, how often has it actually hopped out of the laboratory? And, you know, as you think about it, we can get to this, but do the risks outweigh the benefits? 

 

Alison Young: That is, I think, one of the biggest questions out there right now as far as a public policy question is that risk benefit question. So when it comes to lab accidents that have potentially started an epidemic or outbreaks, the good news is those have been relatively rare. Over the history of lab accidents there have been a number of outbreaks that have been connected to laboratory research. So for instance, and these are not necessarily gain of function research because that’s a relatively new technology, but it’s what is of concern here. So you have when the first SARS virus um before COVID 19, there was another severe acute respiratory syndrome virus that started a very concerning epidemic, um and it was brought under control through intensive public health measures, which is fabulous. But after it was stopped, that outbreak was stopped. That SARS virus escaped labs multiple times. And it goes to show how it can get out. There was a 1977 seasonal flu epidemic that swept the world. And when scientists looked closely at that strain of the virus, it looked like it had been frozen in time from decades earlier. And the leading theories on it are that it was some sort of a lab accident and it literally had been frozen in time um or that it was some sort of a vaccine trial that that, you know, had a problem with it. But either way, it was a research related incident. And so there have been a number of these kinds of outbreaks that have occurred over time. And that’s the reason why the U.S. Government Accountability Office, which is the nonpartisan investigative arm of Congress, it has warned for decades that the growing number of labs and the growing number of labs that are manipulating these pathogens in ways that make them more dangerous than what’s found in nature are putting us at at risk of a catastrophic accident. [music break]. 

 

[AD BREAK]

 

Dr. Abdul El-Sayed: You wrote about one event that happened in 2019 in Wisconsin. And I think it’s it’s really instructive because it involves the most likely space where this is going to happen, which is a university laboratory. It involves a potentially dangerous pathogen. And it illustrates, you know, just by its circumstances how happenstance this kind of thing can can be. Can you tell us a little bit about that 2019 event? 

 

Alison Young: Sure. So in in 2019, the there’s a lab at the University of Wisconsin, Madison, that has been at the forefront of doing gain of function research of concern. And in fact, this particular lab is one of two in the world. Um. The other was in the Netherlands. That kicked off the entire gain of function debate back in 2011, and there ended up being a moratorium on federally funded research in the United States for a number of years. And finally, this research was allowed to resume and the University of Wisconsin was working with the very same virus that had caused all of this controversy. It’s it’s a um a mammalian transmissible strain of H5N1 influenza. So in layperson’s term, this is um a uh an avian or a bird flu virus that has been of great concern because of its potential to cause a pandemic if it were able to spread easily between humans. And this particular virus had been created in the lab. And it was capable of spreading through the air between ferrets. And the reason that’s important is ferrets are the stand in for humans. They are the animal model for studying how a flu virus might behave in humans. So in in December of 2019, the University of Wisconsin was working with this very controversial flu virus. And they had three scientists who were in a very, very high tech lab. It’s a biosafety level three lab, it’s the second highest level. And they actually had enhanced this lab. They had even extra procedures in place. These scientists were wearing full body Tyvek suits and double gloves and and most importantly, they were wearing something called a PAPR, which is a powered air purifying respirator. It’s it’s sort of a facemask and it has what almost looks like a vacuum cleaner tube going down their back to a blower unit. And it basically supplies them with a constant supply of filtered safe air. And the reason that’s important is because of the potential for infectious uh aerosols being in the environment of the lab. And so they’re working with these ferrets that are in a transmission study. Um. And and they’re taking specimens from them. There are these three scientists who are in there. One of them is a trainee um who is standing back and watching um as all of this is going on. Um. And then all of a sudden the trainee realizes that the hose to their respirator unit that’s supplying them with this safe, clean air has detached. And so the hose is dangling there in the laboratory’s air, the air that this device is designed to protect them from coming into contact with. Um. And so this immediately caused them to begin the exit procedures to safely leave the laboratory. They’re calling the emergency number. This this trainee is told to begin the quarantine procedures because the idea is, is that if you’re potentially exposed to something, you don’t obviously want to come into contact with your friends, your family and others. But somewhere along the line, the University of Wisconsin decided to end that quarantine process. And part of what what I found in reviewing all kinds of of documents from both state and federal records requests, as well as doing interviews with the NIH, um there were a variety of efforts that were made to not report this incident as required. 

 

Dr. Abdul El-Sayed: Wow. So any listener of the podcast knows I love science, and by definition, I also really, really appreciate scientists. And one of the frustrations that I often have with the way we think of ourselves as scientists is that we assume ourselves to be so free of bias that when our own bias hits us in the face, we ignore it because we are, of course, people who are unbiased. And one of the biases of science is to do more science. And you can see the incentives at play here. This is a laboratory that was already in the public eye because of a question about gain of function research. There is probably a mistake that was um, you know, no mendacity involved, just just an honest mistake made. And they didn’t want to jeopardize the science. So there were a lot of efforts not to disclose. And that’s the kind of thing that leaves people not trusting science. And so we’re in this situation where the bias, of course, folks to know is so powerful that sometimes it keeps us from doing some really dangerous things. So the obvious analog here based on an episode we recently just did is AI. You have a number of AI researchers who will tell you that there is a non-zero probability that they call P doom, that this is going to uh end or at least severely disempower humanity. And yet the power of curiosity is so great that they keep progressing forward. And it’s also the power of capitalism being very great. And in this case, you see the same sort of need to know that you can see in the short term making a set of decisions that could cost us dearly in the long term. How do you feel like the science community has dealt with these risks? Because of course it is scientists, albeit making um what are honest mistakes, putting this kind of science in a position where it could hurt a lot of people and at the same time it’s scientists who claim a monopoly on decision making power about what science gets done that are pushing this forward despite it. How how should the public debate play out? I know how it has played out and we can talk a little bit about that, but how should have the public debate played out if we would have had a mature conversation about this um earlier on? 

 

Alison Young: Yeah. So you raise a number of really important issues um and everything from what what level of participation should the general public have in these decisions, um should it only be the scientists involved in the conversations, and I think there are are many people who believe that the public, because it does have a stake in safety and we’ve all seen how um pathogens can sweep the globe pretty pretty rapidly and affect everyone’s lives. Um. I think one of the biggest challenges to having an informed debate besides the politics, which is a whole nother area, is the secrecy in this in this arena. Um. I have reported on laboratory accidents for 15 years. I mean, many people right now are only now thinking about the issues of biosafety and lab accidents because of the questions about where did COVID 19 come from. Um. But but this has been an ongoing issue and an ongoing problem for as long as I’ve been covering it. And it’s incredibly difficult to find out about lab accidents. And and there are a number of incentives at play not only those that you’ve talked about from the scientific community, but also from the regulatory community. Um. There is very much a patchwork quilt of oversight of labs. And what there is, is largely dependent upon the labs policing themselves when it comes to biosafety. And number two, where there is some actual regulation as opposed to guidance in this area, the agencies that are overseeing this work, they often are funding the work that they’re overseeing, which is a potential conflict of interest, and they often operate their own labs and have their own incentives for not necessarily having their incidents and accidents and embarrassing situations out there in the public view. And from a public policy perspective, in so many other areas of of public oversight of issues that involve public safety, there is transparency, because it is one of the most important things for accountability as well as for public trust. And and transparency is something that is very much lacking in this arena.

 

Dr. Abdul El-Sayed: And I just I want folks to appreciate a couple of things. The competition implicit in science, especially high stakes science like this, is gigantic. And it’s not that there’s a winner or a loser. There’s a first to, it’s a race, and nobody wants to be the one who came in just second. And the problem with that is that incentivizes, by definition, a level of haste that can be dangerous. The second piece of it is when institutions themselves are in this race, they have a implicit conflict of interest, and they trust that they necessarily put in the quality and tactical quality of the science that is done wherein there is this sort of agreement that hey you guys are going to do a good job. Right? And they’re like, yeah of course we are. So it’s like, okay, so we don’t have to regulate you like this, right? Right. Okay, cool. Yeah, fantastic. Because here’s the thing. It all comes back to money. The labs that win these races get more funding, and the funding isn’t just for those labs. There are huge levels of what we call indirects that come from those winnings, those those grants and those indirects fund those whole institutions. Right. And any any academic medical center, it is getting a large proportion. I just want you think about this, um there are only about at most, 250, maybe 300 medical students in a class. They’re not making that much money off medical school tuition. Medical schools make their money off of NIH indirects. And so they don’t want to put those indirects at risk. And so you have a if you have a uh a high flying laboratory that’s producing the best knowledge, they’re competing for the biggest grants, they’re getting the biggest indirects. And that medical center is thriving. That’s how that industry works. And look, I mean I, you know, to in some degree, we’re not talking about a lot of the mendacity of other industries that we sometimes take to the woodshed here. But in all of that, what can be lost sometimes is the risk. And then on top of that, no scientist is ever going to come out and be like, I admit that I’m a pretty sloppy scientist, right? [laugh] Bad things can happen in my lab. And so, you know, because of that, the assumption of excellence that, of course, any uh precise scientist will take in their work. There is, of course, the risk that you’re just not going to get the kind of oversight that you need. So there has to be some level of independence. And then on top of that, every time one of these things happens and does not go uh well reported or openly addressed, people lose trust in not just that institution, but they lose trust in science itself because they’ll say if you aren’t willing to disclose something that could have hurt me, how do I trust you to tell me what could have helped me? And I think that this is something that we all need to be thinking a lot about. And there just needs to be some objective third party that’s able to engage here. Now, you brought up COVID, and I was I was I wanted to get to that sort of toward the end of our conversation because, of course, it is the uh elephant um or the raccoon dog in the room. And obviously you have an outbreak stemming in Wuhan where there happens to be an institute of virology like just down the street in a time where there is heightened tension between the United States and and China. What do we understand about the research that was done at the Wuhan Institute of Virology and the possibility that COVID could have evolved there? 
Now I find the raccoon dog evidence relatively convincing, but it doesn’t necessarily imply that that raccoon dog could not have been infected by something that came out of the Institute of Virology. I think you, one has to, you know, think through all of the potentiality here. And calling the coincidence does not make or does not a conspiracy theory one make. I’m just saying that one has to be asking all of the possible questions, especially considering the fact that when you read any of the reports, the degree to which the sci– the the the Chinese government has tried to obfuscate with some like absurd theories like COVID was trucked in in foods from other countries. You know like so why didn’t they have an outbreak where they were freezing it? But like anyway, all of this forces us to ask the question, so what kind of research was done at the Wuhan Institute of Virology? What is the evidence that supports or goes against the possibility that it could have uh actually uh leaked from that lab? 

 

Alison Young: So I mean one of the, there are two leading theories for how COVID 19 began. I mean, the the obvious way uh for the pandemic to begin is the way that basically the vast majority of all pandemics began, which is that it’s some sort of jump from of a novel uh virus, from an animal to a human or through an intermediate animal to a human. Um. And so that there is is one sort of school of thought that this virus um somehow came through the wild animal trade from possibly raccoon dogs or some other susceptible animal that was sold at a at a wildlife market where animals were sold for food in Wuhan. Um. The other prevailing theory is that this somehow came from a research related accident that could be the collection of specimens from wild bats in the southern parts of of China, where viruses like this um are found because they’re not found in wild bat populations up in up in the Wuhan area, or that it came from some sort of a cross-contamination or one of the many ways a worker could become infected inside a lab, or possibly that that the waste stream had materials that that got outside the lab. There is circumstantial evidence, I think, for both of these theories out there. And and nothing has been proven so far. And the biggest issue is that there has never been a forensic independent investigation. The Chinese government has not allowed for that. And so, you know, whether we’re going to find out how this pandemic began all these years later is is a question. The lab in Wuhan not only was sort of one of the leading um uh facilities doing coronavirus research and collection of of coronaviruses in the world. So so there is the coincidence that it’s located in the same city where this um where the first cases of the pandemic began. Um. It also was engaged in doing manipulation of um and in research that had the potential to to create lab manipulated viruses um in the world of coronaviruses. So it was doing that kind of work. It was part of a proposal that was never funded. The specific proposal where they actually talked about creating viruses as part of a coalition that involved some U.S. partners and some other international partners to create viruses with a specific feature um that that the COVID 19 virus has. You know, the question is, is did they go on to do that kind of research on their own later? The the director of the Wuhan Institute of Virology’s Biosafety Level four lab has written papers uh raising concerns about larger biosafety uh issues in Chinese labs and the lack of of funding um that they need for certain safety issues. There were cables from members of the U.S. Embassy in China um a few years before um the pandemic began raising concerns, after talking to people inside the Wuhan Institute of Virology who were concerned about their biosafety training. And then lastly, the other concern about the lab in Wuhan is that it was known to be doing research at what’s called biosafety level two with coronaviruses. Um. And and one of one of the challenges in this in this um kind of research is that labs, for the most part, do their own risk assessments as to what kinds of safety precautions they take with various kinds of experiments and various kinds of pathogens. Um. And there is debate about whether coronaviruses like the ones that they were, you know, finding in the wild and bringing back to Wuhan whether that work should be done um at Biosafety Level two, where you do not have the same kinds of precautions and equipment uh to keep people from becoming exposed. 
And I will say that one of the other things I think that’s important to note, we’ve we’ve talked about laboratory accidents and and so one of the ways, an obvious way that a pandemic can begin is if you’re working with a brand new virus that no one has ever seen before or you’re manipulating viruses and one of your workers becomes ill, if that particular virus can be asymptomatic, that you are infected, but you don’t show obvious symptoms and you go home and you potentially infect your family or someone else, you can start a chain of transmission um that way. And one of the things that that the research has shown over time and that they found way back in those days at Camp Detrick is that lab accidents when when people become infected in labs, most of the time they have no idea how they became infected because they have no recognized accident that occurs. They just simply are going about their business and they somehow encounter the pathogen, either because maybe they didn’t take their protective gear on and off carefully. Maybe they touch their face, maybe, you know, they tracked it on their clothing or they inhaled something. Um. But I think that’s something that’s really important to remember is that most of the time when those infections occurred, they had no idea that an accident had occurred. So you could become infected and and not even know that something had happened. 

 

Dr. Abdul El-Sayed: Yeah, I want folks to just think about how often I mean, most the vast majority of us have been infected by COVID, knowingly or unknowingly. And if you got COVID during the pandemic, chances are you might have known who you got it from. But high probability is you actually don’t know how you got it. We’re talking about a virion that is even smaller than the smallest living thing. And there’s a whole debate about whether or not viruses are actually alive. And so you’re not necessarily going to know if you got it. So you’ve got to imagine you’re doing this research and you go home one day and then three days later your you know mother in law is ill. And the connection that you’ll make, high a high probability you don’t make the connection, because of course, we all know that COVID symptoms are nonspecific, right? You don’t really know that it’s COVID outside of it being a COVID pandemic, which of course if you’re you know, patient zero, not knowingly. You don’t know that you just started off a pandemic. So this is the this is the danger. And, you know, I just want folks to remember that the prevailing theory of a large part of the intelligence community, including the Department of Energy that oversees a lot of our national laboratories is that, in fact, COVID started via lab leak. So it is it’s difficult to hear this. But I do want folks to understand that when accidents happen and you’re now dealing with virus particles that you’ve intentionally engineered to potentially do great harm so you could understand how to address that great harm that potentially could yield to where we are now. Look, good evidence that there was COVID genetic material and raccoon dog genetic material had mixed, but again, is really hard to ignore a lot of these coincidences. So you obviously wrote this book, and I can only imagine some of the vitriol that um you’ve had directed at you from folks who would argue that your work demonstrates fully that, you know, this is one [?], one big conspiracy theory, or folks who say that you’re putting our science way far behind, how dare you impugn the good name of of science? But you wrote this book for a reason. So I want to ask you, what was the intended goal for you in this intervention on the public conversation? What do you want folks to understand? And then what do you want them to do because they understand it? 

 

Alison Young: You know, I wrote this book because I have this odd sort of specialty where I’ve been writing about these lab accidents for 15 years. And and and I’m not the only person who has been raising um a flag about these kinds of incidents. There are groups like the GAO who have warned about these issues. There was a time in Congress when both Democrats and Republicans were holding hearings together, expressing concerns about lab accidents. I wrote this book because this is a really important moment in time. Our technologies are allowing more people and more institutions to do increasingly risky experiments with pathogens. Um. And as someone I talked to for the book noted, at some point, this isn’t going to just be confined to labs. Um. There’s a lot of press going on right now about the debate over AI. But actually, AI also plays a role in the whole sort of biotechnology risks that are here. I think it’s it’s really important for the public to educate themselves. That the COVID lab leak hypothesis was branded from the start as a conspiracy theory is is really unfortunate. And it’s something that that I don’t think should have happened based on on all of my years reporting in this arena. And so I wrote this book so that hopefully people will inform themselves that there’s far more to this debate than, you know, some of the sound bites that they’ve seen on on television. And and it’s an issue that affects every single one of us, because whether or not COVID 19 came from a lab, there are enough signs saying that that we are at risk of a catastrophic lab accident in the future. And there are safety issues that need to be addressed, but there needs to be policymaker and politician interest in doing that, and that will come from public interest.

 

Dr. Abdul El-Sayed: Well, we really appreciate you uh writing the book and sharing your experience with us. Our guest today was Alison Young. She is the author of the new book Pandora’s Gamble about the risks of lab accidents uh in the past and in the future. Uh. Thank you once again for your time. 

 

Alison Young: Thank you. [music break]

 

Dr. Abdul El-Sayed, narrating: As usual, here’s what I’m watching right now. Last March, an appeals court in Texas, [laugh] you should always watch out when you hear an appeals court in Texas, ruled that a requirement of the Affordable Care Act, that health insurers cover basic preventive services like PrEP medication to prevent HIV, was unconstitutional. The argument the court upheld is that the law gives too much power to an independent expert panel to decide which preventive services must be covered. The ruling took immediate effect, leaving millions without coverage they had thought would be included with their health insurance. But the ruling was appealed to the Fifth Circuit Court, which hasn’t ruled on the constitutionality yet, but issued just recently a stay that left the provision mainly intact. Get this: 62% of the American public supports the provision, and it’s widely held as one of the most important pieces of Obamacare. But, well, conservatives have stopped at nothing to attack the law and the health care it provides millions of people. The Court of Appeals is expected to issue a ruling later this year. And we’ll be watching. Former White House COVID-19 coordinator Ashish Jha resigned his role to go back to civilian life as dean of Brown University’s School of Public Health last week. After the COVID-19 public health emergency sunsetted last month, the plan was to replace his role with a permanent pandemic preparedness director. The catch is, well, they couldn’t really find anyone to fill the role, and that’s because the job comes with little budget or clout, particularly as the country moves past the last pandemic. But this highlights the central challenge of prevention: a prevention paradox. It’s that we all would rather prevent something bad from happening than deal with it after it happens. But unless it’s actively happening, well, prevention rarely rises to the priority we should give it if we’re serious about preventing anything. That means that unless you give a role like this the kind of positional power, prestige, and resources it takes to act, it’s a no-win situation. You don’t have the resources you need to prevent a pandemic. And if a pandemic does happen, well, you’ll be blamed for not having prevented it. Not a great look, but neither is this. Over 60 laws targeting LGBTQ+ Americans have passed in statehouses across the country. They’ve targeted books, participation in school athletics, clothing, bathroom choices and pronouns. All of it has led the Human Rights Campaign to call for a state of emergency for LGBTQ+ Americans and has cast a pall over Pride celebrations around the country this month. Anti-trans legislation is a fundamental assault on basic constitutional rights. It’s designed to stigmatize a group of Americans comprising less than 1% of the population, all for political gain. The mental health consequences among trans folks, particularly kids, are staggering. We’d hoped to bring you an episode digging deeper on gender-affirming care, what it is and what it’s not, in time for Pride Month. Due to some unforeseen scheduling challenges, you’re going to have to wait a couple of weeks, but it’ll be worth it. Watch this space. That’s it for today. On your way out, don’t forget to rate and review. It really does go a long way. Also, if you love the show and want to rep us, I hope you’ll drop by the Crooked store for some America Dissected merch. [music break] America Dissected is a product of Crooked Media.
Our producer is Austin Fisher, our associate producers are Tara Terpstra and Emma Illick-Frank. Vasilis Fotopoulos mixes and masters the show. Production support from Ari Schwartz. Our theme song is by Taka Yasuzawa and Alex Sugiura. Our executive producers are Leo Duran, Sarah Geismer, Michael Martinez, and me. Dr. Abdul El-Sayed, your host. Thanks for listening. [music break] This show is for general information and entertainment purposes only. It’s not intended to provide specific health care or medical advice and should not be construed as providing health care or medical advice. Please consult your physician with any questions related to your own health. The views expressed in this podcast reflect those of the host and his guests and do not necessarily represent the view and opinion of Wayne County, Michigan, or its Department of Health, Human and Veterans Services. 

 
