A first look at lab accidents

If the United States had an effective system of reporting laboratory accidents, a situation like that engulfing the Wuhan Institute of Virology might be avoided. And, obviously, the ends of public health served. And, probably, more accidents avoided in the first place. But the United States does not have an effective system of accident reporting, and there’s little prospect of the National Institutes of Health creating one.

The potential for accidents, and here I include unintended but potentially dangerous outcomes, is the single biggest reason for the existence of the Guidelines. And the potential for those accidents to have (human, animal, or plant) health, economic, and reputational impacts on the public is the biggest reason for the existence of the Guidelines’ Public Access Provisions.

There are a lot of issues to unpack and discuss about lab accidents and their reporting. This first look is only an overview. This is not a post about any specific lab accident, although you can expect such posts in the future.

Several times over the years I have requested accident reports from the National Institutes of Health (NIH), including a request that is presently being processed. I can say that in recent years, the number of accidents reported under the Guidelines has substantially increased. The vast majority of these accidents are minor or even trivial.

But the increase in reported accidents does not necessarily indicate an overall increase in their number. It indicates an increase in the reporting of accidents, especially minor ones.

While the volume of accident reporting at the national level is on an upswing, the recording and reporting of biotech lab accidents in the United States more generally remains wildly inconsistent and fundamentally unreliable. A minority of institutions regularly report incidents to NIH; many never do. Some IBCs regularly receive and discuss accident reports; others don't.

A good number of institutions – I’m looking at you, University of Texas at Austin – neither report accidents to NIH nor discuss them during their IBC meetings, leaving no trace of any incidents in the formal federally-linked record. I believe this is not because accidents don’t happen in Austin – even Texans are human, after all – but because they are not reported.

UT Austin is a good example of the inconsistency in reporting under the Guidelines. A very large public university with extensive labs, UT Austin, according to its IBC minutes and its lack of reports to NIH, hasn't suffered even a trivial lab incident in a very long time. Nothing. Not even a false alarm was recorded.

Meanwhile UCLA, another large public institution, has recently submitted at least two reports to NIH, as has the much smaller University of Miami. Rutgers University has submitted more. And while UT Austin IBC minutes contain no reference to biosafety incidents whatsoever, in early 2020, when Colorado State University suffered a string of lab accidents, they were repeatedly discussed by its IBC, as they should be, especially at an institution doing as much risky research as CSU!

Were UT Austin’s researchers immaculately conceived while scientists in Los Angeles, Miami, New Brunswick, and Fort Collins are sinners like the rest of us? Certainly not. What’s going on here is that UT Austin and the many institutions that operate similarly are either deliberately suppressing information on lab accidents or … this is the even scarier part … they lack effective on-campus reporting systems or host a research culture that strongly discourages reporting.

This leads to the unfortunate situation that, when reporting on lab accidents, the more transparent institutions tend to catch more flak, while the secret keepers frequently avoid public accountability. Such are the perverse incentives of the system.

Nobody, and I mean nobody, in the US biotech lab oversight system is properly incentivized to report accidents. And it has been this way for as long as anyone can remember. It is a problem endemic to a system that places oversight in the hands of an agency that itself has little to no interest in enforcing the rules with institutions that it seeks to keep as its closest friends.

Indeed, even the “rewards” of reporting are, in essence, incentives premised on the extremely high likelihood that if you report, NIH will take no action, and there will be no publicity nor any vigorous inquiry into the incident. Indeed, in the recently uncovered (by your author) case of noncompliance with the Guidelines by the University of Minnesota’s Daniel Voytas, an inventor of gene editing technology, NIH did not even bother to properly document Voytas’ noncompliance before declaring the case closed in under 24 hours.

The only real incentive in the US biotech lab accident reporting system is fear. On the one hand, fear that the consequences of an accident could become so severe that it would come to public attention. On the other, the fear of an exposed individual that he or she will become ill (though in cases I am aware of, a researcher seeking medical attention has not translated into a biosafety incident report).

If an accident involves a “select agent”, a subset of particularly dangerous organisms regulated by CDC and USDA, then there is the fear of major legal consequences (though even this has not been enough to ensure reporting in the past). But reporting to CDC has an “advantage” from a lab’s perspective: CDC’s Select Agent Program is notoriously secretive, and has even denied investigators from the Government Accountability Office (GAO) basic details of accidents.

Fear works in other ways too. Nobody, particularly not a scientist vying for federal funding, wants to be associated with lab accidents. Nor do postdocs and PhD students, who might find their future prospects impaired if they develop a reputation for being klutzy or absent-minded. And if an exposure does happen, many bacterial infections can be treated with antibiotics. And while the lab’s containment is supposed to work, it’s not for scientific purposes that some labs keep antiviral drugs around. A stitch in time saves nine, and may well get you out of having to report an embarrassing needle stick or mouse bite. Or worse.

In general, the US lab accident reporting system is a net designed to let the fish swim through. There are many points of failure:

Researchers should report accidents to the lab director (PI)

PIs should report accidents to biosafety officers

Biosafety officers should report accidents to IBCs

IBCs should report accidents to NIH

NIH should take action in many circumstances

NIH should publish documentation on accidents so the data enters the scientific literature

Of course the system sometimes works. I’m not contending that it’s broken every time. But failure can be documented at every one of the levels above, and at some levels, at many institutions, the failure is complete.

All of the issues raised in this first post will be revisited – some frequently – in the future.

