An introduction

If the Federal Register is involved, it’s going to be riveting!

I am a student of history, but I don’t profess to be a historian of the NIH Guidelines nor of biotech research in the United States. I do hope, though, that if a history of this obscure topic is one day written, I will earn at least some mention for the work I’ve put in, on and off for twenty years, to preserve the public right to know about the research that it funds at American universities, research institutes and, sometimes, private companies.

The starting point is, really, a deal that was cut over a period of years in the mid to late 1970s between the US government, represented mainly by the National Institutes of Health (NIH), and an assortment of leading biologists representing their disciplines. The deal was about how to govern the safety of genetic engineering experiments, which were starting to happen in earnest at the time.

In simplified terms, the deal was that, in return for not being formally regulated under law – Senator Ted Kennedy and allies introduced a bill to do just that – leading scientists, the “principal investigators” whose names appear on federal grants, would step forward and, with the cooperation of their institutions (e.g. universities), take personal responsibility for research safety. The main mechanism through which this responsibility is exercised at the local level is the Institutional Biosafety Committee (IBC), primarily composed of professors and, especially in more recent times, safety staff.

The document that lays down the “rules” as such for IBCs was issued, and is managed, by NIH. Called the “NIH Guidelines”, for most of its history the Guidelines’ formal title was the NIH Guidelines for Research Involving Recombinant DNA Molecules. With the more recent emergence of synthetic biology, and subsequent revisions, the NIH Guidelines are now formally titled the NIH Guidelines for Research Involving Recombinant or Synthetic Nucleic Acid Molecules.

From the beginning, IBCs were required to keep minutes of their meetings and make them available to the public. As the new system evolved in the mid-70s, and after pressure from the public and some scientists, it was decided that the public’s position in the system, and its transparency, should be made stronger.

A requirement that IBCs have public members, sometimes called outside or unaffiliated members, was put in place and, in late 1978, the requirement for minute keeping was expanded to oblige IBCs to release a far wider range of records (the Guidelines use the word “documents”), such as funded grant proposals, lab inspection reports, and other IBC-related items.

The Federal Register of 22 December 1978, explaining the scope and intent of the Public Access Provisions of the NIH Guidelines (highlight added).

This was the birth of what are now known as the “Public Access Provisions”, particularly the current Section IV-B-2-a-(7), which remains largely untouched to this day, over 40 years later, despite many revisions to the Guidelines in subsequent years.

If IBCs bothered to follow the requirement for outside members at all – and many didn’t – they quickly figured out how to avoid dissenters. Typical “outside” members these days include retired professors, professors at nearby colleges, and local public health officials (who often have many links to the institution). Sometimes you see clergy or others who are unlikely to have subject matter expertise or others who are quite likely to have personal allegiances to the institution.

Public members of IBCs have been captured by institutions, with no objections from NIH, and are thoroughly neutralized as a means by which the public can interact with IBCs in an unbiased fashion. Which leaves access to information – Section IV-B-2-a-(7) – as the primary means by which outsiders can learn of biosafety practices and incidents at research institutions.

In the 1980s and 90s, genetic engineering became more routine, and labs certainly didn’t advertise their obligation to be transparent. This caused the public role in IBCs to wither and, by the early 2000s, the public access provisions were being forgotten.

After 9/11 and the anthrax letters, ironically almost certainly sent by a deranged scientist at a US biodefense lab, money poured into biodefense research. Concerns were raised about the security of this research whereupon, after the usual government convened panels populated by the usual suspects, it was decided that IBCs should add to their biosafety responsibilities by becoming a bulwark of what is termed “biosecurity” (meaning protecting dangerous biological agents from theft and misuse).

It was about that time when this author began trying to reclaim the Public Access Provisions from their state of nearly complete disuse. I performed two national surveys of IBCs over a period of several years in the 2000s, asking each and every IBC in the United States for its meeting minutes. (USA Today later replicated my approach and turned it into a major investigative series, disappointingly without the slightest credit to its inspiration.)

What the 2000s surveys uncovered was stupendously awful. While big shot professors in Washington claimed that their IBCs were robust and responsible, I discovered that this was pure fantasy, a lie peddled so that the universities and their unenthusiastic overseers at NIH could pretend that the “biosecurity” problem was addressed and billions in biodefense checks could keep rolling off the taxpayer’s press.

What I documented included that many IBCs didn’t even exist. At all. Including IBCs at institutions handling very dangerous infectious diseases. They did not meet, and they did not exercise their responsibilities. And NIH, which never wanted the role of enforcer, did not care. I will write more on the salient aspects of this work in future posts.

While there are exceptions, in the 2000s and today, university administrators and principal investigators – people who are generally not accustomed to rigorous public accountability – bristle at the Public Access Provisions. Refusal to honor requests is rampant. Ridiculous redactions are commonplace, and extreme arrogance in the face of public requests is endemic.

And they almost always want to redact all the professors’ names from records, even though professors stepping up and taking personal responsibility is central to the bargain that created the Guidelines. Anonymity and stepping up to take personal responsibility do not go hand in hand.

Many of these university bureaucrats and hot shot professors refuse to acknowledge their obligations. They simply don’t want to comply. And they will fight. I’ve stood in a courtroom alone with my pro bono attorney to square off against (and defeat) over a dozen lab lawyers (ahem, the University of Texas) crowding their tables to do their utmost to fight against letting the public know what is going on in their labs.

Frankly, it can be repugnant. There is little that I loathe more than institutions, public and private, that run on public money and are subject to public information requirements, yet turn the public’s resources against the population and fight for secrecy.

I have earned a reputation for being pugnacious and even rude to university counsels and professors in pursuing records under the Guidelines. That’s not the person that I usually am, but twenty years of dealing with research transparency issues with US labs has taught me that without being very aggressive, you get nowhere. The lawyers can thank themselves for my demeanor.

This year, 2020, as the pandemic strikes and legitimate questions are being asked about its origin, many are thinking again about laboratory safety and security. After a hiatus of several years I’m back in the game. And that is what this website is about.

I firmly believe that transparency in biotechnological research and its oversight is fundamental to both safety and security, including national security. Or, as I would prefer to frame it, global peace and justice.

Transparency generates conditions that promote responsible research conduct and sober consideration of the implications of proposed research. It is also a moral – and sometimes legal – obligation of laboratories and scientists, because mistakes have consequences, because they agreed to transparency to avoid regulation in the first place, and because the research institutions are driven, even saturated, with public funding.

With this as background, we start out…
