Yet another IRB horror story

The IRB (institutional review board) is this weird bureaucracy, often staffed by helpful and well-meaning people but generally out of control, as it operates on an if-it’s-not-allowed-it’s-forbidden principle. As an example, Jonathan Falk points us to this Kafkaesque story from Scott Alexander, which ends up like this:

Faced with submitting twenty-seven new pieces of paperwork to correct our twenty-seven infractions, Dr. W and I [Alexander] gave up. We shredded the patient data and the Secret Code Log. We told all the newbies they could give up and go home. We submitted the Project Closure Form to the woman in the corner office (who as far as I know still hasn’t completed her Pre-Study Training). We told the IRB that they had won, fair and square; we surrendered unconditionally.

They didn’t seem the least bit surprised. . . .

I feel like some scientists do amazingly crappy studies that couldn’t possibly prove anything, but get away with it because they have a well-funded team of clerks and secretaries who handle the paperwork for them. And that I, who was trying to do everything right, got ground down with so many pointless security-theater-style regulations that I’m never going to be able to do the research I would need to show they’re wrong. . . .

We’ve discussed IRB nightmares before; see here and here. And here’s a discussion from Macartan Humphreys on how ethical concerns differ in health and social science research.

21 thoughts on “Yet another IRB horror story”

  1. Having read the first several paragraphs of Scott’s post, I think he greatly misrepresents the patient risks in unreviewed medical studies.

    He essentially keeps writing “Nazi doctors did bad things to people, so now I need to fill out 100 forms that say ‘I’m not a Nazi'”. I recognize that he’s joking (and he does begin the post by saying “jokes aren’t true”), but I think that’s really off the mark about risks to patients in medical studies. If he had paid attention to the same reading material he seemed to have so little respect for, I think he would realize it wasn’t *just* the Nazis. It was also US doctors (and Japanese and…). With high frequency. Not “someone once did something bad and now we all have to pay the price”, but closer to “at least every decade before the regulations were in place, US doctors did something truly despicable that is now a national embarrassment”.

    Moreover, when lots of people start thinking about medical studies, they think “doctors are there because they want to help patients. Yeah there were some psycho Nazis, but in general there’s no reason they would harm patients”. However, when you’re actually performing a medical study, you start to realize just how valuable the sick individuals are to your research. IRBs are in place to double check that researchers are not getting so excited about their research that they forget about doing what’s best for their patients.

    Of course, IRBs aren’t perfect; ethics are difficult at scale. But I got the impression that Scott felt IRBs were a huge headache in place for a non-existent issue. Huge headache, sure. Non-existent issue, no.

    • I’ve met a number of medical researchers who are very dismissive of ethics. Typically the older generation. But, yeah, they’ll complain that the IRB is just getting in their way and blocking their good research. But all the villains in the ethics violations we read about felt like they were doing good research.

      The thing I don’t like about Scott’s write-up is that I’m not convinced he put enough thought into his study design. He seems kind of incredulous about every hurdle he describes… “Wait, I’m supposed to anonymize my patient data? Why? The patient data is accessible elsewhere in a non-anonymized record.” or “I should define ‘violent’ as a stopping point?”

      I’ve had challenges with the IRB, too. But they and I have usually had the understanding that the IRB is often applying standards required for large studies of a particular type (usually evaluating treatments with dangerous side effects) to protect patients’ rights. It sounds like Scott got stuck with an IRB that couldn’t wrap their heads around his little study, and the IRB got stuck with someone who couldn’t wrap his head around how his little study was being scrutinized.

      I’m curious why he did a prospective study anyway. Wouldn’t a retrospective medical record review find the results of the screening plus the final diagnosis?

  2. Andrew, Scott and A Reader can all be right. Ethics can be hard, medical studies can be dangerous, people staffing IRBs (I am one) can be well meaning, and, at the same time, IRBs can be wasting a huge amount of researcher time with pointless paperwork and Kafkaesque bureaucracy. Just because you have a real problem doesn’t mean your solution is a good one.

    • +1000 this is SOOOO true throughout just… everything. For example, the minimum wage is a TERRIBLE attempt at a solution to a real problem. A Universal Basic Income (not conditional on number of hours worked, so it has no effect on labor pricing and hence on information transfer) is a much better one, but we don’t have that. Similarly, grant review boards are there to prevent wasting taxpayer money, but it’s grant review boards that approve all the funding for the crappy research we have been discussing for decades on this blog; they become yet another form of regulatory capture… the examples are endless and touch on every aspect of modern society.

      • To which I would add that sometimes a mediocre, or even crappy, solution is the best available in the real world. While I would not claim that IRBs represent the optimal solution to the real problem of abusive or deceptive research (there is certainly room for improvement of the process), if you seriously try to come up with a better one, you may be surprised just how hard it is to do that.

        In fact, I would say that what makes it hard is that dangerous research is hardly ever attributable to “Nazis” or “bad guys.” It is more often, in my 21 years of experience on IRBs, a result of naivete and the kind of self-deception that all people are capable of. It is more often a question of truly not seeing the real problems that are there than it is of wanting to do the wrong things.

        One thing that helped at the institutions where I served on the IRB was having protocols “pre-screened” by me or another member of the Department who was on the IRB. Because we were very aware of things that the IRB might notice and others might not, we were able to help investigators revise their protocols to resolve the difficulties. This process was voluntary, but most investigators used it because the protocols we reviewed usually sailed through the IRB easily and rapidly. I think some version of this arrangement could be used more widely.

        • Do you think the IRB has helped the quality of studies? I haven’t had many IRB reviews, but if I recall correctly there is an evaluation of the study design, no? For example, justifying your sample size.

          I feel like some of the doctors I met who were most vocal about IRB getting in their way were also running kind of poor studies that weren’t accomplishing much beyond getting themselves published. A more permissive IRB would have likely just added more garbage to the published literature.

  3. My impression was that Scott was complaining that the IRB had no common sense. There are appropriate levels of scrutiny depending on the type of study. Using the same standards for a paper survey that one would apply to a risky surgical intervention makes no sense.

  4. My understanding is that the IRB is essentially an institution that transfers liability from researchers to their university (correct me if I’m wrong!). If I, as a researcher, screw up, then the university has to foot the legal bill if the IRB greenlighted the project. If that’s correct, I’m not surprised that IRBs tend to err on the conservative side.

    What infuriates me when I deal with the IRB is the bureaucratic inability to be flexible. I once did a project in Southeast Asia. I was told I needed an official letter from a government official (on letterhead!) stating that I didn’t need IRB approval to do research there. Good luck finding somebody in an authoritarian (or even democratic) country who would write and sign such a document.

  5. I feel like I was dragged almost to the point of needing to be in a psychiatric hospital myself…

    Extremely cool to know that this practicing psychiatrist takes this attitude towards his patients’ problems.

  6. If this writer doesn’t have the capacity to understand why psychiatric inpatients are a vulnerable population in need of special consent procedures, I’m thinking his being dissuaded from human subjects research is a feature, not a bug…

    • Having fully read the blog now, I think it actually reads like a Wansink-level blog post. Not in that the data makes no sense, but in that it reads like an unintended confession.

      As noted by Erin above, very early on the author complains about how they needed to include informed consent in their study. I understand the author’s point (“we were going to do this anyway!”), but from the IRB side, if someone *doesn’t* include informed consent in a human research study and is incredulous that they even need to explain why, that definitely should be a red flag. From there on out, it reads like a list of confessions: “they wanted us to clearly define our protocol, none of our staff actually followed our protocol, we tried to bring on people who never took any of the legally required training, they asked us to secure our data, but we never secure our medical data, we had lost a lot of our consent forms, we clearly violated our own protocols, etc.”

      But most of all, the author seems to miss the whole point of why ethics are difficult in medical studies. As a doctor, you should always be trying to help your patient, and the patient generally believes that. As a scientist, in most cases you hope that you have lots of “good data”, which is usually bad for your patients. This is exactly the conflict of interest that makes medical ethics so tricky. But the author seems to write that they just have to be not a Nazi and everything will be fine. Quite frankly, if you don’t have an understanding of this issue, I’m totally fine with banning you from medical research, even if that does mean one harmless study was blocked.

    • It struck me that the writer had an excellent understanding of his target population and what was and was not likely to work. Even given some ‘slight’ exaggeration on the writer’s part it would seem that the IRB had no idea of what he was trying to do and instead insisted on applying totally inappropriate standards.

      I have only had one or two encounters with an ethics board, and this was for innocuous survey work years ago. The Office of Human Research knew what it was doing, rubber-stamped my proposals, and I was away. But a) we did not have a medical school, and b) this was not in the USA.

  7. Scott’s experiences are sadly common. His hyperbole is likely a reaction to the excess he experienced in IRB review, but the frustrations are legitimate. An unknown number of studies (both well- and poorly-designed) are abandoned because of ridiculously fussy IRBs. Researchers sometimes give up research altogether in favor of other more rewarding activities because the tedious bureaucratic aspects of IRB oversight have become too much for them. Obsessing about fonts, headers, and legalese in informed consents actually makes it less likely that participants will read the consent forms.

    My longest IRB review took one year and eight months. The absurd delay meant my team was unable to recruit sufficient participants to publish. Thus, the study was completed and some internal reports created, but the lack of publication means that these participants were exposed to an interview about a sensitive topic for no reason, and our IRB is the sole culprit in eliminating all benefit from the risk-benefit equation. It also wasted our time, hundreds of hours of it.

  8. Once I wanted to conduct an anonymous online survey of college graduates that presented sample sentences and asked participants whether they noticed a grammar error in each sentence and, if so, how much the error bothered them. The survey would have collected no personal information at all. I couldn’t afford to pay every participant, and my university believes that offering a “randomly chosen participant receives X” reward would violate state laws forbidding lotteries, but fortunately I was able to persuade a textbook publisher to provide a reward: a 10%-off coupon for the publisher’s bookstore for every participant who completed the survey.

    Unfortunately, my IRB absolutely forbade this research because “% off coupons are coercive.” They said the reward had to be a coupon for a particular amount of money. The publisher was not willing to provide that because their online store software could only provide %-off promotional codes. So, I gave up. This research would not have changed the world, but it would not have harmed anyone either.

    This was not my first battle with the IRB, but it was my last, because I stopped doing any research that involved human participants after that.
