"Hi, I'm from the government and I'm here to help—you become infected."
Surgeon and New Yorker staff writer Atul Gawande has a scary op/ed in Sunday's New York Times. As Gawande reports:
A year ago, researchers at Johns Hopkins University published the results of a program that instituted in nearly every intensive care unit in Michigan a simple five-step checklist designed to prevent certain hospital infections. It reminds doctors to make sure, for example, that before putting large intravenous lines into patients, they actually wash their hands and don a sterile gown and gloves.
The results were stunning. Within three months, the rate of bloodstream infections from these I.V. lines fell by two-thirds. The average I.C.U. cut its infection rate from 4 percent to zero. Over 18 months, the program saved more than 1,500 lives and nearly $200 million.
Great. You'd think hearty congratulations would be in order for researchers who found a cheap, easy way to save lives. Not so fast. The Feds have shut the program down. Gawande explains:
…the Office for Human Research Protections shut the program down. The agency issued notice to the researchers and the Michigan Health and Hospital Association that, by introducing a checklist and tracking the results without written, informed consent from each patient and health-care provider, they had violated scientific ethics regulations. Johns Hopkins had to halt not only the program in Michigan but also its plans to extend it to hospitals in New Jersey and Rhode Island.
The Feds' ethics bureaucracy is treating such checklists as though they were experimental drugs being injected into the bodies of patients without their permission. It may be a failure of my ethical imagination, but I can't believe that just checking to see that you're doing what you're supposed to be doing anyway--preventing infections--is somehow unethical.
Whole Gawande op/ed here.
If the Army (and the rest of DoD) is not already using it, I believe a memo could get it in place from top to bottom in a week or two. Sorry, not for USAian civilians though.
Bureaucrats opposed to common sense? Stop the presses! That's never happened before!
-jcr
Quick question: why were the docs not (apparently) wearing gloves?
Captain Chaos -- STERILE gloves. RTFA a bit closer.
Doctors who don't believe in germ theory scare me more than politicians who don't believe in evolution.
Of course if the government were to *mandate* such a checklist it would be another example of heavy-handed government interference in health care which always causes more harm than good.
I have run into this before, believe it or not.
The problem arises from the over-broad definition of "research" used in the medical ethics/"human subject protection" world, which includes the collection of data for (potential) publication, even if the activities that produce the data are otherwise routine.
By collecting the data for publication, under current definitions you turn the entire activity into research. I believe that, if they hadn't collected the data for publication, but had done so purely for the internal quality control activities of the hospitals, this would have been fine, but even that rather benign-sounding interpretation is not altogether uncontroversial. I would, however, have no hesitation approving this program for my hospital clients, with the proviso that we would use any data only internally and not publish anything.
e,
Governments regularly demand that businesses operating food and beverage establishments meet health codes. Some libertarians may argue that a free-market system could achieve the same minimum standards, but that's not the way things are operating, so you assume that the government minimum must be met.
Apparently, however, if you volunteer to go above and beyond the government minimum, the bureaucracy views that as as egregious an infraction as not meeting the minimum.
I'm sure there is some legal issue that a government paper pusher feels there is a good reason to address before this policy is implemented, but common sense tells you not to shut down a program that helps save lives while the filing method is addressed.
e,
Anyone who says that every act of government interference causes more harm than good is a fool. But few thoughtful people say that. Instead, what they say is that giving government the power to issue such mandates will, on balance, cause more harm than good when you consider the aggregate effect of all the mandates.
If you can figure out a way to get government to pass only helpful mandates, and avoid all harmful mandates, let us know.
I see Dean has explained the potential issue.
Well, RCD, to be fair the current medical ethics rules were put into place as a reaction against events like Tuskegee. A little overreaction and caution given what the medical community found itself capable of is fairly forgivable.
That said, this is obviously a ridiculous ruling, and nanny overprotection at its worst. Ethical regulations are also supposed to obey the rules of common sense, and this ruling is a textbook violation of them.
What jtuf said
RCD,
I think you pegged it.
Until very recently, I advised a pretty good-sized research institution with three IRBs. As "evidence-based" medicine gains traction and implementation of electronic medical records spreads, the ability and the desire to harvest data about how processes and protocols impact outcomes are growing.
The docs were absolutely horrified to learn that, if you gathered data with the (subjective) intention to someday, maybe, publish your findings, you were violating the law unless every single patient whose data you looked at signed a consent form, in advance, allowing their records to be used for "research."
Needless to say, the rules in this area are lagging way behind current realities, and are really starting to choke off the dissemination and even discovery of better ways to do things.
RCD, I agree, but patient privacy is a very important thing to preserve, no? Just as there are legitimate uses for such data, it can also be used maliciously (esp. if there is access to the source data and charts of individuals); it would be difficult, I think, to regulate such that privacy was respected and yet researchers had easy access to the data that makes their work possible.
R C-
We have to deal with similar rules in teaching. With all the push for "outcomes assessment" and whatnot, more faculty are starting to collect data on their teaching practices (e.g. assign two physics problems that have the same underlying concepts, but different contexts, to see whether students can recognize the similarities). As long as we only use this data internally we're fine, but if we try to present it to outsiders we need IRB approval. The definitions of "outsiders" and "research" become tricky. Is showing this to an accreditation organization OK? Being part of a larger state university system, is somebody from another campus of the state university considered an outsider? The rule seems to be that when in doubt get IRB approval, and even if you don't need IRB approval get the IRB to stamp "doesn't need approval" on it.
I'm fairly new at this school, so I'm trying to figure out whether presenting lots of data on "outcomes assessment" is necessary for tenure, or whether I can get away with just, you know, teaching classes and publishing physics research, and doing some token effort at assessment (so I don't have to spend lots of time with the IRB).
Typical human error rates run from 10^-1 to 10^-3 going from non-routine to routine/automatic tasks (think of the frequency of starting the coffee without having put the pot on the burner).
Confirming even routine good practice by a checklist is Quality Assurance. It's why NASA, airlines, nuke plants, and their ilk are required to use checklists, even for routines they've done hundreds of times.
MDs and nurses should have to do the same, although one must use caution. Imposing checklists for items that are truly skill of the trade can actually increase error rate due to frustration or rebellion factors.
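The checklist-as-QA idea above can be sketched in a few lines. This is only a toy illustration: the step names below are hypothetical stand-ins, not the actual Hopkins protocol (the article names hand-washing, a sterile gown, and gloves among its five steps).

```python
# Toy sketch of enforcing a pre-procedure checklist, in the spirit of the
# Michigan program. Step names are illustrative, not the real protocol.

CENTRAL_LINE_CHECKLIST = [
    "wash hands",
    "clean skin with antiseptic",
    "drape the patient",
    "wear sterile gown and gloves",
    "apply sterile dressing",
]

def ready_to_proceed(completed):
    """Return the checklist steps still outstanding; empty means go ahead."""
    return [step for step in CENTRAL_LINE_CHECKLIST if step not in completed]

missing = ready_to_proceed({"wash hands", "drape the patient"})
print(missing)  # three steps still outstanding
```

The point of the exercise is the same as the aviation case: the check is cheap, and it catches exactly the routine omissions that the 10^-3 error rate guarantees will eventually happen.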
Just as there are legitimate uses for such data, it can also be used maliciously (esp. if there is access to the source data and charts of individuals); it would be difficult, I think, to regulate such that privacy was respected and yet researchers had easy access to the data that makes their work possible.
Electronic data systems make it very easy to track who has accessed data, making it easy to target and punish malicious uses of data. The solution in an EMR world isn't blanket prohibitions on research; it is robust auditing and real penalties for illegitimate uses/disclosures.
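The audit-instead-of-prohibit approach is simple to sketch. A minimal, assumed example follows (an in-memory store and hand-rolled log; real EMR systems write to tamper-evident storage and tie entries to authenticated user identities):

```python
# Minimal sketch of access auditing for an electronic records store.
# The store and log here are hypothetical illustrations only.
import datetime

audit_log = []

def read_record(store, user, patient_id):
    """Return a record, recording who read it and when."""
    audit_log.append({
        "user": user,
        "patient": patient_id,
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    return store.get(patient_id)

store = {"p1": {"infection": False}}
read_record(store, "dr_smith", "p1")
print(audit_log[0]["user"])  # dr_smith
```

Every read leaves a trail, so misuse can be detected and punished after the fact rather than blocked in advance.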
The rule seems to be that when in doubt get IRB approval, and even if you don't need IRB approval get the IRB to stamp "doesn't need approval" on it.
Same in the medical world. The problem is that this bogs your IRB down to the point where it can't assess and monitor the stuff that really poses a risk. Thus, R.C.'s Third Iron Law: "If everything is a priority, nothing is a priority."
Thus, R.C.'s Third Iron Law: "If everything is a priority, nothing is a priority."
1) I totally agree.
2) What are your first and second iron laws?
Tbone, Would framing such a program as QA get it out from under OHRP oversight?
Dean, I can't see how a practitioner checklist program could produce output data injurious to patients or their privacy--am I missing something?
So, what can we all do about it? What elected official has clout over the OHRP? What exposure making them look ridiculous for this ruling can we add to Reason's? Is there an appeals procedure for Johns Hopkins? Does that provide for the equivalent of amicus curiae briefs? Any suggestions for finding leverage?
2) What are your first and second iron laws?
The First Iron Law: You get more of what you reward, and less of what you punish.
The Second Iron Law: The less you know about something, the easier it is.
Tbone, Would framing such a program as QA get it out from under OHRP oversight?
Not if you are also planning to publish.
Dean, I can't see how a practitioner checklist program could produce output data injurious to patients or their privacy--am I missing something?
Not with the most minimal safeguards re: deidentifying data before distribution, no.
Unfortunately, there is probably no good way to allow any "researcher" to use the data without access to fairly fine-grained information that would not meet deidentification standards. The problem is that the current rules treat disclosure to the researcher on pretty much the same basis as publication in a journal. You can certainly de-identify for the latter but not the former, yet the current rules just don't really allow that distinction.
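To make the de-identification side of that concrete, here is a minimal sketch with hypothetical field names; actual HIPAA "Safe Harbor" de-identification covers eighteen identifier categories, not just these:

```python
# Minimal sketch of de-identifying a patient record before publication.
# Field names are hypothetical; real rules enumerate many more identifiers.

DIRECT_IDENTIFIERS = {"name", "address", "phone", "ssn", "mrn"}

def deidentify(record):
    """Drop direct identifiers and coarsen the admit date to a year."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "admit_date" in clean:
        clean["admit_year"] = clean.pop("admit_date")[:4]  # keep year only
    return clean

record = {"name": "J. Doe", "mrn": "12345",
          "admit_date": "2007-11-03", "infection": False}
print(deidentify(record))  # identifiers gone, year retained
```

This is trivial for publication-grade data; the bind described above is that the researcher doing the analysis often needs the fine-grained fields this function deletes.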
For anyone interested in more detail behind the original program, here's a link to Gawande's article in The New Yorker: The Checklist
A much lengthier treatment of the same subject, by the same author (Dr. Atul Gawande) ran in The New Yorker a few weeks back.
Whoops, Trishb, should have previewed. 🙂
I'm fairly new at this school, so I'm trying to figure out whether presenting lots of data on "outcomes assessment" is necessary for tenure, or whether I can get away with just, you know, teaching classes and publishing physics research, and doing some token effort at assessment (so I don't have to spend lots of time with the IRB).
PhD in optics and you are dicking around trying to get tenure. Go into the private sector already.
The First Iron Law: You get more of what you reward, and less of what you punish.
*ahem* War on Drugs?
War on Drugs?
A great illustration of the First Iron Law. The WOD delivers great rewards to its supporters in the law enforcement and political/bureaucratic communities.
Those who are punished by it tend not to support it so much. Those who profit from it? Oh, yeah.
Quick question: why were the docs not (apparently) wearing gloves?
More than likely because the docs are already washing their hands many times per day, so the docs get into a mode (one any of us could be guilty of) where they think their hands aren't dirty enough to spread infection.
Apparently, however, if you volunteer to go above and beyond the government minimum, the bureaucracy views that as as egregious an infraction as not meeting the minimum.
I would say that the government views that you're not meeting the standard, minimum or not.
Am I the only one who sees this (above) as a much greater invasion of privacy?
In the original "research" cited in the article, the doctors were tracking infection levels as a function of cleanliness and check lists. No individual data was (at least apparently) a part of the conclusions reached.
If the institution follows the guidance above (signatures and permissions), then the patients who sign the permission slips have signed away their individual privacy.
It seems to me that the "official" method yields a far greater loss of privacy.
If the institution follows the guidance above (signatures and permissions), then the patients who sign the permission slips have signed away their individual privacy.
I'm not an expert on this stuff, Wayne, but I've never seen a published report with identifying information. Even in an approved study identifying information is still kept confidential. The main purpose of the IRB is to prevent studies that expose patients to risk without consent.
Well, that and to have an IRB to show that there's an official board responsible for something, so that asses can be suitably covered.
Mind you, I don't think that the process in place does a terribly good job of achieving its stated objective, but the release of information is not (to the best of my knowledge) the sort of thing that can happen just because the IRB approves a study.
If you want to worry about confidentiality, worry about all those transfers of records between your General Practitioner, insurance company, specialists, hospital, diagnostic labs, etc. Lots of good stuff floating around there, just waiting for a smart thief.
I once heard a presentation from a touring govt bigwig who explained that failure of us researchers to file proper IRB paperwork, even for experiments as harmless as a healthy grad student wearing a plastic bowl on the head for 30 minutes, was as bad as a NAZI DOCTOR LIKE MENGELE!!!
Although I did not say anything at the time, I found this personally offensive. It was not failure to properly fill out the required paperwork that characterized the ethical problems with Nazi medical experiments.