As you know, the Boundary Layer blog and citizen-journalist Comradde PhysioProffe have been laying out the case for why institutionally unaffiliated, crowd-funded, ostensibly open science projects should be careful to adhere to traditional, boring, institutionally hidebound "red tape" procedures when it comes to assuring the ethical use of human subjects in their research.

The first issue that attracted our attention was that the initial submission lacked a document indicating that the study had passed review by an institutional review board (IRB). The authors responded by submitting a report, obtained after the initial round of review, from the Association for the Accreditation of Human Research Protection Programs (AAHRPP)–accredited company Independent Review Consulting, Inc. (IRC: San Anselmo, CA), exempting them from review on the basis that their activity is “not human subjects research.” On the face of it, this seems preposterous, but on further review, this decision follows not uncommon practices by most scientists and institutional review boards, both academic and commercial, and is based on a guidance statement from the United States Department of Health and Human Services' Office for Human Research Protections (http://www.hhs.gov/ohrp/humansubjects/guidance/cdebiol.htm). Specifically (and as documented in part C2 of the IRC report), there are two criteria that must be met in order to determine that a study involves human subjects research: will the investigators obtain the data through intervention or interaction with the participants, and will the identity of the subject be readily ascertained by the investigator or associated with the information. For the 23andMe study, the answer to both tests was “no,” ostensibly because there was never any interpersonal contact between investigator and participant (that is, data and samples are provided without participants meeting any investigator), and the participant names are anonymous with respect to the data seen by the investigators. It follows from the logic of the IRC review, in accordance with the OHRP guidance documents, that this study does not involve human subjects research.

Research involving human participants. All research involving human participants must have been approved by the authors' institutional review board or equivalent committee(s), and that board must be named in the manuscript. For research involving human participants, informed consent must have been obtained (or the reason for lack of consent explained — for example, that the data were analyzed anonymously) and all clinical investigation must have been conducted according to the principles expressed in the Declaration of Helsinki. Authors should be able to submit, upon request, a statement from the research ethics committee or institutional review board indicating approval of the research. PLOS editors also encourage authors to submit a sample of a patient consent form, and might require submission on particular occasions.

Obviously, the journal decided to stand on a post-hoc IRB decision that the work in question never "involv[ed] human participants" in the first place. This is not acceptable to me.

The reason why is that any reasonable professional involved with anything like this would understand the potential human subjects concern. Once there is that potential, then the only possible ethical way forward is to seek external review by an IRB or IRB-like body. [It has been a while since I kicked up a stink about "silly little internet polls" back in the Sb days. For those new to the blog, I went so far as to get a ruling from my IRB (informal, true, but I retain the email) on the polls that I might put up.] Obviously, the 23andMe folks were able to do so... after the journal made them. So there is no reason they could not have done so at the start. They overlooked their professional responsibility. Getting permission after the fact is simply not the way things work.

Imagine if in animal subjects research we were to just go ahead and do whatever we wanted and only at the point of publishing the paper try to obtain approval for only those data that we chose to include in that manuscript. Are you kidding me?

Ethical review processes are not there only to certify each paper. They are there to keep the entire enterprise of research using human or nonhuman vertebrate animals as ethical, humane, and responsible as possible.

This is why hairsplitting about "controlling legal authority" when it comes to academic professionals really angers me. We work within these ethical "constraints" ("red tape" as some wag on the Twitts put it) for good reasons and we should fully accept and adopt them. Not put up with them grudgingly, as an irritation, and look for every possible avenue to get ourselves out from under them. We don't leave our professionalism behind when we leave the confines of our University. Ever. We leave it behind when we leave our profession (and some might even suggest our common-decency-humanity) behind.

The problem is that all this "ethics" talk conflates two very different issues. The first is a valid concern about whether the research itself is ethical -- things like the Tuskegee syphilis experiment which you brought up earlier is a good example of why that sort of oversight is needed. But all this other stuff that gets lumped into "ethics" about the necessity of hiding data *is* just red tape and has to go away in the modern world of open data. Yes, I know there is lots of absurd FUD about how the sky would fall if people's genome data and the like were public -- but you know what? Actual *genomicists* don't buy into it -- starting with Craig Venter, several of them have made their own genome sequence public. And I myself have submitted a sample to George Church's Personal Genome Project, and if they ever get around to sequencing it, I'm going to make my data public as well. It doesn't help science to keep data locked up. The 23andMe case is complicated because it isn't clear that helping science is the primary reason why people use their service, but certainly that should be the motivation of anyone participating in a publicly funded project.

suppose that by making your sequence public... you find an offspring you didn't know you had. or an offspring of your dad or grampa that nobody knew about. Or that your dad or grampa aren't really your biological sperm donor ancestor(s). then think about the knock-on informational effects on your family members who didn't consent to their information, by way of you, being made public?

what about your grand kids or great grand kids who may only be theoretical at the moment?

these are but a few issues. sure, *you* may have thought them through. but it is pretty clear that broad, consumer type appeals like 23andme will have people who have most assuredly not considered such things. and to the degree that they *have* considered such things it may be only through the information that the company (or research team) chooses voluntarily to provide. With their own shaping/tailoring of the risks and implications.

The, err, theological fervor of you all OPEn elEvENTY1!!!! types is obvious and hardly comforting that you can make rational decisions about these matters.

The actual regulations, as promulgated in Title 45 of the Code of Federal Regulations (45 CFR 46 and subsections), give the definition of "human subjects" quoted above. There are several categories of research that, in one way or another, involve humans, but do not, under law, require pre-review and monitoring by an IRB. These categories are "exempt" - as your quote above specifically notes - from such review.

The reason for exempt categories is that there are some types of research that inherently do not pose dangers to particular individual subjects. These include statistical or records-based research in which particular individuals are not identified. For instance, simply reviewing university grade records collectively, to determine the statistics of GPA distribution or something like that, without identifying individual students, would not pose a danger to any particular student even though the data used originated with student files; if the data are anonymized, nothing can come back to any individual person, so there is no concern about any possible repercussions to individuals. This research is, by law, exempt from requiring pre-review by an IRB. Other similar types of projects are also exempt.
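The grade-records scenario above can be sketched in a few lines of code. This is purely an illustration (the records and field names are hypothetical): identifiers are stripped before any analysis, so only aggregate statistics ever leave the dataset.

```python
import statistics

# Hypothetical student records: identifiable data (name + GPA)
records = [
    {"name": "Alice", "gpa": 3.8},
    {"name": "Bob", "gpa": 3.1},
    {"name": "Carol", "gpa": 3.5},
]

# Strip identifiers before analysis; keep only the values needed
# for the aggregate question being asked.
anonymized = [r["gpa"] for r in records]

# Only distribution-level statistics are computed or reported;
# nothing here can be traced back to an individual student.
mean_gpa = statistics.mean(anonymized)
stdev_gpa = statistics.stdev(anonymized)

print(round(mean_gpa, 2))  # prints 3.47
```

The point of the exemption logic is visible in the shape of the code: once the `name` field is discarded, the downstream analysis operates on data that cannot "come back to any individual person."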

What the investigators and the IRB are telling you, in the quote above, is that the 23andme project in question was using aggregated data already collected for a different purpose, without individual subject identifiers, and thus did not require, and would not normally be given, IRB review under the IRB regulations - not just that it would have passed such review, but that it would not have needed or gotten IRB review at all. That is in fact what the federal regulations in this area say, and it seems likely, from the description, that this project was in fact exempt under those regulations.

I agree with you on the need for formal ethics procedures for privately-operated projects. But it appears to me that this project did in fact comply with the relevant regulations (namely, that it was exempt from them). There is one issue: in an institutional environment, the IRB must still certify that the project is in fact exempt from full IRB review; in this case, there was no such mechanism, and arguably the investigators should have sought an exemption declaration from an independent IRB. But as to the lack of an actual IRB review and continued monitoring, it appears that they were not in violation of the (analogically) relevant regulations, because those procedures were not in fact required for this project under those regulations.

Or think about the case in which you discover that you (or your child) have a genetic disease that has a high likelihood of being fatal at a young age, and suddenly you find yourself being discriminated against because of these genetic abnormalities.

These discovery issues are important, non-trivial, and dangerous.

I have a friend who can't get a visa to a country because he had a disease that is currently in remission, but might flare up again. He and his wife may have to forgo a great job offer because of this. Imagine that you didn't know you had that disease and now your employer can find it out.

Tuskegee sounds like a long time ago (it wasn't), but I still remember when people donating blood discovered they had HIV and lost their jobs (or were kicked out of school) for it.

The reason that these IRBs exist is to ensure that these issues are (a) kept private and (b) dealt with appropriately. Even today, fMRI experiments have to have a procedure to handle the situation when they find an abnormality in the scan. I wouldn't want any of these private companies to provide medical advice to a patient.

I am not at all questioning (in this post) whether the decision of any IRB is in fact the correct one. I am harping on this part.

arguably the investigators should have sought an exemption declaration from an independent IRB.

This is exactly what I am arguing. They should have sought the ruling in advance of their study. Post-hoc review that decides they didn't need a protocol is not different from post-hoc review that decides that they did (and therefore that they are in big trouble). IMO.

I'm not sure if drugmonkey's critique of PLoS Genetics is valid, but I have other issues. PLoS is correct that, once data is fully anonymized, it is no longer legally considered human subjects research. The problem is whether the data were originally collected in a legal manner. For example, you can't anonymize the data from the Tuskegee studies and say it's now ethical to use that data.

My bigger issue with PLoS Genetics is that it essentially decided to become a free IRB for 23andme. These are not decisions a journal should make (and perhaps was unqualified to legally make). They should have given a list of questions back to 23andme and either have their IRB fully answer them or strongly suggest acceptable IRBs for 23andme to use if they wanted to publish in the journal. Since 23andme was clearly in an ethical gray area, the onus to prove they conducted an ethical consenting process was on them. I'll add that the mere act of having a consenting process, but not having that consenting process go through an IRB, should have been a huge red flag.

The other big issue I have is that, as they state, "PLoS Genetics' Editor-in-Chief Gregory S. Barsh is a potential consultant to 23andMe. PLoS co-founder Michael B. Eisen is a member of the 23andMe Scientific Advisory Board." These people were removed from direct decision making here, but was that enough? Would a journal without these conflicts have formed a free within-journal IRB in the first place? Why couldn't they just say, "This research is good enough to be published elsewhere. Let's have a journal without any of these conflicts review it instead"?

@DM
Yes, the old "but what about the relatives?" explanation. The problem with that argument is the genome isn't really different from any other piece of personal info in that regard. If you are/were a member of the Communist Party (or even just certain left-wing groups) and publicly admit it, you might hurt the chances of your relatives to get jobs requiring US security clearances (still, even 20 years after the Soviet Union imploded). If you had to go through life worrying about every hypothetical consequence to your relatives, you'd never be able to do anything.
@qaz
I don't really buy the "but your employer/potential new country can find out about a disease you have" answer either. If the disease is a serious public health concern, then these people *should* know and the loss of the carrier's opportunities is an acceptable cost. If it is just prejudice (as in the HIV scare, because it was really just homophobia in a time when teh gayz=HIV in popular culture), then the anger should be applied to the prejudice rather than to the knowledge.

I also have to wonder if the concern that genetic counselors have about things like 23andMe, or yours about the fMRI companies, is really about the patients, and not because these people fear they might be going the way of the travel agent after Expedia and Travelocity.

"We work within these ethical "constraints" ("red tape" as some wag on the Twitts put it) for good reasons and we should fully accept and adopt them. "

I have a real problem with this statement. As an animal user, I find animal protocols to be patently absurd. They ask me to predict what experiments I am going to be doing for the next 3 years, and to justify my requested animal number by providing power analysis and what not. But I cannot honestly predict what I am going to be doing 2, 3 years from now. So, in effect, I am being asked to lie, and if I don't lie, I don't get approval and can't do my job.

I don't have a problem at all with there being oversight of both human and animal subjects research. However, the form it takes is patently absurd (at least for animal studies; I don't know about human). The Law makes IACUCs ask very stupid and useless questions. Don't expect anyone to "fully accept" being forced to lie in order to do one's job.

"I don't really buy the "but your employer/potential new country can find out about a disease you have" answer either. If the disease is a serious public health concern, then these people *should* know and the loss of the carrier's opportunities is an acceptable cost."

ask me to predict what experiments I am going to be doing for the next 3 years,

Most IACUCs have an amendment process. Doesn't yours?

The Law makes IACUCs ask very stupid and useless questions.

Individual IACUCs' interpretations of what they are supposed to be doing vary tremendously. Over time as well. I have existed under very reasonable protocol parameters that are consistent with research that cannot be fully specified in advance... and I have fought with IACUCs that are ridiculously stringent to the point of actively impeding research. You need to be a bit careful about the differences between the Law (the AWA), Regulation (USDA; has almost the force of law but is not law), AAALAC approval (not obligatory but highly desired by the Institution) and the local IACUC's interpretation of what they need to do in accommodating their interpretation of all of the above.

I agree with you that when the process is so ridiculously divorced from the actual conduct of real science that it makes investigators lie and hide what they are doing... on a consistent basis... this is a broken IACUC. I have heard from many colleagues about such situations eventually being resolved even if they persisted for a little bit of time. I don't want to call it self-correcting exactly but the pendulum does swing a bit.

I am not sure that variability from IACUC to IACUC or from protocol to protocol means that the entire structure is invalid though. I might do some fixes, more in the way of national rules and consistency, but I think the overall approach is a good one. Above all else it is critical that the individual investigator does not get to make the calls for him or herself.

IACUCs that I have dealt with do require a three-year plan, but they also require annual updates to indicate changes to the plan. So they ask you to project three years out ab initio, but they acknowledge that plans change, and thus expect annual updates.

DM, have to go on the side of the "haters" here. (I *know* no one is really hating, stop being so sensitive, internet!) You are demanding an extra layer of ethical clearance beyond that which is demanded by law. Since it is an ethical argument you are making, it is an inherently valid argument to make: it would have been "better" if 23andMe had sought IRB approval. Sure. Fine. But...

I have sat on IRBs of one kind or another for several years, and my 2 cents is that today these orgs are seriously hindering medical research. Yes, they are there to prevent egregious lapses, and they do that, god bless them. But obstruction and delay are their only tools and their only goal. My beef is that they are not organized to facilitate ethical med research. They are there to delay, slow and stop unethical research, and in so doing, they create substantial hurdles to the ethical acquisition of medical knowledge.

At our institution, no one blows their nose without asking if it is IRB approved. You may think that a good thing. I have had a clinical trial of mine delayed for months and months over a trivial paperwork issue. I have spent months obtaining a letter from our IRB stating that what I proposed in fact did not require IRB approval. The delays and intimidation have become a serious issue. A simple concept for me that is alien to IRBs is that in some circumstances, *not* performing a research study can be unethical. When IRBs slow research into diseases that are killing people, they themselves can be unethical, IMO.

So, that's why I'm a fan of reforming IRBs to emphasize the facilitation of ethical research.

Of course. So, you are suggesting that I propose 3-6 months' worth of experiments, then submit an amendment? And then do it again 3-6 months later? Or sooner if we suddenly hit on a cool experiment that mostly falls within the methods of our existing protocol, but isn't explicitly described (and N justified)? If you think that's a tenable way of running a lab, then you are out of your mind. Yet it would be the only way to be 100% honest.

As for variability in IACUCs, I've seen several, and all of them require that protocols describe the experimental groups and justify the N for each experiment. I'm not sure if the ubiquity (IME) of this requirement is due to some regulation (and I don't CARE if it comes from NIH, USDA, AAALAC, or the City Department of Annoyance - we have to follow the rules no matter where they originate) or not. I suspect this particular requirement is very common. But because predicting the exact experiments far in advance is impossible, this section is probably fiction in almost every protocol, everywhere.

Ironic, isn't it, that honesty is probably the one value that scientists consider most important (just look at all the outcry over data fabrication, retractions, etc), yet the structure of these ridiculous regulations forces us to lie at least once every 3 years.

Grumble - Remember, IACUC is primarily about technique. (Yes, question plays a role, but general is usually good enough for the question. [General is not good enough for technique!]) My IACUC asks me to provide one protocol to cover a realm of research. So, I have one protocol for all of my bunny hopping experiments, it details all of the things I might do to measure bunny hopping. And for my question, I have something general like "manipulations of bunny hopping should change the movement of rabbits through their environments". For technique, my IACUC asks us to provide ranges. "We will use between 1 and 10 mg/kg of bunny hopping chemical to slow/speed up bunny hopping under different conditions." As long as my experiment falls within that range, I'm covered. The key, I've learned from my IACUC, is to be very broad and very general. Now that I have several very different projects (with very very different techniques) running in my lab, I need a couple of IACUC protocols to cover them, but for years, I was able to get away with a single IACUC protocol to cover dozens of experiments.

That's exactly what I do, qaz. And probably everyone else. My point is that this latitude to change things is not explicitly stated in any regulation. As I understand it, the regulations say that investigators have to justify animal use and number. That is why my protocol form asks for a complete description of the experimental groups, the N in each group, and a justification for that N. Does your form ask for that? Then even if the IACUC lets you get away with listing one set of experiments and doing another, you are technically violating the rules just like I am.

My IACUC asks me to list all the possible N for a range of experiments I could imagine. So I have permission to do 10x the number of animals I actually am likely to do. If you phrase your experimental definitions carefully, you can define a bunch of experiments in the same way. (At least I can for my research questions.) So technically, I'm not changing things. Just not being as specific as I might.

Anyone can walk into any hardware store, pick up a can of rodenticide and use it however they like against mice or rats. Indeed, our animal facility uses rodenticides to keep the wildlife out. However, if I wanted to perform research on said rodenticides, I'd be buried knee deep in IACUC paperwork, especially if I proposed to use death as the endpoint. If only we showed as much concern for helpless and homeless humans.

I find it surprising a scientist would make this argument. Yes, anyone can kill rodents on their own personal time and at their own personal expense. Do those conditions apply to your research?

@poke. What's so special about lab mice? My annual per diem for one cage of mice could pay to immunize several hundred kids in the third world. I've just been informed that CO2 asphyxiation will no longer be allowed for mice at my facility because research shows they may suffer for up to 15 seconds during this procedure. Do I give a rat's arse about the last 15 seconds of a mouse's life? No I don't.

bacillus, I find it hard to believe that the paperwork justifying research into rodenticides is any more onerous than that required to justify any other kind of research. Indeed, if you can't justify it, you shouldn't do it.

Your attitude towards animal suffering is unethical and I find it repulsive. Not only that, but it gives scientists a bad name (to the extent that I wonder whether you are some kind of animal rights troll). It is your JOB to care about your mice's suffering, and to expose them to only the level of pain required to accomplish your research goals. There are easy alternatives to CO2 asphyxiation that do not cause pain, so what is wrong with the IACUC insisting that you use them?

@ Grumble. I've been performing animal experiments (i.e. hands on) for more than 30 years. I've never deliberately harmed an animal in all that time, and I have changed my habits in accordance with the changing times. However, compared to some of the "suffering" that is allowed on animal use protocols, 15 seconds of additional pain at the end is nothing to get worked up about. Our facility insists that we now anaesthetise the mice, then asphyxiate them, then break their necks just to be sure they are dead. Done properly, the latter is by far the most humane way to kill mice, but I was forced to switch from this to CO2 several years ago because some other people were too cack-handed at it. Unfortunately, blame one blame all seems to be the prevailing philosophy behind all regulatory measures these days.

Some days it takes a team of three of us the entire day to necropsy and process large numbers of mice (>100). This level of overkill will only add to that burden. The biggest cause of unnecessary "suffering" in laboratory animals is the terrible science using them that is published on a daily basis. Don't get me started on the huge numbers of female mice that are culled from breeding colonies because everyone wants to use males. I know this is out of sight out of mind for some people, but it exists nevertheless. I think you lack perspective on this matter, but perhaps you are fortunate to only perform minimally invasive or behavioural studies. I'm off to the pub now to unwind, I suggest you follow suit by whatever means tickles your fancy.

@Grumble. Unfortunately, it's not as simple as you think. I work in a BSL3 lab that lacks a fume hood or the space for one, so inhalant anaesthesia is not an option. Instead, I'd have to give systemic anaesthetic IP to 100+ mice under a biohood wearing full BSL3 kit including a full face respirator. So yes it will add at least an hour working in what is already a stressful environment, but I guess that's okay as long as the mice don't suffer for 15 seconds. Do you do your own animal work? If not how do you know what's being done in your name?
