Tuesday, August 12, 2008

Top 10 signs your reviewer suffers from bitchassness

Update: I've been away for the past couple of days, but I have recompiled the list with the fantastic comments that have been shared. (I have given credit to the appropriate person as the source but have interpreted their comments through my own crankiness!)

I've been cranky and unhappy this week and felt like I needed a little cathartic humor (apparently delving into what is wrong with math scores and whinging about my university really haven't been doing it for me). So I have started a top 10 list, but thought this might be more fun as a communal effort. Please share your top reviewer pet peeves to finish the list below. Here are my favorites.

10. Makes grandiose statements about how the research has already been done, sometimes providing strings of big-name scientists but never actually providing citations. After reading all the papers by said big names, you never do find any evidence that anyone, living or dead, has published a paper even remotely similar... ever.

9. A passage of the following spirit in a review has left you thinking you might actually be insane: "The authors clearly do not understand concept X, as evidenced by a series of statements I am about to make that actually have nothing to do with concept X but I will say authoritatively as if I actually understand concept X, which by definition means that the authors do not understand concept X."

8. Refers to unpublished papers as evidence that you have not adequately read the literature, making you realize that you apparently failed your graduate class in clairvoyance.

7. (Isis) Suggests a really huge and elaborate addition to your experiment that not only would end up being a completely new paper, but that you both know there is no way in hell you'd actually conduct anyway!

6. (anon) When the reviewer uses "conventional wisdom" as evidence against you, when there has never been a paper showing that the conventional wisdom is actually true.

- note from River Tam to reviewers who do this: I HATE that shit. Legend, lore, and mythology are NOT science, and often what we assume to be true actually is NOT! So stop giving legitimate science the kiss of death just because you think you've run the experiment in your head.

5. (Anon and DamnGoodTechnician) The reviewer tells you that you need to run additional experiments/analyses to bolster your claims when those experiments/analyses are not only in the paper already but constitute 3 out of your 5 figures!

4. (DamnGoodTechnician) The reviewer claims that he has already proven your findings wrong in print, but the paper is only vaguely related to yours and could be interpreted as actually supporting your findings.

- I would also like to add the variant where the reviewer's paper is a total piece of flawed and steaming crap, which you ignored because it was so truly awful that you were actually doing the reviewer a favor by ignoring its existence.

3. (Candid Engineer) When the reviewer's summary of your paper is so totally different from what you actually did that you have to wonder whether a) the reviewer sent the wrong review to the editor, or b) you should be contacting the editor to have medical help sent to said reviewer, because you are pretty sure they must have had a stroke.

When the reviewers don't look at your data and just say that all of us important scientists know conventional wisdom A is true, and any paper showing that A is not true is just wrong, wrong, evil, and a personal attack on everyone who backs the conventional wisdom that A is true.

(This happened to me most recently last year, when I was switching into a new area as a postdoc... and trying very hard not to step on toes. Not carefully enough, it seems!)

Suggests that you do a critical experiment to back up your field data, when that experiment is actually the center of the manuscript and the data are presented in 3 of the 5 figures. Seriously: if you don't have time to read it, don't write a review.

Anon #2 is right on the ball! The last paper from our group had the comment that "the authors show RepressiveComplex X at the promoter of this gene under active transcription conditions - this contradicts their gene expression data". Well, actually, if you read the rest of the paper (or hell, even panel C of the same figure), you'd see we go on to explain that RepressiveComplex X is actually ACTIVATING transcription, COUNTER to its canonical role. Wow! Isn't it fascinating how illuminating papers can be when you actually frigging read them?!?

I am also a big fan of the brilliant reviewer analysis of "The authors do not reference MyLab et al., 2001, where it was clearly demonstrated that the findings in this paper are bullpockey" when MyLab et al., 2001 demonstrates something only vaguely related to your paper and, if anything, confirms the frigging findings in your paper! Gah!

I have a good variation on 'didn't actually read the manuscript'. I once submitted a manuscript about the Slicing and Dicing of Grapefruit to the Journal of Grapefruit. Reviewer #1 came back with the comment, "This paper talks about the Peeling of Bananas. The Journal of Grapefruit is so awesome, it would never consider publishing on the low-impact topic of Bananas."

My favorite (because, in my limited experience, there always seems to be one of them for every single one of your papers) is the reviewer who has a pet method (usually long forgotten in favor of more modern, efficient, easy-to-implement ones in the community) and wants to force you to use it. Along the lines of "The authors used a perfectly adapted paring knife to cut this strawberry, but for comparison and better assessment of their results, it would be great if they could also use a dirty spoon, because I love dirty spoons and actually dirty spoons work better than paring knives when it comes to eating apple sauce so why would you even want to try and use a paring knife to cut a strawberry?"