Monday, June 21, 2010

The latest newsy thing from TheScientist.com just landed in my inbox. Today's top story is titled "New impact factors yield surprises." In the short piece the reporter notes how the impact factor (IF) for Acta Crystallographica - Section A has jumped 20-fold since last year and is now the second highest for science journals (edging out NEJM). This is apparently all due to a single 2008 article, chronicling the development of the SHELX crystallography computer program suite, which garnered some 6,600 citations.

Okay, so this is an extreme example of how IFs can be manipulated via review-type publications, but still...

On the other hand, if you're currently on the job market and have a publication in Acta Crystallographica - Section A, you just might want to make note of the current IF in your CV.

Why are there only seven hotdogs in a Hebrew National packet?* Packets of hotdog buns contain eight buns. I'd have to buy eight packets of Hebrew National hotdogs and seven packets of buns to even things out. Fifty-six hotdogs!

But then I'd need fifty-six beers - one per hotdog - to go along with that. Beer comes in six- or twelve-packs, or cases of twenty-four... So I'd need to buy twenty-four packets of Hebrew National hotdogs, twenty-one packets of buns and seven cases (or twenty-eight six-packs) of beer. Now we're up to 168 hotdogs.

But what about the paper plates? Do they come in packs of 168? Or six? Or seven? Eight? Twenty-four? And how about napkins?
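For the record, the cookout arithmetic above is just a least-common-multiple problem. A quick Python sketch (pack sizes as given in the post) confirms the numbers:

```python
from math import gcd
from functools import reduce

def lcm(*ns):
    """Least common multiple of any number of positive integers."""
    return reduce(lambda a, b: a * b // gcd(a, b), ns)

HOTDOGS_PER_PACKET = 7   # Hebrew National
BUNS_PER_PACKET = 8
BEERS_PER_CASE = 24

# Smallest cookout where nothing is left over:
n = lcm(HOTDOGS_PER_PACKET, BUNS_PER_PACKET, BEERS_PER_CASE)
print(n)                          # 168 hotdogs, buns, and beers
print(n // HOTDOGS_PER_PACKET)    # 24 packets of hotdogs
print(n // BUNS_PER_PACKET)       # 21 packets of buns
print(n // BEERS_PER_CASE)        # 7 cases of beer
```

Add paper plates in packs of anything that doesn't divide 168 and the whole thing scales up again. Hence: steak.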

...

Steak anyone?

* According to that repository of all things true, Google, seven is a lucky number in Judaism. Hence seven hotdogs.

Something mentioned in that forum and in a comment at DrugMonkey's piqued my interest. QoQ over at DM's asserted that the NSF is indeed broken and noted in support:

First, the identity of the reviewers is not public and changes from submission to submission -- so you can't target a grant.

You can't target a grant.* I'm not entirely sure what QoQ means by this, but I suspect they want to write their proposals for specific reviewers on the panel. Perhaps so they can try to "butter up" the reviewers by citing their work favorably and often, or to avoid having to write a proposal in language general enough that reviewers who aren't experts in the sub-sub-field can understand it. Or maybe even both. Or neither.

It doesn't matter really, because targeting reviewers on the panel is WRONG, WRONG, WRONG!!!!!!!

Why?

'Cos there ain't no guarantee that the reviewers you are targeting will get your proposal.

In fact the odds are not in the least bit favorable. Review panels at the NSF (and study sections at the NIH) cannot have experts from every sub-sub-field on them, not unless you want either panels with many dozens of members, or many, many more panels than currently exist (and there are already a lot). So the odds are that at most one reviewer will be an expert in your particular sub-sub-field (and that person is likely a competitor...).

Targeting panel members would be a particularly stupid thing to attempt (if you could) at the NSF, where you could have anywhere from three to ten people reviewing your proposal. Usually three or so on the panel, and the rest as outside, "mail-in" reviewers. Even if there is an expert on the panel you could target, one good review isn't even close to being enough to land funding. And let's be realistic - a panel member might be somewhat flattered by some "buttering up" in a proposal, but they're generally smart enough to recognize it for what it is.

Write your proposals for people only somewhat conversant with your corner of the field (and cite all the relevant literature, and none of the irrelevant). If you can't do that, you won't stand a chance of being funded.

* Actually, the NSF does let you do a form of targeting that a proposer would be foolish not to take advantage of. When you submit your proposal you are given the chance to suggest reviewers. In my experience NSF POs do actually use some of these suggestions as outside reviewers. Obviously these need to be reasonable suggestions...

Friday, June 11, 2010

I'm on the editorial board of a journal in my field, and I am often assigned manuscripts as managing editor. This means finding reviewers. Of late I've noticed a disturbing trend (ANECDOTE ALERT!!!). People I ask are taking an unreasonably long time to decide whether or not they will review a manuscript. Days. A week even. If this were just one or two people you could explain it away easily enough. They're traveling, for example. But it's not one or two. It's approaching 30-40%. Given that I'm managing two to four manuscripts at any given moment, that's a lot. And when they eventually get back to me (those that do...), they invariably say no, they can't review the manuscript.

Why are you taking so long? Read the abstract (which we send in the email), think about what else you have to get done in the next couple of weeks, and decide whether or not this is a review you want to do. Then get back to me by reply email. Not a hard process. The longer you take to decline the invitation, the longer the whole process takes. Is that what you want to happen with your manuscripts? Didn't think so. So if you're going to decline, get off your rear end and say no quickly.