30 June 2014

It’s weird how social media works sometimes. This appeared in two of my social media feeds this morning, Facebook and Quora:

Seeing this in two places simultaneously, I assumed this was news. I Fucking Love Science links out to a longer article from November 2013. So right away, something that seemed to be new was not. As far as I can tell, I Fucking Love Science posted the picture yesterday / today, so they are responsible for the news necromancy.

Then, the I Fucking Love Science article reveals that this is based on research presented at a conference. Conferences are great, but the research presented at conferences is rarely peer reviewed. So people make all sorts of claims at conferences that may – or may not – withstand the deep scrutiny that extraordinary claims get. This is why publishing in peer reviewed journals or depositing data is valuable.

Similarly, the data presented at conferences tends to be work in progress. Those preliminary analyses may not pan out. I’ve presented a lot of stuff at conferences that I wasn’t able to convert into papers.

Worse, people who were not at the conference don’t have any way to evaluate the claims.

That we haven’t heard anything on this since November of last year suggests that the evidence might be a bit preliminary.

The I Fucking Love Science article doesn’t do us any favours by not even mentioning the authors of this research. One link out (misleadingly linked to text saying “Royal Society”) leads to a Nature News piece. Based on that, part of the team seems to be David Reich. He posts PDFs of his papers, and none seem to list anything about the unknown human.

But I Fucking Love Science presents this conference research as an established fact on the (very simple and easily shared) image, with no link or date or context on the image. This can make the rounds on social media for a long time, even if it never makes it into a proper scientific journal.

There is a lot to learn from the successful formula of I Fucking Love Science. Pictures get shared; see the data from Google Plus below:

People interested in spreading their science shouldn’t just work on their sound bites. They should work on their social media meme images.

Incidentally, this claim might not be that wild. We already have a candidate for that “third, unknown species”: the lineage that includes Homo floresiensis, a.k.a. “the hobbits.” They are a third human species. But we don’t have any DNA sequenced from H. floresiensis, so if there was interbreeding between them and modern humans, we wouldn’t have a genetic signature to compare it to.

25 June 2014

Instead, some people say that we should call these “sea stars” and “sea jellies.” Because (the logic goes) they’re not really fish.

This annoys me. It’s an attempt to make common English words conform to scientific taxonomy, which seems pointless to me. Avoiding that kind of imprecision is the entire reason we have a scientific naming system and give species Latin names. Linnaeus realized that vernacular names were far too imprecise, so he sidestepped the whole issue by creating the binomial naming system.

It certainly is not like “fish” is the only noun that gets applied to several unrelated taxa. “Lobster” can mean clawed lobsters, slipper lobsters, or squat lobsters, which are all in different families. “Crab” and “shrimp” are just as messed up.

This is a dour attempt to achieve linguistic purity. But it’s a lost cause. As writer Ammon Shea (author of Bad English) noted in a recent interview:

By the time you have noticed a usage being misused, it is far too late for you to do anything to stop it.

In other words, nobody says we shouldn’t call this a crayfish.

What would we call it instead? A “lake cray”? A “river cray”? Ick. That would be just cray cray.

24 June 2014

Retraction Watch reports that a paper by Séralini and colleagues, which was published, criticized roundly multiple times, then retracted by the editor, has found a new home in a new journal. As with the first round, there are extremely weird publicity practices in play here, reported at Embargo Watch.

16 June 2014

Lately, I’ve been wasting time playing around with science outreach to engage the public on Quora. Quora is a question and answer site. You can post a question, and people may (or may not) answer it. I’ve been a member for a few years, but haven’t used it much until recently.

I follow a few topics, particularly scientific ones; evolutionary biology and neuroscience are the main ones that pop up in my feed. And it’s been fascinating to see certain kinds of trends in the questions that appear.

In “Evolutionary biology”...

A lot of questions are really questions about religion or creationism.

People think every feature of organisms must be adaptive, including human thinking.

In “Neuroscience”...

A lot of questions are really questions about religion or souls.

People are obsessed with the idea of downloading consciousness and machine / brain interfaces.

I don’t know how much the questions that appear are due to some algorithm that’s determined what kinds of questions I click on.
It reminds me of Dr. Karl’s call-in show on Triple J: certain questions come up over and over. And some are topics that you just wouldn’t think so many people would expend mental effort wondering about.

I think it’s worthwhile for researchers to have some sort of site where they can hear people’s random questions. It gives insight into what common conceptions (and misconceptions) are out there, and how sophisticated people’s understanding of scientific concepts is.

12 June 2014

If we diminish reliance on GRE and instead augment current admissions practices with proven markers of achievement, such as grit and diligence, we will make our PhD programmes more inclusive and will more efficiently identify applicants with potential for long-term success as researchers.

I am not sure how you can get a sense of those.

Typically, faculty want grad students who have performed at consistently high academic levels in their undergraduate career. This means a high grade point average (GPA). As I’ve noted before, these students are often so bright that they have never struggled academically. I’ve heard of many students who hit post-graduate study, in either the sciences or the health professions, and struggle because they are getting something other than As for the first time ever.

How can you assess persistence in those high-performing individuals? I’m not sure there is a clear way to do this.

Some might try to assess persistence by stressing applicants. Some job interviews do this with “stress tests.” I tend to agree with this piece: they’re worse than useless.

People who behave like that are either naturally jerks, or they’re
“manufactured” jerks who behave that way because someone told them it
was a cool way to interview people, by abusing them. None of it is
acceptable.

The Nature Jobs article talks about extensive interviews. Personal interviews have their own bizarre quirks that may also introduce systematic biases. It is easy to talk about screening students for their grad school potential based on their personalities, but I suspect it is rather difficult to implement in practice. This is not to say we should just give in and use the GRE, mind you.

I’m also puzzled by the continued references in the Nature Jobs article to the 800 point scale, which the GRE hasn’t had for a couple of years now.

11 June 2014

Inexperienced scientific writers, even ones who are quite good at composition and structuring sentences, often give themselves away in their citations.

New scientific writers often shorten lists of three or more authors by writing “et al”, with no period at the end.

This is a newbie mistake. “et al.” is short for the Latin phrase, “et alia”, which translates as “and others.” It’s an abbreviation, so it always has a period at the end. Also, neither word is capitalized.

People get this wrong so often that I have sometimes thought we should replace the obscure “et al.” with an English phrase (“and others”, “and colleagues”) or symbol (“&o”?).

In biology, another newbie mistake is not italicizing species names, or capitalizing the second word in a species name.

Any other newbie mistakes in other fields? Leave your examples in the comments!
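Both of the biology mistakes above are mechanical enough to lint for in a plain-text draft. Here is a minimal sketch (the function names and regular expression are my own, not any standard tool; italics, of course, can’t be checked in plain text):

```python
import re

# "et al" not followed by its period is the mistake described above.
# The \b after "al" keeps the pattern from matching the spelled-out
# Latin phrase "et alia".
MISSING_PERIOD = re.compile(r"\bet al\b(?!\.)")

def check_et_al(text):
    """Return True if every 'et al' in the text has its period."""
    return MISSING_PERIOD.search(text) is None

def check_species_name(name):
    """A binomial should be 'Genus species': first word capitalized,
    second word entirely lower case."""
    parts = name.split()
    return (len(parts) == 2
            and parts[0][:1].isupper()
            and parts[1].islower())
```

For example, “Smith et al. (1994)” passes while “Smith et al (1994)” gets flagged, and “Lepidopa benedicti” passes while “Lepidopa Benedicti” gets flagged.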

The following is the slated list of papers, in the order they should appear in the journal. They should all be out in August in volume 54, issue number 2, if all goes well. I’ll update this to include DOI links for the other papers as they arrive.

Weinersmith K, Faulkes Z. Parasitic manipulation of hosts’ phenotype, or how to make a zombie—an introduction to the symposium. Integrative & Comparative Biology: in press. http://dx.doi.org/10.1093/icb/icu028

06 June 2014

I will dress in bright and cheery colors, and so throw my enemies into confusion.

In other words, if you are an Evil Overlord, you don’t need to act the part so obviously.

The academic publisher Taylor & Francis would do well to heed this lesson. If you want to talk about a self-fulfilling prophecy, read this article. We learn that a journal article critical of the business practices of academic publishers was held up by, wait for it, the academic publisher of the journal in question.

(T)he non-appearance of the journal in September was followed, two months later, by a letter from a senior manager at Taylor & Francis demanding that more than half of the proposition article be cut. ... (W)hen the edition was finally published, Taylor & Francis unilaterally added a long disclaimer to each article warning that “the accuracy of the content should not be relied upon”.

It is rare to have such obvious meddling over such an obvious conflict of interest.

Despite my joking about publishers as Evil Overlords, this isn’t just about academic publishers. This is about the conflicts of interest that can arise any time you have gatekeepers. Open access publishing in and of itself wouldn’t solve this problem. (The article in question is open access.) Nor would having journals published by scientific societies solve this problem. An editor could be just as difficult and unreasonable as a publisher. So can reviewers.

Update, 24 June 2014: Taylor & Francis have apologized for their behaviour... albeit in a weak way:

Professor Macdonald said it had taken “hours” to agree a version of the letter with which he was satisfied and which avoided words the publisher was unwilling to use, such as “sorry”, “mistake” or “censorship”.

I’m not sure if it counts as an apology if you’re not willing to say “sorry.” C’mon, Taylor & Francis. Say, “Sorry.” It won’t kill you.

The article critical of publishers has become the most read thing in the journal Prometheus ever.

A point of pride with me is that I have published papers on many different species. When last I checked, four years ago, I had published original data on thirteen different species. It’s time for an update! This time, I’ve tried to list them more by taxonomy than by chronology of my papers (that is, crustaceans together). And I’ve still excluded an ascidian survey paper.

04 June 2014

Long ago, in a distant land, I, Zen, a sand crab biologist, unleashed an NSF grant for undergraduate research upon my department. And students applied to this.

In the third REU cohort, Jessica Murph (pictured) ended up working with me. Jessica was looking for a project that involved field work. She asked just at the right time, because I’d been digging up the local sand crab species (Lepidopa benedicti) for a while. I realized how little we knew about its basic biology. People would ask me things like, “How long do they live? What do they eat?” And I’d say, “No idea.”

Because I wasn’t trained as an ecologist, I couldn’t quite figure out how to go about studying the basics of a little animal that was completely concealed in the sand. Ultimately, my former colleague Anita Davelos Baines gave me the idea of digging transects. As long as we did it consistently, we might be able to get some meaningful information.

I asked Jessica to start collecting sand crabs for her REU project. She did, and between her efforts and mine, we got a year’s worth of data, which took her out to the end of her time with the REU program.

I compiled all her data, and was getting ready to write it up and submit it... but I balked. It’s that gut instinct you have to develop as a researcher about whether your own papers are ready. I knew that if I was reviewing this paper, I would say, “One year of data is not enough.” So I continued making monthly trips and collecting sand crabs myself for another year.

Then, I submitted the manuscript to what I thought would be an appropriate journal. This was regional natural history kind of stuff, and fortunately, there is exactly a regional journal dedicated to just such research: The Southwestern Naturalist.

When I mentioned this to one of my colleagues, he groaned a little and said, “They take forever.” I sort of shrugged and said, “I’m not in a hurry.” It’s not as though I am worried about getting scooped on this project: ecology of an obscure species of digging crustacean? Not exactly a “hot” research field. But in retrospect, I wish I had checked the “submitted” and “accepted” dates in a recent issue more closely.

I submitted the paper on 22 August 2011. No, that is not a typo.

When I finally got the page proofs, it chafed my chaps to see the “submitted” date at the end of the paper: “26 September 2011.” Apparently, nobody even looked at this manuscript for over a month.

I got my reviews back on 24 May 2012. That’s nine months there. I sent back the revision one week later, and I hadn’t thought the revisions were that major.

Time passes. Sound of crickets chirping.

My patience ran out around the ten month mark (we’re at February 2013 now). I emailed the associate editor about the status of my manuscript. He emailed me back to say:

I am no longer an editor for the Southwestern Naturalist.

I was gobsmacked. It seemed to me that a logical thing to do when an editor leaves would be to email the authors for correspondence to say, “This person is leaving; please direct your inquiries on your manuscript to this editor.”

I finally heard back that the article had been recommended for publication, and should appear in the December 2013 issue of the journal. This made me happy, because I thought it would be out by the end of the year, and I hadn’t had a lot of papers out in 2013.

In early November, I got an email saying that I should get page proofs for my article in January... which is already after December, dashing the hope that the article would be out before the end of the year.

With all that’s happened so far, I suppose I should not have been surprised that, having been told to expect proofs in January, I actually got them in mid-March. I sent them back quickly, in the hope that the December 2013 issue might be out around the end of March 2014.

In the waiting time between when the paper was supposed to be published and when it actually was published alone, I submitted articles to three other journals, had them accepted, and proofed. I’m not trying to boast here, just contrasting my experience with this journal with others.

The paper has finally seen the light, and it has taken substantially longer to get this article through the editorial and production process (2.75 years) than it did to collect the data and write the paper (two years). I should have seen this coming, because the journal does list the initial submission and acceptance dates. But even those dates underestimate the publication time. My “received” date was a month after I submitted, and the publication date on the cover of the issue, December 2013, was months before the issue actually hit the web.

And, by the way, did I mention that there was a $320 page charge for all of this? And it’s a paywalled subscription journal, with no open access options?

Speaking of costs, I got an email on 19 May 2014 allowing me to order reprints. The paper still wasn’t available on the journal website, but I thought it was at least a promising sign that it hadn’t been forgotten. I clicked on the link, and was stunned to see this part of the ordering form:

They wanted $69.68 from me so that I could have a PDF of my own paper. I had never had a journal want to charge me for a PDF before.

I was also stunned to see that a copy on CD would be more than another $10, when the cost of a blank CD from Staples can be as little as 18 cents. It feels like a money grab. And I can’t remember the last PDF I needed or wanted in CD format. And why would you need a DVD (another $13) for a single journal article PDF?

The paper was finally available online on 4 June 2014. That’s 1,017 days – or two years, nine months, and thirteen days from submission to publication.
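That elapsed time is simple to verify with Python’s datetime module, using the two dates given above:

```python
from datetime import date

submitted = date(2011, 8, 22)        # manuscript submitted
published_online = date(2014, 6, 4)  # paper finally appeared online

# Subtracting two dates gives a timedelta; .days is the total day count.
print((published_online - submitted).days)  # prints 1017
```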

This whole process has soured me on submitting to some of these niche journals. That is a shame.

I submitted this article to The Southwestern Naturalist because it seemed to be the most appropriate journal for the paper. I see value in topical journals; it makes things easier to find. I think many researchers have certain journals they check regularly. But after this experience, I think I would have been much better off submitting this paper to PLOS ONE or PeerJ or a similar venue.

Except... wait, PeerJ didn’t exist when I submitted this paper. With publications like PeerJ, journals like The Southwestern Naturalist are going to be in trouble soon.

I feel bad for the staff at The Southwestern Naturalist. They clearly don’t have enough resources. I don’t know whether they don’t have enough people, enough pages, or something else, but these time frames are not signs of a healthy scientific journal. And, as I said, I do think journals like this can serve an important purpose! I want to support the journal. I do.

I’m pleased that the paper is out. Because of this paper, my co-author, Jessica, is more excited and positive looking back on her research experience now than she was when she was doing the work, I think. It gave me a chance to add another UTPA success story to the REU Bio website. And it is the first to provide any sort of basic ecology for any species in this family. It’s a cute little paper. I love it for what it is: a little natural history on organisms that only a few people in the world care about.

But it’ll be a long time before I think more about the science in the paper than the frustrations and delays in publishing the paper.

Additional, 11 June 2014: The printed copy of the journal arrived today, which is reasonably quickly following the online publication.

Additional, 28 April 2016: I got an email today from the president of the Southwestern Association of Naturalists:

On the publication front, [the editors] are well on their way to getting The Southwestern Naturalist back on track. It is anticipated that by the end of 2016, we will be back to our normal publication schedule of 4 issues a year mailed in March, June, September, and December. HOWEVER, we need your help. It seems that some authors have shied away from submitting manuscripts to The Southwestern Naturalist over the past year or so because it was behind in publication. This is no longer the case, so I encourage you to consider submitting to the SWAN and I hope you will pass that word that The Southwestern Naturalist is back!

02 June 2014

A new paper by van Dijk and colleagues claims to have figured out how to predict “success on the academic job market,” according to the title. Given how tight the academic job market is, you might expect this paper to get a lot of downloads.

But the paper’s title promises waffles and the paper delivers pancakes.

If you have a paper that claims to be able to predict success in the job market, you might expect that the authors have measured things like how many people in their pool get tenure-track job offers, how many are awarded tenure, and so on. They don’t measure employment at all.

This paper says it measures the probability of someone becoming a principal investigator. That term is mostly used in reference to grants, so you might think that they are looking through award databases to see which people have gotten grants. They don’t measure granting success, either.

What this paper measures is how people become last author on papers. There is an assumption here that the last author is the “boss” of the paper. This isn’t a bad assumption, but authorship is a complicated game, and people don’t always follow expected conventions for authorship (Shapiro et al. 1994).

van Dijk and company dug into PubMed, and tracked the publication records of many thousands of individual authors. They were careful to try to remove people who have similar names, so that each record is a single author. Using PubMed as a data source has its limitations, since PubMed’s purpose is to archive biomedical literature. It is an open question whether the patterns van Dijk and colleagues find would hold for other fields, like physics, social sciences, or even non-medical biology.

They find some very interesting patterns. The best predictor of quickly becoming last author on a series of papers is the Impact Factor of the journals you publish in early in your career. Frankly, this is extremely depressing news for those of us interested in breaking Impact Factor into a thousand pieces, because it shows that glam sticks.

There is hope, though: if you can’t publish in the glamour magazines, you can publish a lot. People could also make the transition to last author by publishing many papers, although the effect wasn’t as big as that of Impact Factor.

The authors also show that – to almost nobody’s surprise – it helps to be a guy. And it helps to have a degree from a fancy university. I’m still trying to work out if there are any confounds in their discussion of university rankings, however. The universities are partly ranked using the same publication metrics that they use elsewhere in the paper, so it’s not surprising that the university rankings are correlated with other features they include in their prediction algorithm.

If you are willing to accept the authors’ peculiar definition of “PI” as a proxy for “academic success,” there is a lot of interesting stuff in this paper.

Additional, 3 June 2014: If you want to play around with the authors’ prediction algorithm, they have a website called PI Predictor. It is a bit reassuring to know that the odds were in my favour.