Altmetrics Correlations: “We need to think about n-dimensional impact space” (from a slide presentation by Jason Priem)

“See the mind at work and see the mind in work…”(from REMIXTHEBOOK review)

“Until quite recently the complete justification for even the most complex scientific facts could be understood by a single person,” Michael Nielsen reminds us, before describing how science today has grown beyond any individual’s grasp. If one person can no longer understand the whole of science, then, perhaps, “a group of people” may “collectively claim to understand all the separate pieces that go into the discovery, and how those pieces fit together.”

How do we correlate the members of this group? Are some members closer to each other than others? How can they collectively claim what they collectively communicate, and what pieces hold them together? This is what the altmetrics community tries to explore, and what it teaches machines to seek. This is what “changes the game,” and the reason we came to play has recently been interpreted by Jason Priem, a PhD student whose work focuses on improving scholarly communication.

Communicating is Doing the Science

In a talk presented at Purdue University, Jason aims at deciphering what is crucial for the Second Scientific Revolution: altmetrics, rediscovering impact, and the decoupling of the journal. “Communicating is the doing of science,” he quotes Montgomery. The “conversation,” he continues, “is arguably the most important work we do in science,” adding that it is easier than ever to make it visible: we now have ways of publishing it and growing it in patterns, through social networks, reference managers, blogs, and social bookmarking systems.

The altmetrics group at Research Remix claims: “By analyzing patterns in what people are reading, bookmarking, sharing, discussing, and citing online we can figure out what kind – what flavor – of impact a research output is making.” Contrary to what we might expect, “the goal isn’t to compare flavors: one flavor isn’t objectively better than another. They each have to be appreciated on their own merits for the needs they meet.”

To Each His Flavor

Priem gives the example of Twitter alone accumulating over 58,000 citations from tweets to scholarly articles in a single month: one dataset listed 58k tweets that mentioned a scientific article (broadly speaking, anything with a DOI, PMID, or arXiv ID) between the 1st and 31st of July 2011.
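Building such a dataset starts with spotting article identifiers in tweet text. Here is a minimal sketch of that step; the regular expressions are deliberately simplified illustrations (real DOI, PMID, and arXiv matching needs more care), and the sample tweets are invented:

```python
import re

# Simplified patterns for the three identifier types Priem mentions.
# These are illustrative sketches, not production-grade validators.
DOI_RE = re.compile(r"\b10\.\d{4,9}/\S+\b")
ARXIV_RE = re.compile(r"\barXiv:\d{4}\.\d{4,5}\b", re.IGNORECASE)
PMID_RE = re.compile(r"\bPMID:?\s*(\d{6,8})\b", re.IGNORECASE)

def scholarly_ids(tweet: str) -> list[str]:
    """Return every article identifier found in one tweet's text."""
    ids = DOI_RE.findall(tweet)
    ids += ARXIV_RE.findall(tweet)
    ids += PMID_RE.findall(tweet)
    return ids

# Invented sample tweets for illustration.
tweets = [
    "Great read: 10.1371/journal.pone.0012345 on altmetrics",
    "New preprint arXiv:1105.3321 is out!",
    "Nothing scholarly here, just lunch.",
]
mentions = [t for t in tweets if scholarly_ids(t)]
print(len(mentions))  # tweets that mention a scholarly article
```

Filtering a month of tweets through a matcher like this, then counting the surviving mentions, is the shape of the 58k-tweet figure.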

Recently, an altmetrics bookmarklet was launched that you can easily add to your browser toolbar; at the push of a button it counts mentions on Twitter, Facebook, CiteULike, Connotea, and Mendeley for any article that, again, has a DOI, PMID, or arXiv ID.

Altmetrics Bookmarklet to add to your Toolbar

“The local genomics dataset has moved to an online repository, and now we can track it,” the Altmetrics Manifesto states. “Altmetrics are fast, using public APIs to gather data in days or weeks. They’re open – not just the data, but the scripts and algorithms that collect and interpret it. Altmetrics look beyond counting and emphasize semantic content like usernames, timestamps, and tags. Altmetrics aren’t citations, nor are they webometrics.”
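The manifesto’s point that altmetrics “look beyond counting” can be sketched concretely: once each mention carries a username, timestamp, and tags, you can measure velocity and breadth of attention, not just a total. The records below are invented toy data, assuming the field names a collection script might use:

```python
from collections import Counter
from datetime import date

# Toy mention records of the kind altmetrics scripts collect; the usernames,
# dates, and tags here are invented purely for illustration.
mentions = [
    {"user": "alice", "day": date(2011, 7, 1), "tags": ["genomics"]},
    {"user": "bob",   "day": date(2011, 7, 1), "tags": ["genomics", "oa"]},
    {"user": "alice", "day": date(2011, 7, 2), "tags": ["oa"]},
]

per_day = Counter(m["day"] for m in mentions)          # velocity: mentions per day
unique_users = {m["user"] for m in mentions}           # breadth of attention
tag_counts = Counter(t for m in mentions for t in m["tags"])  # semantic content

print(per_day[date(2011, 7, 1)], len(unique_users), tag_counts["oa"])
```

A raw count of three mentions hides everything this summary surfaces: two of them arrived on the same day, from two distinct people, with overlapping topical tags.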

“Now we can listen in,” the manifesto reminds us. “In growing numbers, scholars are moving their everyday work to the web. Online reference managers Zotero and Mendeley each claim to store over 40 million articles (making them substantially larger than PubMed); as many as a third of scholars are on Twitter, and a growing number tend scholarly blogs.”

Altmetrics are not limited to one type of person (academics), one kind of resource (the scholarly article), or one kind of use (citing in support of another scholarly article). And while, in the words of Jason Priem, “we started to confuse ‘the kind of use we can track’ with use, and ‘citation impact’ with impact,” we still struggle to mine the first scholarly web instead of mining “the next one.” Even worse, our current system is “the best scholarly communication system possible using 17th-century technology.”

Time to Listen In

With altmetrics, we can crowdsource peer-review; taken from Altmetrics Manifesto

Jennifer Howard writes for the Chronicle: “Peer review has served scholarship well but has become slow and unwieldy and rewards conventional thinking. Citation-counting measures such as the h-index take too long to accumulate. And the impact factor of journals gets misapplied as a way to assess an individual researcher’s performance, which it wasn’t designed to do.”

Priem explains how looking into the next web will be done from a different angle: “Altmetrics impact is mostly orthogonal to traditional citation impact. We need to think about an n-dimensional ‘impact space’, with articles that fall into clusters by type of use. Altmetrics can provide rich metadata, not just counts.”
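Priem’s “impact space” can be pictured with a toy example: treat each article as a point whose coordinates are counts along different impact dimensions, and label it by where its activity concentrates. The articles, dimensions, and numbers below are invented, and the dominant-dimension rule is only a crude stand-in for real clustering:

```python
# Each article is a point in a toy 3-D impact space; all values are invented
# for illustration and do not come from real altmetrics data.
articles = {
    "paper-A": {"tweets": 340, "mendeley_saves": 12,  "citations": 3},
    "paper-B": {"tweets": 8,   "mendeley_saves": 410, "citations": 25},
    "paper-C": {"tweets": 5,   "mendeley_saves": 30,  "citations": 190},
}

def flavor(impact: dict[str, int]) -> str:
    """Label an article by the dimension where its impact concentrates."""
    total = sum(impact.values())
    dim, count = max(impact.items(), key=lambda kv: kv[1])
    return f"{dim} ({count / total:.0%} of activity)"

for name, impact in articles.items():
    print(name, "->", flavor(impact))
```

In this picture paper-A’s impact is mostly public conversation, paper-B’s is mostly readership, and paper-C’s is mostly formal citation: three different flavors, none reducible to a single ranking, which is exactly the orthogonality Priem describes.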

Priem also presented some of the tools that are already available, bearing in mind that altmetrics are still in an early phase of development.

Priem concludes the first part: “Quantifying impact means we can teach machines what’s important. This changes the game, and the game needs to be changed.”

Decoupling Journals

The next thing Priem discusses is the decoupling of the journal. At this point, we may ask ourselves what a highly relevant research result today really is: a scholarly article, or rather a separate piece of information?

“The first journals went hand in hand with the Scientific Revolution and applied the most advanced technology available at the time to the problem of spreading scholarship,” Priem suggests. In fact, all scientific progress is limited by the best technology available; if this was once the printing press, today it is the technology of the World Wide Web.

As a consequence, the current system has problems: it is “slow, restrictive in format, closed, inconsistent and hard to innovate on”. The new system, however, will need to do everything the current system does, only better. It will need to connect all members through a “live CV” that shows altmetrics, and this, according to Priem, has already started; he gives several examples.

The most compelling of Priem’s conclusions is that “once we have alt-citation data, it’s too useful to ignore” and “alternative filters and even certification paths based on this data will open.” He even borrows a theater reference from Peter Winkler, applied to citation-graph data: “It is like Chekhov’s gun: once on stage, it has to be fired.” Finally, Priem suggests, “the new revolution will not be about homogeneity but about diversity of outputs, and this is easier to do on the web than between covers.”