BU has an agreement with Springer which enables its authors to publish articles open access in one of the Springer Open Choice journals at no additional cost.

There are hundreds of titles included in this agreement, among them Hydrobiologia, the European Journal of Nutrition, the Annals of Biomedical Engineering, Climatic Change, Marine Biology and the Journal of Business Ethics. A full list of the journals included can be found here.

To make sure that your article is covered by this new agreement, when your article has been accepted for publication, Springer will ask you to confirm the following:

On 4 September 2018, 11 national research funding organisations, with the support of the European Commission, including the European Research Council (ERC), announced the launch of cOAlition S, an initiative to make full and immediate Open Access to research publications a reality. It is built around Plan S, which consists of one target and 10 principles.

cOAlition S signals the commitment to implement, by 1 January 2020, the necessary measures to fulfil its main principle: “By 2020 scientific publications that result from research funded by public grants provided by participating national and European research councils and funding bodies, must be published in compliant Open Access Journals or on compliant Open Access Platforms.”

I often wonder if other scientists wake up every morning to delete a deluge of spam messages from no-name journals and questionable conferences. Sometimes one of these emails will escape my extermination efforts and I end up reading it by accident. The invitations from so-called “predatory” publishers are so transparently fake and poorly written that a part of me finds their annoying overtures oddly amusing.

I realize that predatory publishing and phishing emails are not laughing matters. There has been an explosion of predatory publishers trying to con scientists out of their money. For a fee, these journals or books are just frothing at the mouth to publish your work and collect your cash. Some may even invite you to serve on their “prestigious” editorial board, but this is just to lend an air of authenticity to their sham operation.

What separates a predatory publisher from a legitimate science publisher? Both charge you large sums of money for you to do all the work, but the latter employs a rigorous peer-review process that ensures the articles they publish have been properly vetted. Sting operations have revealed that predatory journals will publish absolute gibberish, proving they are phonies who just want to make fast cash. A recent sting operation involved the submission of a manuscript about midi-chlorians from Star Wars, written by Dr. Lucas McGeorge and Dr. Annette Kin. The fictional paper was accepted by four of these flimflam publications masquerading as legitimate scientific journals.

In an effort to help root out some of these predatory publishers, I’ve compiled some of my favorite lines from the suspect emails I receive on a daily basis. I hope this helps people spot dubious publishers. The typos, spelling mistakes, and grammatical errors were left in place intentionally—exactly as they were sent to me. These types of errors represent a big red flag that a predatory publisher is stalking you. Please don’t take them as a sign that the PLOS editorial staff is sleeping on the job!

The Greeting That Proves They Have No Idea Who You Are

Fake journals will address you in unusual ways, or not at all! Some make no effort to conceal that they merely cut and paste your name into the slot of a form letter. Here are some examples I have received.

Dear Dr. WJ William J,

Dear ,

>Dear Dr.Jr WJ ,

Dear Dr. SULLIVAN,

Dear Dr. William J. Sullivan, Jr.1,2*,

Dear Dr. Jr,

Dear Dr. Jr William,

Dear Author,

Dear Researcher,

Dear Dr. Ferris (or someone else who is not me),

The opening is almost always followed by something like this:

“Greetings from the [PREDATORY] Journal!!!” or “Hope you are doing great!!” or “Hope our e-mail finds you well and in healthy mood.”

A New Type Of Sport: Extreme Flattery

Reading these emails can be a big boost to your ego, but keep in mind that thousands of others received the same exact praise. I’ve been called “esteemed,” “brilliant,” “magnificent,” and the “leader in the field.” I should show these emails to my mom; fake or not, she’d be very proud to see her son put on such a high pedestal.

They also make our work out to be the greatest thing since the microscope:

“It’s your eminence and reputation for quality of research and trustworthiness in the field of [insert a field that usually has no relationship to mine whatsoever] and for which you have been invited to become an honourable editorial board member.”

“We have gone through your papers and find it is a wonderful resource for upcoming works.”

“It would be our honour and great fortune if you will share your manuscript.”

“It is our immense pleasure to invite you and your research allies to submit manuscript.” (I call all of my collaborators my Research Allies now, and warn everyone that if you’re not with us, you’re against us).

Some of the spam even tries to reassure you that it is not spam. Although I don’t doubt the last sentence (in bold).

“This is not a spam message, and has been sent to you because of your eminence in the field. If, however, you do not want to receive any email in future then reply us with the subject remove /opt-out. We are concern for your privacy.”

Smell the Desperation?

There is a palpable urgency in these invitations to get their hands on your work and cash before someone else does…

Emails from predatory publishers often close with more friendly compliments, with the hope that you will continue to provide them a constant research revenue stream…

“We look forward to a close and lasting scientific relationship for the benefit of scientific community.”

“Anticipating for a positive response!”

“Thank you and have a great day doctor!”

“Our journey is effectively heading.” (whatever that means)

And here are some of the more amusing ways they describe my anticipated work:

“We await your adorable paper.”

“Your paper will serve as a wave maker.”

“We aspire you also to be a significant part of our team by publishing your magnificent article.”

And one that is just plain baffling:

“For future anticipation to reach out scientific community we want your support towards the success of Journal.”

They may even try to get you to do their dirty work for them. I take the following sentence to mean that they want me to spread the word about their predatory journal and invite my colleagues to be suckered into submitting work to them as well:

“As this is an invited submission request, you can also suggest your colleagues.”

Some Final Warning Signs

Be wary of any inaugural issues or invitations that ask you to submit your paper through an email address.

These predatory publishers may also try to give you a false sense of the importance of their “journal.” One of these invitations boasted that “email newsletters are being circulated to 90,000+ subscribers.” All this tells me is that 89,999 other people received their unsolicited spam.

I’ve also noticed they are often willing to discount their publication fees if you send them something – “anything at all” – within seven to ten days. No legitimate journal expects you to whip up a quality article in that short a time.

Oh, and one more thing: many of these emails have an excessive number of exclamation points!!!

I hope that you are now better equipped to catch a predator. If you still aren’t sure if your email invitation describes a bona fide journal or not, you can check Beall’s List of Predatory Journals and Publishers or see if the journal is indexed on PubMed. Here’s hoping that you don’t get fooled!!!

BRIAN will be upgrading to a new version next week, so it will be inaccessible to users on Monday 30th April and Tuesday 1st May. The main improvement in this upgrade is the introduction of a new Assessment module to enable more efficient REF preparation. We also hope to introduce more user-friendly reporting over the next few months.

It’s been over 18 months since Bournemouth University launched its new Research & Knowledge Exchange Development Framework, which was designed to offer academics at all stages of their career opportunities to develop their skills, knowledge and capabilities.

Since its launch, over 150 sessions have taken place, including sandpits designed to develop solutions to key research challenges, workshops with funders such as the British Academy and the Medical Research Council, and skills sessions to help researchers engage with the media and policy makers.

The Research & Knowledge Exchange Office is currently planning activities and sessions for next year’s training programme and would like your feedback about what’s worked well, areas for improvement and suggestions for new training sessions.

Tell us what you think via our survey and be in with a chance of winning a £30 Amazon voucher. The deadline is Wednesday 28th March.

Open access is about making the products of research freely accessible to all. It allows research to be disseminated quickly and widely, enables the research process to operate more efficiently, and increases the use and understanding of research by business, government, charities and the wider public.

There are two complementary mechanisms for achieving open access to research.

The first mechanism is for authors to publish in open-access journals that do not receive income through reader subscriptions.

The second is for authors to deposit their refereed journal article in an open electronic archive.

These two mechanisms are often called the ‘gold’ and ‘green’ routes to open access:

Gold – This means publishing in a way that allows immediate access to everyone electronically and free of charge. Publishers can recoup their costs through a number of mechanisms, including through payments from authors called article processing charges (APCs), or through advertising, donations or other subsidies.

Green – This means depositing the final peer-reviewed research output in an electronic archive called a repository. Repositories can be run by the researcher’s institution, but shared or subject repositories are also commonly used. Access to the research output can be granted either immediately or after an agreed embargo period.

To encourage all academic communities to consider open access publishing, Authors Alliance has produced a comprehensive ‘Understanding Open Access’ guide, which addresses common open access questions and concerns and provides real-life strategies and tools that authors can use to work with publishers, institutions and funders to make their works more widely accessible to all.

Lizzie Gadd warns against jumping on ‘bad metrics’ bandwagons without really engaging with the more complex responsible metrics agenda beneath.

An undoubted legacy of the Metric Tide report has been an increased focus on the responsible use of metrics, and along with this a notion of ‘bad metrics’. Indeed, the report itself even recommended awarding an annual ‘Bad Metrics Prize’. This has never been awarded as far as I’m aware, but nominations are still open on their web pages. There has been a lot of focus on responsible metrics recently. The Forum for Responsible Metrics has surveyed UK institutions and is reporting the findings on 8 February in London. DORA has upped its game and appointed a champion to promote its work, and it seems to be regularly retweeting messages that remind us all of its take on what it means to do metrics responsibly. There are also frequent Twitter conversations about the impact of metrics on the upcoming REF. In all of this I see an increasing amount of ‘bad metrics’ bandwagon-hopping. The anti-Journal Impact Factor (JIF) wagon is now full, and its big sister, the “metrics are ruining science” wagon, is taking on supporters at a heady pace.


It’s not a bad thing, this increased awareness of responsible metrics; all these conversations. I’m responsible metrics’ biggest supporter and a regular slide in my slide-deck shouts ‘metrics can kill people!’. So why am I writing a blog post that claims that there is no such thing as a bad metric? Surely these things can kill people? Well, yes, but guns can also kill people, they just can’t do so unless they’re in the hands of a human. Similarly, metrics aren’t bad in and of themselves, it’s what we do with them that can make them dangerous.

So, you might have an indicator such as ‘shoe size’, where folks with feet of a certain length get assigned a certain shoe size indicator. No problem there – it’s adequate (length of foot consistently maps on to shoe size); it’s sensitive to the thing it measures (foot grows, shoe size increases accordingly); and it’s homogeneous (one characteristic – length – leads to one indicator – shoe size). However, in research evaluation we struggle on all of these counts, because the thing we really want to measure, this elusive, multi-faceted “research quality” thing, doesn’t have any adequate, sensitive and homogeneous indicators. We need to measure the immeasurable. So we end up making false assumptions about the meanings of our indicators, and then make bad decisions based on those false assumptions. In all of this, it is not the metric that’s at fault, it’s us.

In my view, the JIF is the biggest scapegoat of the Responsible Metrics agenda. The JIF is just the average number of cites per paper for a journal over two years. That’s it. A simple calculation. And as an indicator of the communication effectiveness of a journal for collection development purposes (the reason it was introduced) it served us well. It’s just been misused as an indicator of the quality of individual academics and individual papers. It wasn’t designed for that. This is misuse of a metric, not a bad metric. (Although recent work has suggested that it’s not that bad an indicator for the latter anyway, but that’s not my purpose here). If the JIF is a bad metric, so is Elsevier’s CiteScore which is based on EXACTLY the same principle but uses a three-year time window not two, a slightly different set of document types and journals, and makes itself freely available.
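The “simple calculation” described above can be sketched in a few lines. This is an illustrative sketch only – the journal and all the numbers below are hypothetical – but it captures the two-year JIF structure the post describes (and why CiteScore, with its three-year window, is the same kind of calculation):

```python
# Sketch of the Journal Impact Factor calculation described above.
# The journal name and all figures are hypothetical, for illustration only.

def impact_factor(citations_to_prior_items: int, citable_prior_items: int) -> float:
    """JIF for year Y: citations received in year Y to items the journal
    published in years Y-1 and Y-2, divided by the number of citable items
    it published in Y-1 and Y-2. CiteScore follows the same pattern but
    uses a wider citation window and a different set of document types."""
    return citations_to_prior_items / citable_prior_items

# A hypothetical journal whose 400 papers from the previous two years
# were cited 1,200 times this year:
jif = impact_factor(1200, 400)
print(jif)  # 3.0
```

Note that nothing in this average says anything about any individual paper in the journal, which is exactly the misuse the post is describing.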


I understand why DORA trumpets the misuse of JIFs; it is rife, and there are less imperfect tools for the job. But there are also other metrics that DORA doesn’t get in a flap about – like the individual h-index – which are subject to the same amount of misuse, but are actually more damaging. The individual h-index disadvantages certain demographics more than others (women, early-career researchers, anyone with non-standard career lengths); at least the JIF mis-serves everyone equally. And whilst we’re at it, peer review can be an equally inadequate research evaluation tool (which, ironically, metrics have proven). So if we’re to be really fair we should be campaigning for responsible peer review with as much vigour as our calls for responsible metrics.

Bumper stickers by Paul van der Werf (CC-BY)

It looks to me like we have moved from a state of ignorance about metrics, to a little knowledge. Which, I hear, is a dangerous thing. A little knowledge can lead to a bumper-sticker culture (“I HEART DORA”, anyone? “Ban the JIF”?) which could move us away from, rather than towards, the responsible use of metrics. These concepts are easy to grasp hold of, but they mask a far more complex and challenging set of research evaluation problems that lie beneath. The responsible use of metrics is about more than the avoidance of certain indicators, or signing DORA, or even developing your own bespoke Responsible Metrics policy (as I’ve said before, this is certainly easier said than done).

The responsible use of metrics requires responsible scientometricians. People who understand that there is really no such thing as a bad metric, but it is very possible to misuse them. People with a deeper level of understanding about what we are trying to measure, what the systemic effects of this might be, what indicators are available, what their limitations are, where they are appropriate, how they can best triangulate them with peer review. We have good guidance on this in the form of the Leiden Manifesto, the Metric Tide and DORA. However, these are the starting points of often painful responsible metric journeys, not easy-ride bandwagons to be jumped on. If we’re not careful, I fear that in a hugely ironic turn, DORA and the Leiden Manifesto might themselves become bad (misused) metrics: an unreliable indicator of a commitment to the responsible use of metrics that may or may not be there in practice.

Let’s get off the ‘metric-shaming’ bandwagons, deepen our understanding and press on with the hard work of responsible research evaluation.

Elizabeth Gadd is the Research Policy Manager (Publications) at Loughborough University. She has a background in Libraries and Scholarly Communication research. She is the co-founder of the Lis-Bibliometrics Forum and is the ARMA Metrics Special Interest Group Champion.

The outcomes of this year’s Teaching Excellence Framework (TEF) and the direction for the Research Excellence Framework (REF) as set out in the 2017 consultation response are likely to have significant implications for the higher education sector. The links between research and teaching are likely to become ever more important, but set against the context of increasing emphasis on student experience, how should the sector respond and where should it focus?

REF & TEF: the connections will be hosted at Bournemouth University and will bring together some of the leading experts in higher education in both research and teaching policy. During the morning, attendees will have the opportunity to hear from experts across the higher education sector as they share their insights into the importance of the links between teaching and research. The afternoon will feature a number of case studies with speakers from universities with a particularly good record of linking research and teaching.

Speakers confirmed to date include Kim Hackett (REF Manager and Head of Research Assessment, HEFCE), John Vinney (Bournemouth University), William Locke (University College London) and Professor Sally Brown (Higher Education Academy).

As part of the Writing Academy, a series of writing days have been organised to help support BU authors working on their publications by providing some dedicated time and space, away from everyday distractions.

The days will have a collaborative focus on productive writing with other BU authors. The RKEO team will also be on hand to provide authors with help and guidance on all areas of the publication process.

Writing Days have been scheduled on the following dates:

Friday 15th September 2017

Thursday 2nd November 2017

Friday 5th January 2018

Wednesday 7th March 2018

Tuesday 22nd May 2018

Monday 23rd July 2018

Spaces are limited so please only book on if you are able to commit to attending for the whole day.

Scopus has enhanced its article-level metrics through the integration of PlumX Metrics and, to support this, is hosting a webinar titled ‘How PlumX Metrics on Scopus help tell the story of your research’ on 10th August at 5pm.

The ThinkProductive Team will be visiting BU next Wednesday to deliver a 90-minute action-packed seminar on How to be a Productivity Ninja™. They will share with you the 9 Characteristics of the Productivity Ninja™ and help you to identify specific ways you can implement them.

If you want to learn the way of the Productivity Ninja™ then book on here!


On Wednesday 28th June, the Writing Academy will be hosting a Lunchbyte session with Sara Ashencaen Crabtree. During the session, Sara will talk about her personal publishing experience, her approaches to research and writing, her tips on developing a publication strategy, and working with co-authors, reviewers and editors. She will talk about all types of publishing, drawing on personal experience and focusing on international reach.