The rise of national research assessments – and the tools and data that make them work

Nations are basing high-stakes decisions on reports that use Elsevier’s knowledge, tools and datasets – with impressive results in the UK’s REF2014

By Michiel Schotten and M’hamed el Aisati Posted on 17 December 2014

National assessments of a country's research performance are gaining traction worldwide, whether at the level of institutions, research units or individual researchers. This is mainly due to the immediate benefits such exercises offer: not only do they aim to boost the country's research performance; they also provide public accountability to taxpayers for the nation's research spending.

A national assessment can showcase the direct benefits of funded research to society by showing that it has stimulated the economy, improved healthcare and guided policy decisions. And the outcomes of a national assessment exercise may be used by government funding agencies in deciding how to distribute the research funding pie among the different institutions and research projects. This way, research excellence can be fostered by funding the best and most impactful research, providing the "best bang for the buck" from the taxpayers' perspective.

It may be no coincidence that the UK also features one of the longest-running and most elaborate national research assessment programs worldwide. Originally launched in 1986 as the Research Assessment Exercise (RAE), it was conducted approximately every five years until its final round in 2008, after which it was replaced by the Research Excellence Framework (REF).

The REF 2014, undertaken by the four UK higher education funding bodies to assess research quality across all UK universities, for the first time explicitly incorporated a research impact element: so-called REF "impact case" studies, which assess the non-academic impact of research. The use of citation data to support the evaluation of a limited number of subject areas was also introduced for the first time in REF 2014. The results of the assessment will be made publicly available on December 18, 2014.

The data behind REF2014

In 2011, Elsevier won the open tender issued by the Higher Education Funding Council for England (HEFCE) to provide the citation data, tools and expertise required to support the REF 2014 assessment. For every UK researcher included in the REF 2014 submission, up to four of what they considered their best scientific works (e.g., journal articles, conference papers, books, patents, software – all published in the period 2008-13) were to be submitted and then assessed by Expert Panels against the criteria of scientific originality, significance and rigor. In addition to the panels' peer review of the publications themselves, in a number of subject areas – so-called Units of Assessment (UOAs) – citation data for the publications were also used as a potential indicator of research quality.

It was these citation data that Elsevier provided for the REF 2014, again using Scopus, the largest abstract and citation database of peer-reviewed research literature, as the database of choice. With its 55 million records from over 21,900 scientific journals, conference proceedings and books, provided by more than 5,000 international publishers in 105 countries, Scopus has the broadest coverage and offers accurate, comprehensive citation data. And HEFCE is not alone in its choice of Scopus: as of last month, Scopus was also selected as the official partner to provide data for the influential Times Higher Education (THE) World University Rankings for 2015 and beyond.

For the REF 2014, Elsevier's Analytical Services team developed a sophisticated "Scopus matching module" that was incorporated into HEFCE's own REF submission system. This enabled UK institutions to submit all publications for which citation data were required – those belonging to one of the 11 Units of Assessment (out of 36 UOAs in total) subject to citation analysis in the REF 2014 exercise, such as Chemistry, Physics and Biological Sciences. Institutions received an immediate match against the corresponding Scopus record, together with the number of citations the publication had received from other records in Scopus. The match was based on the article metadata submitted by the institution – such as article title, authors, year, DOI, volume and page numbers – which a bespoke algorithm used to link the article to the correct Scopus record.
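The matching step described above might be sketched roughly as follows. This is an illustrative toy, not Elsevier's actual algorithm: the records, field names and similarity threshold are all invented, and a real matcher would handle far more metadata and edge cases.

```python
from difflib import SequenceMatcher

# Toy stand-in for a Scopus index (hypothetical records, not real data).
SCOPUS_INDEX = [
    {"eid": "2-s2.0-001", "doi": "10.1000/j.chem.2009.01",
     "title": "Catalytic oxidation of methane", "year": 2009, "citations": 41},
    {"eid": "2-s2.0-002", "doi": None,
     "title": "Graphene transport properties at low temperature",
     "year": 2011, "citations": 87},
]

def match_submission(meta, index=SCOPUS_INDEX, threshold=0.9):
    """Return the best-matching record for a submitted publication.

    Illustrative strategy: an exact DOI match wins outright; otherwise
    fall back to fuzzy title similarity within the same publication year.
    """
    doi = (meta.get("doi") or "").lower()
    if doi:
        for rec in index:
            if rec["doi"] and rec["doi"].lower() == doi:
                return rec
    best, best_score = None, 0.0
    for rec in index:
        if rec["year"] != meta.get("year"):
            continue
        score = SequenceMatcher(None, meta["title"].lower(),
                                rec["title"].lower()).ratio()
        if score > best_score:
            best, best_score = rec, score
    return best if best_score >= threshold else None
```

A DOI submitted with a typo in the title would still match exactly, while a publication submitted without a DOI can still be linked through its title and year – which is why systems of this kind ask institutions for as much metadata as possible.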

During the REF, some 250,000 articles were submitted for Scopus matching by UK institutions. When a match was returned, including the Scopus citation count for that article, institutions could use this as additional information in making a final call on which publications to submit for REF 2014. For the same purpose, all participating UK institutions – regardless of whether they were already Scopus customers – were given free access to Scopus Preview mode for the duration of the REF 2014 submission period, providing additional details about the articles. On top of this, a dedicated REF Support Team at Elsevier assisted institutions with any queries on Scopus data, citation counts and their REF submission – a service that was much appreciated, according to customer feedback.

In addition to the support provided to UK institutions, Elsevier also delivered citation benchmarks to HEFCE and the Expert Panels. These put the citation counts of submitted publications in the context of the relevant subject area, so that the citation count of, say, a Health Sciences article could be meaningfully interpreted and compared with that of a Physics article. This helps ensure that metrics are used wisely and only in "apples-to-apples" comparisons. Elsevier also acted as a collaborating partner to HEFCE during the extensive preparations that an assessment as elaborate as the REF 2014 requires: Elsevier provided Scopus APIs to incorporate into HEFCE's submission system, along with training, consultancy and pilot runs. Every effort was made to ensure a smoothly running assessment with the least administrative burden for all stakeholders involved. According to Dr. Vicky Jones, REF Deputy Manager at HEFCE, "(Elsevier) took the time to understand what we were trying to achieve and what we needed. … It went very smoothly. Good lines of communication and a good relationship meant that any issues were quickly resolved. We were very pleased with the outcome."
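The "apples-to-apples" idea behind such benchmarks can be illustrated with a simple field normalization: dividing an article's citation count by the average for publications in the same field and year. The baseline figures below are hypothetical, not real REF or Scopus benchmarks.

```python
# Illustrative world-average citation counts per (field, publication year).
# These numbers are invented for the example.
FIELD_BASELINES = {
    ("Health Sciences", 2010): 18.2,
    ("Physics", 2010): 7.5,
}

def field_normalized_impact(citations, field, year, baselines=FIELD_BASELINES):
    """Citations relative to the field/year average (1.0 = field average)."""
    return citations / baselines[(field, year)]

# The same raw count of 15 citations is below average in Health Sciences
# but twice the average in Physics:
health = field_normalized_impact(15, "Health Sciences", 2010)  # ≈ 0.82
physics = field_normalized_impact(15, "Physics", 2010)         # = 2.0
```

Comparing the raw counts alone would suggest the two articles performed equally; the normalized values show why citation data needed subject-area context before the Expert Panels could use them.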

Pure, SciVal and the REF 2014 Results Analysis tool

Other tools from Elsevier also contributed to the REF analysis process. For example, a dedicated REF 2014 submission module was specially developed for Pure, Elsevier's Current Research Information System (CRIS), which universities use to manage and track all their internal and external research management data sources. The Pure REF module allowed for an easy and streamlined REF submission process, which – apart from the "up to four publications per researcher" – also required a wealth of additional data from the UK institutions, such as staff FTE numbers, research income and the number of doctoral degrees awarded. Because the REF 2014 submission rules were built into the Pure REF module, and it also streamlined the submission of the "impact case" studies now required for the REF, it greatly reduced the overall REF submission workload. Dr. Claire Dewhirst, Research Impact Manager at Queen's University Belfast, said, "We managed everything within the REF module. We didn't really work within the HEFCE system at all." And Dr. Mark Cox, Head of Research Information Systems at King's College London, said, "I interviewed staff members in the university and they said that they didn't know how they would have got by without Pure. In fact they didn't know how any other university could get by without a system similar to Pure."
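Building submission rules into the software is what saves institutions work: invalid entries are caught before they reach the funder's system. A minimal sketch of one such rule – the "up to four outputs per researcher, published 2008-13" requirement – might look like this (the function and data shapes are hypothetical, not Pure's actual interface):

```python
# Hypothetical validator for one REF submission rule: at most four outputs
# per researcher, all published within the 2008-13 assessment window.
def validate_outputs(researcher, outputs, max_outputs=4, window=(2008, 2013)):
    """Return a list of human-readable rule violations (empty if valid)."""
    errors = []
    if len(outputs) > max_outputs:
        errors.append(f"{researcher}: {len(outputs)} outputs submitted, "
                      f"maximum is {max_outputs}")
    for out in outputs:
        if not (window[0] <= out["year"] <= window[1]):
            errors.append(f"{researcher}: '{out['title']}' ({out['year']}) "
                          "falls outside the assessment period")
    return errors
```

A research office could run checks like this continuously as records are entered, rather than discovering problems at the submission deadline.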

Another Elsevier tool used by several UK universities to prepare for the REF is SciVal, which allows quick and easy analysis of the research performance of over 5,200 research institutions in 220 countries. Its main use during the REF submission preparation was to evaluate faculty staff members and the impact and quality of their research papers prior to submission. In its latest release on October 15, SciVal also offers the option to classify journals according to the 36 REF 2014 Units of Assessment, as well as by the 22 so-called Fields of Research (FoRs) used for the Excellence in Research for Australia (ERA 2015) assessment.

Finally, the Results Analysis Tool has helped UK institutions analyze their own REF 2014 results ahead of the official public announcement on December 18. The new tool (approved for institutional use by the REF team at HEFCE) allows instant performance assessment across a number of measures, both nationally and relative to chosen peers, helping institutions quickly prepare internal and external responses. So far, over half of all higher education institutions in the UK have signed up for the tool. This completes the circle of support Elsevier provided during all phases of the REF 2014: the extensive preparation and planning process, the actual submission of research outputs and additional data, and the interpretation and analysis of the final REF outcomes.

National assessments in other countries

Beyond the UK, national research assessments are also on the rise in other countries. An assessment comparable in size and sophistication to the UK's REF is the ERA national assessment in Australia, mentioned above. ERA has been organized by the Australian Research Council (ARC) about once every two or three years since its first edition in 2010. As with the REF, Elsevier's support consists of matching the research outputs submitted by the Australian universities with their corresponding records in Scopus (through a custom-designed submission web portal that uses an elaborate Scopus EID tagging algorithm), and providing static citation counts for the submitted articles as well as citation benchmarks to put these numbers in the right context. Elsevier has been the partner of choice for the ARC in designing and planning its national assessment exercises, and even helped set up the first pilot ERA exercise in 2009. Additionally, Pure now offers a dedicated ERA submission module.

Furthermore, the Fundação para a Ciência e a Tecnologia (FCT) in Portugal organized an evaluation of all Portuguese R&D units in 2014, for which it selected Elsevier to provide citation data, benchmarks and bibliometric analyses. For this purpose, the FCT required every Portuguese researcher to create a personal research profile in ORCID (Open Researcher and Contributor ID, an open, community-based platform) and to import all their publications indexed in Scopus into their new ORCID profile. Here, the enhanced integration between Scopus Author Profiles and ORCID – supported by the "Scopus 2 ORCID feedback wizard" tool as well as new author details exporting functionality in Scopus – proved to be of immense value. The Portuguese evaluation exercise was also the first national assessment to incorporate and promote the use of so-called Snowball Metrics, a community-driven, bottom-up initiative by the universities themselves to define a set of global standard metrics for institutional benchmarking.

The long-term impact of research assessments

So it certainly looks like national research assessments are here to stay. This represents a cultural change for many of the institutions and researchers that are subject to an assessment, and obviously a fair balance needs to be found between the administrative load for institutions and the thoroughness of an assessment – as well as following other best practices, such as those laid out in the San Francisco Declaration on Research Assessment. But this change has potential benefits for everyone involved as well – such as improving the quality and standing of a country's research output, as has arguably happened in the UK. With the expertise Elsevier has built up over the years as a development partner supporting such national assessments, together with its comprehensive data and tools, Elsevier is looking forward to partnering with other countries as well.

Elsevier's Research Intelligence

As part of Elsevier's Research Intelligence services, the Analytical Services team collaborates with nations around the world to prepare research assessments. They use data from Scopus – the world's largest abstract and citation database of peer-reviewed literature – along with sophisticated tools to show how institutions and nations compare.

Snowball Metrics

University research executives need metrics for all the research activities in which their institution invests resources and would like to excel. Therefore, representatives of major research institutions agreed to use Snowball Metrics, which cover all such activities and include an additional set of denominators that can reveal research strengths at a more granular level or help normalize for institutional size. The chart below shows the metrics used for the FCT assessment of all Portuguese R&D units in 2014:
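The role of those denominators can be sketched with a simple example: dividing an absolute figure such as research income by staff FTE puts institutions of very different sizes on a comparable footing. All figures and institution names below are hypothetical.

```python
# Hypothetical institutional figures; a Snowball-style denominator
# (staff FTE) converts absolute totals into size-normalized metrics.
institutions = {
    "Alpha University": {"research_income_gbp": 48_000_000, "staff_fte": 400},
    "Beta College":     {"research_income_gbp": 15_000_000, "staff_fte": 100},
}

def income_per_fte(data):
    """Research income per staff FTE for each institution."""
    return {name: d["research_income_gbp"] / d["staff_fte"]
            for name, d in data.items()}

per_fte = income_per_fte(institutions)
# Alpha University dominates in absolute income (48M vs 15M GBP), but
# Beta College is stronger per FTE: 150,000 vs 120,000 GBP per FTE.
```

The same pattern applies to any Snowball denominator – citations per FTE, income per doctoral degree awarded, and so on – which is what makes the benchmarks meaningful across institutions of different sizes.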

Elsevier Connect Contributors

Michiel Schotten is National Research Assessments Manager at Elsevier, based in Amsterdam. He has been involved with managing the day-to-day work around the submission of research publications by institutions, and their matching to the Scopus database, both for REF 2014 in the UK and currently for ERA 2015 in Australia.

M'hamed el Aisati is Director of Content and Analytics in the Research Management department of Elsevier, based in Amsterdam. After joining Elsevier in 1998, M'hamed contributed to the realization of the first Elsevier digital journals platform, the predecessor of ScienceDirect, and later, in 2004, was involved in Scopus from its inception and helped launch it. In his current role, M'hamed heads a content and services team that looks after research management content and analytical services and supports large research performance evaluation programs, as part of the Elsevier Research Intelligence suite. M'hamed has been a key player at Elsevier in developing the Scopus matching submission platforms for all ERA assessments and for REF 2014, and in establishing Elsevier's collaboration with both ARC and HEFCE to help organize these assessments.

M'hamed holds an MSc in Computer Sciences from the University of Amsterdam. He has published several papers in peer-reviewed journals and holds two patents.