Research Assessment Exercise (RAE) 2008

January 20, 2011

Vince Cable and David Willetts are trying to steer universities in a new direction. So the ministers at the Department for Business, Innovation and Skills gave the Higher Education Funding Council for England unusually extensive and detailed guidance in the annual grant letter they sent at the end of December. What follows is the full text of that letter with annotations by me in red that are intended to explain, interpret and comment on what the government is doing.

November 12, 2010

What follows is an annotated version of the speech given by Richard Lambert on Browne and the Comprehensive Spending Review today. My annotations are shown in red.

The director-general of the Confederation of British Industry gets top marks from me for tackling both issues together. In the end, both bits of policy are just ways of talking about universities. Let's see how compelling his analysis is...

I’d like to discuss how the combination of Lord Browne’s report on university funding and the outcome of the Spending Review will shape the way that businesses and universities are likely to work together in the future. One way or another, the impact is going to be big.

November 11, 2010

The pilot exercise to test the Higher Education Funding Council for England's plans to assess economic and social impact in the 2014 Research Excellence Framework has reported back, and on the whole the people who took part—at least those who chaired the evaluation panels—are satisfied that it works.

This will certainly be a relief to HEFCE, who have faced harsh criticism over the past year from academics implacably opposed to impact assessment, though in some cases that ire would have been better directed at the research councils.

The pilot panel chairs did recommend some changes to make the system work better. They want the weighting given to the impact element to be reduced from the planned 25 per cent, at least for the first go-round, until everyone gets used to it. This is a sensible suggestion, which universities and learned societies have also been pushing for, and one that HEFCE is sympathetic to. It will almost certainly be heeded.

September 29, 2010

Cuts of 15 per cent in the research funded by the Higher Education Funding Council for England could redraw the map of research in England, with dozens of universities, hundreds of departments and tens of thousands of researchers potentially losing all their funding.

UPDATE. You can view the full institution-by-institution spreadsheet of losers cited by the Times yesterday and Guardian today here.

Research Benchmarks has looked at three scenarios for implementing cuts of 15 per cent in HEFCE’s £1.6 billion QR budget. The powerful impact on many different kinds of institutions in all three shows there is no easy option for cuts as ministers finalise their plans.

The three scenarios have been inspired by the different emphasis placed on cuts by Vince Cable, David Willetts and Universities UK.

March 18, 2010

England's top universities will get a boost in the research portion of their block grant from the Higher Education Funding Council for England in 2010-11, according to the funding council's preliminary funding allocations, published today.

The University of Oxford came out top, with a £7 million (6 per cent) rise in research funding. The University of Cambridge gained 3.7 per cent, University College London 4.3 per cent, Imperial College London 3.3 per cent and the University of Manchester 2.4 per cent. These institutions, the top five in terms of research funding, between them took 33 per cent of the £1.6 billion pot, a marginally larger share than last year.

Most of the rest of the Russell Group of large research-intensive universities also saw increases, though the Universities of Liverpool and Newcastle saw small declines. Losses at other universities were generally small, and were spread among many smaller institutions.

The change was mainly due to HEFCE's decision, in response to the government's desire for greater concentration of research funding, to increase the "slope" of the formula used to determine research allocation to more heavily favour the highest quality work as determined in the 2008 Research Assessment Exercise.

Few universities, however, saw large increases in their overall block grant funding. The overall pot of £7.3bn is 1.6 per cent down on last year, though both the teaching and research components rose. This is the first time HEFCE's grant has fallen since Labour came to power in 1997. The majority of the losses come from capital funding and other "special funding". This means, for example, that despite Oxford's impressive gain in research funding, its overall block grant is up just 1 per cent, mainly due to losses in funding for old and historic buildings.

Overall, around half of universities in England will receive less money than last year, with the other half posting small gains in cash terms. But many of those gains are less than inflation, so are essentially a cut in real terms. A few universities will see larger percentage cuts, for example the University of Reading and the London School of Economics. This is due to the end of "moderation" funding, which was used to smooth out drastic changes in grant allocations last year. The £20m in moderation funding HEFCE has available this year is being used to smooth out changes in other areas.

February 01, 2010

Oxford and Cambridge are set to benefit from the steeper funding ‘slope’ introduced by the Higher Education Funding Council for England for quality related research funding (QR) in 2010-11, according to our preliminary analysis, but the overall changes will be small.

On 1 February the funding council announced a new formula for distributing QR in a letter to universities in England and Northern Ireland. In response to the government’s desire for a greater concentration of research funding, the funding council has increased the weightings for 2*, 3* and 4* work, as judged by the 2008 Research Assessment Exercise, from 1:3:7 to 1:3:9.

The shift will benefit institutions that have a higher proportion of work in the 4* category in the 2008 RAE. First estimates from the Research Fortnight Benchmarking application show Oxford and Cambridge gaining about £4m a year between them, a rise of about 3 per cent, with the losers scattered among the English members of the various university groupings. The rest of the Russell Group shows a marginal gain with the 1994 Group showing a marginal loss, while both the University Alliance and the million+ groups lose about £2m. The University of Newcastle, which got just 14 per cent of its staff into the 4* category, is likely to be among the hardest hit, and could lose around half a million a year.
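The arithmetic behind the shift is easy to sketch. What follows is a toy calculation: the department profiles, the size of the pot and the two-department "sector" are all invented for illustration, not real RAE 2008 data. It simply shows why steepening the 2*:3*:4* weightings from 1:3:7 to 1:3:9 moves money towards departments with more 4* work.

```python
# Toy model of the QR weighting change; all numbers below are invented.

OLD_WEIGHTS = {"2*": 1, "3*": 3, "4*": 7}  # 2009-10 slope
NEW_WEIGHTS = {"2*": 1, "3*": 3, "4*": 9}  # 2010-11 slope

# Hypothetical quality profiles (research-active staff volume in each band).
DEPTS = {
    "A": {"2*": 20, "3*": 50, "4*": 30},  # strong at 4*
    "B": {"2*": 40, "3*": 45, "4*": 15},  # weaker at 4*
}

def allocate(depts, weights, pot):
    """Split a fixed pot in proportion to weighted research volume."""
    volumes = {name: sum(profile[band] * weights[band] for band in weights)
               for name, profile in depts.items()}
    total = sum(volumes.values())
    return {name: pot * vol / total for name, vol in volumes.items()}

pot = 1_000_000  # hypothetical pounds
old = allocate(DEPTS, OLD_WEIGHTS, pot)
new = allocate(DEPTS, NEW_WEIGHTS, pot)
for name in DEPTS:
    print(f"{name}: {old[name]:,.0f} -> {new[name]:,.0f}")
```

Because the pot is fixed, department A's gain is exactly department B's loss: a steeper slope concentrates the same money, it does not add any.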

Despite the small amounts of money involved, some universities are still concerned about the direction this is going. They point to the fact that HEFCE calls this an "initial step towards increased concentration" and worry that there are other plans afoot to concentrate it further in years to come.

HEFCE also decided to tweak the QR allocations for geography and psychology, which were left outside the ring fence for science disciplines last year. HEFCE says it recognises that “around half the research activity in these disciplines…could reasonably be regarded as more akin to work in STEM disciplines than to that in the other social sciences”, so the two subjects will receive 50 per cent of the additional funding that they would have received had there been no STEM ring fence. Got that?

December 22, 2008

The RAE2008 results have produced more information about the spread of the UK’s research achievement than we could have hoped for back in the 1980s. The profiling that Gareth Roberts proposed has overcome some of the worst deficiencies of the old grading system. But it is pretty clear that the data avalanche has buried a lot and it will take a while to dig ourselves out.

Evidence planned to get a post on the blog by Thursday lunchtime last week, and we failed. We failed because, in the words of Morris Zapp, ‘every decoding is another encoding’. Each time we ran a query we came up with new questions, and very few answers.

The best bit about the data is the profiling, and that is also the trip-wire for analysts. Profiling is great because it shows ‘what lies beneath’. For research, most activity is skewed. Some people earn a lot of money, train a lot of post-grads and publish a lot of papers. Most people are less Stakhanovite. Most papers get a couple of citations, or none. Some papers get hundreds of citations. So, wherever you look, you see a distribution that is anything but normal and that means that indices that use averages are not telling you much about the data spread and the centre of the distribution. The Roberts profile takes us away from those averages, and it takes us away from single grades with their terrible funding cliffs.

But how do you analyse a profile? One profile is OK. We can look and see the spread, and reflect on the balance of national and international and whether that means ‘good’ in our dictionary.

A set of profiles across one subject is even better – more work to do the comparisons but we can look up and down the list and see how things shift about. Interesting to see that one institution got a lot of 4* material but actually got less on [4* plus 3*] than another institution. Interesting to speculate on whether those top end 4* outputs are the most critical element or whether it is an overall grade point average that establishes a ranking. And how would you weight the elements in this subject, and would it be the same in that subject?

And that is what makes it very difficult when you start to try and create a combined picture across an institution, and then to put institutions into a single table. We work on the data and start to pick out some strange differences. Can it really be true that Anthropology, History of Art, Music and Drama all really have more than 20% of their output at 4* while Education, Psychology and Agriculture are under 10%? In media subjects and the arts we get 4* values as high as 65% ‘international leading’. I’m very happy for the institutions picked out but I do not really believe it. That means two-thirds of the activity was at the international cutting edge and that is an almost infeasible standard to meet.

Biology, Chemistry, Physics and Mechanical Engineering are in the low teens. Why so different? Two things. One is a much greater familiarity with the concept of categorising a portfolio of evidence about research activity. It’s a pretty basic part of science culture but it doesn’t come so easily to humanities and arts (check the scars on the Warden of Goldsmiths if you don’t believe me). So, for scientists, this is ‘what we do’.

The second thing is dialogue and consensus. Running through 2007, at a lot of meetings and on visits to institutions, I heard people talking about what they thought the RAE2008 outcome would be like. The view, which became a common one, was that selective submission meant few 1* items, a lot of 2*, not too difficult to hit 3* but bloody difficult to get a 4* except on the very best. I thought that view was universal but I now think it was much more the view of scientists, and of scientists in pre-92 universities, than it was of academia as a whole. But because the dialogue was going on, the scientists’ and the engineers’ consensus was reflected fairly consistently in the outcomes for the bigger subject areas.

More thoughts follow, but one thing is clear. Any league table you create now may look pretty shaky by Twelfth Night.

December 19, 2008

So, once again we have the gift of RAE results to brighten our festive cheer. But, in the spirit of the season, should we not pause a moment to reflect on what the outcomes actually tell us? There will, inevitably, be much obsessive interrogation of the outcomes and their implications in the days and weeks ahead but, in the midst of that, let us not forget the limitations of the RAE.

Much might have been done to refine the assessment process for 2008, building on the experience of panels in previous exercises. For example, this exercise could have tried to improve the consideration of interdisciplinary research, applied work, and the lot of early career researchers.

I have no doubt that all panel members in 2008 will have worked hard at these issues, and yet the framework they were given to work within looks very familiar and increasingly anachronistic.

Research has moved on apace: it is ever more collaborative, trans-national, interdisciplinary; it is increasingly done in knowledge sharing and translational partnerships with its non-academic users and beneficiaries; it is disseminated through a multitude of formats and media.

Yet, the RAE remains stubbornly disciplinary in its structure and, as I am sure closer examination of submissions will show, concerned with a rather narrow range of ‘safe’, conventional, academic outputs.

Now is not really the time to talk about the prospective Research Excellence Framework, but it is hard to see how the REF can be other than a retrogressive step in these areas as it is currently proposed. Research is complex and sophisticated: it deserves the same qualities in the systems by which it is evaluated. The RAE is not perfect, but it is a subtle and mature assessment tool, pioneered in the UK and recognised internationally as a model of good practice that commands confidence among the research community. It could be further improved, drawing on the skills and experience of panel members and our own growing research-based understanding of peer review processes. Should we really abandon it so lightly in favour of the questionably cheaper and undoubtedly nastier blunt instrument of bibliometrics?

The only genuinely major change introduced for the RAE 2008 is the one that is proving most problematic, in many ways, as we grapple with the results. The currency has changed: graded profiles replace numerical values. Moreover, the conversion rate from the old grading scale is far from clear, and significant questions remain over whether, and to what extent, the currency has been debased in the process.

Equally, the mix of base and precious metals does not appear, on first glance at the results at least, consistent across the different territories in the RAE landscape. There are variations in the grade profiles awarded by some panels that seem rather at odds with what we might otherwise expect of the disciplines they have assessed.

The results of previous RAEs have been broadly endorsed after the event by comparative bibliometric analyses and independent expert opinion. We will need the same tests to be applied to the 2008 results, I suspect, before we can have full confidence in their robustness and consistency.

Perhaps the most worrying aspect of the change in rating scale is that the familiar hallmarks of quality, the 5 and 5*s, which are understood and relied on by research partners, sponsors, prospective research students and staff, have disappeared. As a sector, we face major challenges ahead in communicating the meaning of the results to those outside our walls who are not steeped in the mysteries of the RAE and for whom the ‘ready reckoners’ of quality have gone.

The need to offer clear, unambiguous quality labelling of the UK research base to industry, the public, voluntary services, policy makers and potential international students and staff recruits, will only increase as we face the difficult financial times ahead.

The season calls for celebration, and there is undoubtedly much to celebrate in the outcomes of the 2008 RAE. Research in UK higher education institutions is bigger, better and more exciting than ever before, so let’s party. But as we do so, let’s remember that there is also much exciting, relevant, valuable and high quality research that is not included within these results. They are only one indicator of the quality of a limited selection of the work that is going on in the UK research base. Keep them in perspective!

John Rogers, Director of Research and Knowledge Transfer at the University of Stirling, was the Manager of RAE 2001.

Last February, the Council of the Royal Astronomical Society concluded that they had "no confidence" in the new Science and Technology Facilities Council, which a few months earlier had taken on the responsibility of funding UK astronomy.

"The STFC’s Delivery Plan pays lip-service to the need to foster the UK academic community – who play the key role in delivery of all of STFC’s outputs: first-class science, facility design and usage, and knowledge exchange – but has shown no evidence in its public statements or actions that it recognises this duty," announced the RAS, a normally cautious and diplomatic learned society. "The 25-per-cent decline in grants across the CSR period, with no sign of any intention or even desire to level this out in later years, has filled the community with deep pessimism and anger."

In a broad-ranging review of the current status of UK Physics, published in August, the Wakeham Panel reported the discipline to be in generally good health. However, as the review was set up in response to concerns such as those arising from the RAS Council, it was significant that the panel found that many of the strongest areas of UK physics were highly dependent on STFC support.

Furthermore, given concerns over a long-standing fall in physics A-level entries, it was important that the research areas in STFC’s remit, such as astronomy, space science and particle physics, are often also those most attractive to students, of both genders.

Adding to this turbulent scene, we now have another major input, with the results of the 2008 RAE. Profiling all university physics departments on the percentage of staff judged to be of national or international repute, the overall picture confirms that university research is indeed competing very well at the (critical) global level. Or, at least, was recently performing well, given that the RAE is necessarily backward looking (over the period from 2002 to 2007).

So, how important is this final RAE for university physics?

In addition to providing another assessment of the discipline in a globally competitive research context, departmental profiles will, for several years ahead, determine the level of QR funding from HEFCE. And this second funding stream is more important than ever, in my view, given the erratic performance of the STFC and the enthusiasm of the Engineering and Physical Sciences Research Council for top-down thematic programmes.

Although representing a smaller fraction of university research income than it did before full economic costing (fEC) was introduced, QR funding is likely to form around 30 per cent of research ‘overhead’ income for a typical research-active department such as my own. It provides ‘seed corn’ funding for young researchers not yet on the research council radar, and a useful counter to a reduction of external support for ‘blue skies’ research in response to government pressures for wealth-creation.

Translating a departmental research profile into the HEFCE grant will, of course, depend on the still-to-be-announced funding formula. If HEFCE follows earlier practice, this will be skewed to favour 3* and 4* researchers disproportionately. If used that way, QR funding will indeed be a significant cushion against STFC cuts, as it will particularly benefit those same strong departments that, paradoxically, are being hardest hit by the 25 per cent reduction in STFC grants.

However, recalling the damage done to the broader university physics community by previous RAEs, I would caution against straying too far from a linear funding formula, whereby a 4* researcher attracts 4 times the funding of a 1* researcher. Broadly spread QR funding could be crucial in avoiding more physics department closures, and an expansion of the regional ‘research deserts’ identified by the Institute of Physics.
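The difference between a linear slope and a skewed one can be made concrete with a minimal sketch. The weightings, the department profiles and the two-department "sector" below are all hypothetical, chosen only to illustrate the point, not HEFCE's actual formula.

```python
# Hypothetical comparison of linear vs skewed QR weightings.

LINEAR = {"1*": 1, "2*": 2, "3*": 3, "4*": 4}   # a 4* researcher attracts 4x a 1*
SKEWED = {"1*": 0, "2*": 1, "3*": 3, "4*": 7}   # 1* earns nothing, 4* heavily favoured

# Invented profiles: an elite department and a solid broad-based one.
ELITE    = {"1*": 5,  "2*": 20, "3*": 45, "4*": 30}
REGIONAL = {"1*": 15, "2*": 40, "3*": 35, "4*": 10}

def share(profile, rival, weights):
    """Fraction of a two-department pot captured under a given weighting."""
    mine  = sum(profile[b] * weights[b] for b in weights)
    other = sum(rival[b] * weights[b] for b in weights)
    return mine / (mine + other)

print(f"regional share, linear: {share(REGIONAL, ELITE, LINEAR):.3f}")
print(f"regional share, skewed: {share(REGIONAL, ELITE, SKEWED):.3f}")
```

With these invented numbers the broad-based department keeps roughly 44 per cent of the pot under the linear slope but only about 37 per cent under the skewed one; scale that gap across a £1.6 billion budget and the closure pressure on weaker departments is obvious.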

Prestige is also at stake in the RAE results, in addition to funding. In that respect, the current exercise appears badly flawed.

To give a true picture of a department’s research strength, it is clearly important to know whether all eligible staff were submitted for assessment, or whether substantial numbers were ‘hidden away’. That normalisation has been denied in the 2008 RAE by HEFCE deciding not to reveal non-submission data. More sinister is the rumour that HEFCE’s decision was made under threat of legal action by a number of universities who chose the more ‘selective’ approach.

In response, we are likely to see conflicting ‘league tables’, with the most meaningful – of research ‘intensity’ – using previous staff numbers to normalise the ‘quality’ table based only on submitted researchers.

So, should we welcome or regret the passing of the RAE?

I believe the RAE brought a new rigour to university research, and substantially raised the overall standards. However, as noted above, the harsher light led to some weak (and not so weak) departments being closed or merged. In physics, the number of university departments fell from 79 to 51 over the period of the 1991, 1996 and 2001 RAEs, with the undoubted loss of some important potential. On the other hand, the fall in student numbers probably made some shrinkage inevitable.

The Wakeham Report’s criticism that UK physics departments are sometimes too narrowly focussed has some validity, and has roots partly in the RAE. Another undesirable consequence of the non-linear funding of QR has been its influence on university appointments, with ambitious VCs competing – and often over-paying – for star performers in the manner of chairmen of Premiership football clubs.

Looking ahead, it seems that some metrics-based quality assessment will be used to guide the HEFCE component of dual research funding. No doubt there will be complaints that the particular metric is unfair to some. However, all universities must surely prefer to keep an imperfect system rather than to lose the dual-funding lifeline that must be in some danger now that both HEFCE and the research councils find themselves in the same part of Whitehall.

Ken Pounds is Emeritus Professor of Physics at the University of Leicester and former Chief Executive of the old Particle Physics and Astronomy Research Council.

Hertfordshire’s performance in the RAE 2008 is good news and demonstrates that our selective strategy of submitting 14 Units of Assessment was successful. The graded profiles are very pleasing, particularly with six Units of Assessment showing 50 per cent or more internationally excellent and world-leading research (combined 3* and 4* ratings). Graded profiles are a fairer approach, compared with the single quality numerical values of RAE 2001, in the way that they identify the range of research activity in percentage terms over the five categories.

STEM research at Hertfordshire has performed very well with strong outcomes in Physics, Computer Science and Informatics, and General Engineering. The recently established Pharmacy provision also records a very respectable performance. Furthermore, both Nursing and Midwifery, and History show exceptional performances, which put them towards the top of the university sector.

In these Units of Assessment, Hertfordshire has performed not only better than most of its post-1992 contemporaries but compares favourably with many longer established universities.

More generally, a number of post-1992 universities have shown real improvements in performance in specific Units of Assessment, and it is apparent that the 2008 RAE signals a further breakthrough in research for this sector. The relatively small investment to support research in the less research-intensive universities has produced excellent results overall.

Hertfordshire’s strategy over the past seven years has been to focus our somewhat limited resources on developing international excellence in selected research areas. This has not only proved successful but, in combination with the finer tuning in the assessment methodology, it now provides a platform for the university to build its research profile further.

The additional information that will be provided when the sub profiles of the assessment become available to individual universities in January 2009 will be invaluable. It will enable individual Units of Assessment to better understand the rationale for their overall profile, and to more fully identify their strengths and any weaknesses associated with their research activities.

Clearly, this will be particularly important for those Units of Assessment at universities that appear to have under-performed since the RAE 2001.

In addition to receiving our RAE results directly, we have benefited from using Research Fortnight’s RAE benchmarking software. This has enabled us to assess our position in quality research terms against other universities.

We are delighted to be placed number 53 in a table of 117 universities, which again is indicative of the quality of our submissions in relation to the overall sector. There will, of course, follow a very interesting period of reflection, and a requirement that the funding algorithm applied is fair in its support of quality research wherever it is found. Assuming this is the case, then the RAE 2008 will have set a new benchmark in the assessment of research in the UK.

The challenge for the now delayed Research Excellence Framework will be to surpass this exercise by further enhancing the recognition of cross-disciplinary research and research aligned to societal and economic impact.

John Senior is Pro-Vice-Chancellor (Research) and Graham Galbraith is Deputy Vice-Chancellor at the University of Hertfordshire.