Creating value for business and the UK economy – evaluating our impact

Karen Lee, Head of Impact, discusses the importance of understanding and measuring impact following the publication of the Hartree Centre’s first evaluation study.

The Hartree Centre is transforming UK industry through high performance computing, big data and cognitive technologies.

That’s our mission. But how do we know whether the research and innovation support we provide to businesses actually creates any value to them or the UK economy? Do we really need to know?

The quick answers to these questions are ‘through impact evaluation’ and ‘yes, we do’. But I would urge you to humour me a little and read on…

For the Hartree Centre, measuring impact is about how we quantify, validate and articulate the benefits and outcomes of our specialist expertise, technologies and activities. For example, this could be new or improved products and services, creating gains in performance, skills development, new technology adoption, business growth or increased competitiveness.

But evaluating impact isn’t easy. There are plenty of challenges, and there is no one-size-fits-all approach to doing it either. This year I took on a new role within the centre – swapping marketing what we do for measuring what we do. And so far it’s been a fascinating challenge.

I’ve spent the last few months working with our clients, partners, colleagues across STFC and the Technopolis Group (the independent evaluators who undertook the study), on finalising the Hartree Centre’s baseline impact evaluation. This provides the first picture of the early benefits to industry, society and the UK economy (you can read the full report on our website, or if you are short of time there’s the summary version).

An infographic of some of the headline findings from the Hartree Centre Phase 1 and 2 Baseline Evaluation Report

Evaluating impact isn’t about knowing the number of companies and sectors that we have worked with, or even hearing about the projects we have worked on with them (though these are all positive messages for engaging our stakeholders about what we do). For me, evaluation is about being able to demonstrate that what we are doing is an effective use of public funding. It’s about understanding why what we are doing is having an impact: what works, what doesn’t, whether we are doing the right things, what can be improved and what comes next.

One of the Civil Service’s impact evaluation ‘bibles,’ the Magenta Book, says something similar: “evaluation is an objective process of understanding how a policy or other intervention was implemented, what effects it had, for whom, how and why.”

The need to use impact evaluation to ‘prove and improve’ the support we provide to industry is something that those developing and managing the UK’s research and innovation funding mechanisms increasingly want, to help shape future policy and resource allocation.

As I said earlier, impact evaluation isn’t easy – it can be hard to quantify the benefits, there can be time lags before the benefits materialise, and how do you know it was your support that made the difference? And then there’s the issue of getting the right data in the right depth and balancing this with client confidentiality.

No evaluation is perfect. But a good evaluation ensures that the most robust methods possible are applied, using multiple methods to triangulate findings. Key to this is making sure that the right data is captured and monitored.

Through our baseline evaluation we have a stronger understanding of where we are creating value, what is working well and how we can move forward to make creating and measuring impact part of our culture.

The report made a number of recommendations about how we could start to build a longer-term evaluation programme to support our impact measurement. As we build our evidence base by adapting these to suit the centre’s needs, we will, over time, be able to better design our activities to further increase our impact and maximise the return from public funding.

We’ll publish our outline evaluation framework next year – and we know that there will always be room for improvement. If you have any thoughts, stories or lessons about impact evaluation from your own experiences, do get in touch!