Non-news story of the week

Even by the standards that now characterise policy debate and government pronouncements about the student experience, media coverage of the recent HEPI report plumbs new depths.

The report in question and the media recycle assertions similar to those made when the first report came out in 2006 – that variations in the contact time students have with lecturers are grave matters, that students are dissatisfied if they get fewer contact hours, that if students pay more in fees then they should expect and get more contact time (why?), that time spent studying is linearly and causally related to the quality of learning achieved, and that students in the UK are getting a bad deal and probably inferior degrees compared with those in the rest of Europe…

These are claims that fly in the face of evidence and logic.

Just a few points:

What students say they get is not a measure of how much they actually get. Self-reports of time spent are a poor substitute for hard data.

The new report somewhat disingenuously says that it and the previous ones are not mainly about contact time. Apparently it was all the fault of universities for thinking that they were.

The fees are not extra money; they replace taxpayers’ money. Whether or not more contact is a good thing, it is illogical to assert that the same amount of resource should lead to more teaching time. This is a rhetorical sleight of hand on HEPI’s (and the government’s) part.

Beyond a certain minimum, the amount of time is unrelated to quality. It is not how many hours you study or spend in lectures that matters: it is the value you add to your learning during that time.

The report itself provides evidence that there is no relationship between National Student Survey results and either contact time or private study time. Yet it invites us to draw the opposite conclusion.

Does anyone really believe that students in Italy and Spain receive a better education than in the UK because they spend more time in class?

In fairness, quite a few comments on the Guardian’s story about this report pick up similar issues and articulately shred the media hype.

The report does raise some interesting issues, though. I stand by the response I made to the first HEPI report back in 2006 (except the parts that are a naked self-promotion of the HEA, of which I was then chief executive):

As a social science undergraduate, I studied statistics. Being something of a duffer at maths, I spent five hours of private study on what my friend Kevin (maths A-level) did in 30 minutes. I was at every lecture 10 minutes early. He skipped half of them. We both passed. We were both happy with the value for (taxpayers’) money the course gave us.

The point is that the number of hours you study does not tell you about quality of learning, student satisfaction, or value for money. It does tell you about students’ experiences, and we need to know more about them. That is why the Higher Education Academy funds research that tells us about those experiences.

The Higher Education Policy Institute (Hepi) report gives us a snapshot of how around 15,000 first and second-year students in English universities, taking a range of subjects, perceive the services and inputs they receive. A striking conclusion is the high level of student satisfaction with their academic experience. This is consistent with the findings of the National Student Survey.

What do the reported differences in hours spent tell us about the student experience in English universities?

We learn something about inputs, but very little about outcomes. People learn in different ways and at different paces. The relationship between hours invested and students’ learning outcomes is intricate. Research into students’ approaches to learning suggests that a high number of contact hours and private study does not automatically mean they learn better. You can fritter away 30 hours in front of the computer or your books and emerge with very little to show for it. You can go to a lecture and remember nothing significant. What is more important than time is the quality of the engagement – the degree to which you try to understand the material.

How any one student learns is a complex mixture of motivation, ability, peer pressure, available learning resources, previous knowledge, learning opportunities outside the classroom and other factors – as I found when I sat next to Kevin in statistics classes all those years ago.

The students surveyed by Hepi were broadly satisfied above and below a certain number of hours of set teaching time. It would be good to find out whether it is the hours themselves or the quality of what they do with those hours that affects their view. Are students who study longer hours more or less likely to be positive about their overall HE experience and to succeed?

Nor does the report tell us much about the degree system in English universities. Assessment does not depend on how many hours it took someone to complete a programme of study. A strength of the UK sector – not just in England – is the freedom it gives universities and colleges to set their own parameters and students the chance to find the method of learning that best suits them. It would be a pity if a crude cut of the data reported on hours spent became another form of league table (longer hours equals harder degree).

What the report does do very successfully is open up a number of policy areas around widening participation and definitions of “full-time” students. There is striking variation in the proportions of students at the different institutions who do differing hours of paid employment. This survey found that the impact of paid work on student satisfaction and on academic outputs is relatively small. The greatest reported impact is on perceptions of value for money among students who have paid jobs on top of their studies. This is an area crying out for further investigation.

The Higher Education Academy funded Hepi’s research because it is important to understand how students say they use their time. These are student experiences – not facts about quality. The quantitative benchmarks established, while limited, will provide a valuable basis for comparison as the impacts of fees and the next stages of the widening participation agenda begin to filter through to universities. We still need to find out more about the student experience: this report raises important questions about it.