An attempted antidote to the More Means Worse argument used in higher education

Month: October 2018

Rankings, of all sorts and all qualities, are here to stay. There are the mega-rankings, with their spurious claims to objectivity and columns and columns of data (to measure how dissimilar a university is to Oxbridge). Then there are those which are shamelessly about plugging a product or service by running a bit of a survey to drum up a ranking.

But you have to admire Which?, who’ve previously stuck to the middle ground in their rankings, for striking out into an entirely numberless ranking.

Which? have a survey that’s supposed to get around the claims in prospectuses by going to the students for the truth. What they’ve done is run another round of a Youthsight survey (I have issues with Youthsight – but let’s move on), adding the data to that from previous years, so it now has more than 10,745 participants. Students were surveyed on nightlife, the political scene and so on.

The results are portrayed in an interesting way. Here’s the nightlife results:

So, a top 10, but without any ranking. Nor any sense of what data shows that these ten very fine universities have a better nightlife than the other 117 universities.

For students’ unions, we get a top 7.

But top for what? Advice services? Societies? And why a top 7 – was there a tie? There are 11 universities top for the sports scene, and a top 6 for both the creative and political scenes. There’s no clue as to how these arbitrary top groups have been chosen.

For the first time this year Which? have added an indicator on how much a university is helping students to be ‘job-ready’. This is a fairly contentious question, but worthy of proper study. This indicator is also drawn from three years of survey data but has only 6,103 participants. Here 38 universities are listed – a third of the sector – but we’re not told what substantive difference separates 38th from 39th.

Of course this is poor. It’s not as bad as some, but Which? have been making an important point that students should get more accurate advice and guidance. Last month they said:

Now any university that said that it had a ‘top’ students’ union because Which? said so would be making an unverifiable claim.

It’s not complicated: they should publish their data, and no news outlet should re-publish any of this stuff until they do.

How are the Office for Students (OfS) getting on with their registration of HE providers? There was a report on 2 October about delays to the process, and when some more universities were added to the list, the additions included another provider with a condition relating to A1 of the register.

With 122 providers on the register it’s still too early to know how the initial registration period has gone, but it was interesting that Iain Mansfield picked up that the three universities* with a condition all have it attached to A1 – the Access and Participation Plan. What about the other parts of the register?

OfS are dealing with the initial conditions of registration, and it’s possible to imagine that they are considering whether to set specific conditions here. Take condition B2:

The provider must provide all students, from admission through to completion, with the support that they need to succeed in and benefit from higher education

The OfS suggest that behaviours that would indicate this condition would be met are support for ‘all students to achieve successful academic and professional outcomes’ and data suggesting that there’s a ‘reliable and fair admission system’ resulting in successful completion.

We know that some providers attracted attention in the boom years of for-profit college expansion, offering HNDs to students. We have never really got a full account of how many learners took out loans and just how many completed their courses. One of the things that the OfS has been charged to do is to bring together the system of course designation that BIS/DfE operated with the HEFCE system. That’s a core reason we have the new regulatory framework, so we should expect this to be a key test of how it works in a diverse sector.

OfS will have access to data not in the public domain as part of its consideration of registration, but let’s look at some Unistats data for one provider which I spotted**. This is the rather alarming continuation data for an HND in Health and Social Care Management. According to the data no students continued or completed the course.

The entry qualifications data for that HND shows that 69% of the students had no or unknown prior qualifications, suggesting that there might not be much evidence of a reliable admissions system leading to successful student outcomes.

The provider isn’t currently offering that named HND, so it’s possible that the continuation data reflects some issue – say, students transferring to a different course – but the data from the five HNDs for that provider represented in Unistats (broken down into seven subject categories) seem to show a similar picture.

All figures are percentages:

| Subject | Continue at college | Complete the course | Complete different award | Taking a break | Left before completion |
|---|---|---|---|---|---|
| Business Studies | 3 | 0 | 0 | 18 | 79 |
| Business & Management | 2 | 0 | 0 | 18 | 80 |
| Health & Social Care Management | 0 | 0 | 0 | 17.17 | 82.83 |
| Hospitality Management | 1 | 0 | 0 | 19 | 80 |
| Information Systems | 4 | 0 | 0 | 15 | 81 |
| Computing | 2 | 0 | 0 | 13.13 | 84.85 |
| Network Engineering and Telecommunications | 1.98 | 0 | 0 | 12.87 | 85.15 |
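It is worth checking the arithmetic on those figures. A quick sketch (assuming each row is a set of percentages of entrants, which is how Unistats presents continuation data) confirms that every row sums to roughly 100%, and that the ‘complete the course’ column is zero for every subject:

```python
# Sanity check on the continuation figures quoted above, assuming each
# row is a set of percentages. Figures are copied from the table.
rows = {
    "Business Studies": [3, 0, 0, 18, 79],
    "Business & Management": [2, 0, 0, 18, 80],
    "Health & Social Care Management": [0, 0, 0, 17.17, 82.83],
    "Hospitality Management": [1, 0, 0, 19, 80],
    "Information Systems": [4, 0, 0, 15, 81],
    "Computing": [2, 0, 0, 13.13, 84.85],
    "Network Engineering and Telecommunications": [1.98, 0, 0, 12.87, 85.15],
}

for course, figures in rows.items():
    # Columns: continue, complete, complete different award, break, left.
    completed = figures[1]
    assert completed == 0, course            # no completions on any row
    assert abs(sum(figures) - 100) < 1, course  # rows sum to ~100% (rounding aside)
```

So, taken at face value, the dataset records not a single student completing any of these HNDs, which is exactly the discrepancy with the QAA monitoring figures discussed below.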

Data provided to the QAA at its latest monitoring visit put the retention and completion data at much higher points – at least 50% in 2015/16 and 93% in 2016/17 – so there must be a big data issue somewhere. Things can go wrong in the HESA record, but you’d expect something of this magnitude to be picked up. Unistats data is supposed to be accurate so as to inform applicants about the course.

This provider is part of a group, another part of which has strategically moved out of offering designated courses, but assuming that this college has applied to be an Approved provider it will be an interesting test case to watch. Hypothetically, if the performance for B2 really were as shown on Unistats, would it be so poor that OfS would refuse to put the college on the register, or would its risk profile still drive a set of specific conditions with a view to improvement?

Excitingly, because the provider doesn’t have sufficient data, it has a provisional TEF award. How long would that last?

The higher education provider meets rigorous national quality requirements for UK higher education, and is taking part in the TEF, but does not yet have sufficient data to be fully assessed. The provider may be fully assessed in future when it has sufficient data.

Since we had a change of minister, there’s been a lot less emphasis on for-profit providers driving up quality and driving down prices in higher education. There are many providers that were not funded by HEFCE which will thrive in the OfS registration categories, but a key test will be how OfS deals with those whose data shows performance that cannot be seen as acceptable. An outcome must also be a proper look at the few colleges that grew to offer thousands of places on HND courses. The NAO has done some work on this, but it has the potential to be a major ongoing issue, with students holding lifetime loans for courses they were unlikely to continue into the second year of.

* Having worked at two of these universities, I’m not going to comment on this.
** No names here, but I can confirm these data were on Unistats on 6 October 2018.