Tags

Are UK universities biased in their admissions standards against less privileged applicants? That’s the million-dollar question, the answer to which could spawn a thousand column inches on both sides of the debate. Today’s data release from UCAS was supposed to give us the definitive answer: for the first time, application and offer data at the institutional level has been made available, broken down by sex, ethnicity and socio-economic background.

UCAS has been widely accused in the past of keeping such data away from prying eyes, and today’s release certainly looks like one made out of reluctance rather than full willingness. Not that UCAS would say so; they argue that this is a new ‘service’ to universities and a natural step forward in their work to release data. It is also the first step in meeting the government’s challenge to the sector to drive the ‘transparency revolution’ heralded by the White Paper.

The data shows that a sizeable number of universities make statistically significantly fewer offers to black applicants and to applicants from the most disadvantaged areas, even if the sector does not appear to show any bias at an aggregate level. Oxford and Cambridge are two institutions that do not appear to show systematic or consistent bias against black or less privileged applicants. The data also shows that geography has a substantial effect on the racial diversity of campuses, and that specialist arts institutions may face a particular challenge when it comes to making fair offers to applicants from the least advantaged neighbourhoods.

Data wars

The pressure on UCAS and the sector to release more data has largely been provoked by conflicting conclusions about the data that is already available. The arguments largely boil down to methodological disputes only comprehensible to social scientists.

UCAS has found itself at an impasse with researchers over what admissions data is telling us about fair access, particularly for black and ethnic minority applicants to high tariff universities. Analysis by the Equality Challenge Unit and research by Vikki Boliver at Durham University argue that there is evidence of systemic bias in the process. UCAS claim there is not.

Nonetheless, the researchers’ conclusions made enough noise to lead the Prime Minister to announce in January that admissions would be made name-blind and that more data would have to be released to prove that universities were welcoming ethnic minority and socially disadvantaged candidates. Meanwhile, a row erupted over UCAS’s reluctance to release their full set of individualised data to the ESRC’s Administrative Data Research Network (ADRN) for further scrutiny and to be linked up to HMRC and DWP datasets. UCAS based their opposition on a survey of students which suggested that such a release might be viewed as a breach of privacy and that students should have the right to withhold their data.

UCAS’s objections ultimately proved futile, with the White Paper announcing plans to legislate for this data to be made available. UCAS are already negotiating with the ADRN to preempt this requirement and make the full dataset available in the near future. The whole episode has raised some fundamental questions about the governance of UCAS, and questions have been asked about whose interests the admissions body is meant to prioritise. The government’s stance implies that it believes UCAS needs greater independence from the established sector, and that the supposed authority on university admissions should not be owned by universities. This issue will only become more pertinent with the expansion of the ‘challenger’ sector.

Analysing the data – what we know so far

Although the headline suggested by UCAS that there is “no evidence of bias within the admissions system” is broadly true at an aggregate level, a closer analysis of the data shows that there are substantial variations between institutions. It turns out that plenty of institutions show enough variation in their offer rates to POLAR 1 (i.e. the most disadvantaged) and black applicants, but these are not all the usual suspects. It would be a substantial category error to view the sector homogeneously on this issue.

Bias against the most disadvantaged applicants and black applicants

From our analysis of the 2013, 2014 and 2015 datasets, the following institutions have had a significantly lower offer rate for the most disadvantaged applicants than the average offer rate in two of the last three years:

University of the Arts, London

University of Birmingham

Birmingham City University

University of Bristol

Arts University Bournemouth

Coventry University

Imperial College London

Manchester Metropolitan University

University of Nottingham

Queen Mary, University of London

University of York

It is important to remember that this is not a measure of the number of disadvantaged students that get accepted into these institutions, but rather the rate at which they get offers in comparison to similarly qualified applicants. An institution may accept very high numbers of disadvantaged students, but still not give enough offers to adequately match the number of qualified applicants.
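The distinction between the two measures can be made concrete with a minimal sketch (all numbers here are hypothetical, not taken from the UCAS release): an institution can accept many disadvantaged students in absolute terms while still making offers to them at a lower rate than to its applicant pool overall.

```python
# Hypothetical illustration of acceptance counts vs offer rates.

def offer_rate(offers, applications):
    """Offers made as a proportion of applications received."""
    return offers / applications

# Hypothetical institution: POLAR 1 applicants vs all applicants.
polar1_offers, polar1_apps = 600, 1_000        # 600 offers: a large absolute number
overall_offers, overall_apps = 14_000, 20_000  # but a 70% overall offer rate

gap = offer_rate(polar1_offers, polar1_apps) - offer_rate(overall_offers, overall_apps)
print(f"POLAR 1 offer-rate gap: {gap:+.1%}")  # negative despite the 600 offers
```

The institution makes 600 offers to POLAR 1 applicants, yet its POLAR 1 offer rate (60%) still sits ten percentage points below its overall rate, which is the kind of gap the lists above are built on.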

This becomes particularly clear when we analyse the situation for black applicants. From our analysis of the 2013, 2014 and 2015 datasets, the following institutions have had a significantly lower offer rate for black applicants than the average offer rate in two of the last three years:

Anglia Ruskin University

University of Bath

University of Bedfordshire

University of Birmingham

Birmingham City University

Bournemouth University

Brunel University

University of Central Lancashire

University of Chester

Coventry University

De Montfort University

Durham University

University of Greenwich

University of Hertfordshire

Imperial College London

University of Kent

King’s College London

Kingston University

University of Leeds

University of Lincoln

Loughborough University

Manchester Metropolitan University

University of Manchester

Middlesex University

Nottingham Trent University

Newcastle University

Oxford Brookes University

Sheffield Hallam University

The variety of the institutions on this list is intriguing. Some, such as Manchester Metropolitan, Birmingham University and De Montfort, receive and accept some of the largest numbers of black applicants in the country.

So while the overall sector headlines suggest that bias against both black and POLAR 1 candidates is negligible, the institution-level data suggests there is some work to do in many universities to eliminate possible unfairness. The flipside is that those institutions without any significant sign of bias are ‘pulling up’ the rest of the sector’s overall score.

Oxbridge not guilty?

Someone from a POLAR 5 (most advantaged) area is 16 times more likely to go to Cambridge than someone from a POLAR 1 area, and 14 times more likely to go to Oxford. A white student is 3.25 times more likely to go to Cambridge than a black student, and 2.74 times more likely to go to Oxford. Does this mean that Oxford and Cambridge have an unfair admissions process?

Today’s data suggests the verdict is probably not. While POLAR 1 candidates did receive fewer offers from Cambridge in the 2015 cycle, they did not in previous years, so we cannot call it a definite trend. Meanwhile, POLAR 1 applicants to Oxford received slightly more offers than expected this year. In fact, while the average offer rate decreased at Oxford in 2015 (meaning it was harder overall to get in), the number of POLAR 1 offers stayed just about the same.

Similarly, neither institution appears to be guilty of bias or inequity in giving offers to black students. While the number of black students at these two institutions is still extremely low, it would appear that to improve the situation, the focus would need to be on ensuring more black students apply in the first place. The Prime Minister thus appears to be mistaken in singling out Oxford for discrimination against black applicants.

Are institutions being proactive enough?

In theory, if institutions were taking proactive steps to offer places to disadvantaged and black applicants, we would see a significant positive difference in offer rates between POLAR 1 and the average; institutions would be giving sufficient extra offers to applicants from poorer backgrounds to account for their disadvantage. While we know that many institutions do this, we can perhaps infer that the use of this contextual data may not be making up for biases elsewhere in the system.

One of the ways some of this aggregate data is being spun is to put the blame for unequal access firmly at the feet of the schools system, but surely universities must be more proactive themselves, however contentious the issue of ‘affirmative action’ or ‘positive discrimination’?

That said, data about one Russell Group institution gives some unexpected results. POLAR 1 applicants to the London School of Economics were 26.8 (!) percentage points more likely to get an offer than the average in 2015. Some might put this down to a mere statistical anomaly. LSE tell us that they introduced a new ‘flagging’ system for POLAR 1 candidates in 2015, but working out whether this could have had such a substantial effect will need further investigation.

Trouble for the arts?

Specialist arts institutions had the five largest negative differences in offer rates for POLAR 1 applicants in 2015, including UAL and Arts University Bournemouth, listed above. However, the numbers involved at most arts institutions are so small that it is not possible to say with certainty that this falls outside ‘statistical noise’ for each institution. Nonetheless, the cross-sector results for arts institutions should sound some alarm bells. Specialist arts universities may wish to look very carefully at whether they are making enough offers to the most disadvantaged applicants.

It’s the geography, stupid

From analysing the relative likelihood of attending different institutions, it is clear that POLAR 1 students are most likely to attend Million Plus and University Alliance institutions in large northern conurbations. If you’re from a POLAR 1 area, the institution you are most likely to attend is Sheffield Hallam, followed by Manchester Metropolitan, Leeds Beckett, Liverpool John Moores, Northumbria and Nottingham Trent, which are all near to a large proportion of POLAR 1 postcodes. Conversely, if you’re from a POLAR 5 (i.e. most advantaged) area, geography appears to be less of an influence. These students are most likely to be placed at the large Russell Group institutions: Nottingham, Leeds, Birmingham, Bristol, Manchester, Exeter and Southampton.

That said, and as noted below, the data doesn’t fully reflect disadvantage in London, because relative deprivation there is so different to the rest of the country and because rich and poor communities are less segregated. Therefore, the data probably understates the number of disadvantaged students admitted by those London institutions that take a high number of local applicants.

This is further underlined by the data on ethnicity, which shows vast diversity across the sector in the racial makeup of campuses. Black and ethnic minority students are far more likely to apply and be admitted to institutions in London and in other cities or towns with large BME populations (Luton, Leicester, Birmingham and Bradford). It is this half of institutions that explains why white students are “underrepresented” overall, according to UCAS. This seems to suggest that if ‘white working-class disadvantage’ were ever to be tackled, it might be through setting up new HE providers outside large cities, where populations are still predominantly white.

Again, all this tells us nothing about bias in admissions, but rather tells us about the preferences and qualifications of different groups of applicants. It can be further highlighted by analysing particular geographic areas.

Case study: The North West

Students from black, Asian and mixed backgrounds are more likely to attend the University of Manchester and Manchester Metropolitan University than their white counterparts (measured as the number of applicants placed per ten thousand of each group’s population). For example, 53 of every 10,000 white students join Manchester Met, compared with the equivalent of 102 Asian students and 59 black students. For Manchester, the comparators are 53, 88 and 60. The University of Bolton is broadly similar, as is the University of Salford for two of the groups – black and Asian students – though students of mixed race are marginally less likely to attend.
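The per-10,000 figures above can be turned into relative likelihoods with a one-line calculation; the sketch below uses the rates quoted for Manchester Metropolitan in the text.

```python
# 'Likelihood to attend': placed applicants per 10,000 of each group's population.

def placed_per_10k(placed, population):
    """Placed applicants per 10,000 people in the group."""
    return placed / population * 10_000

# Rates per 10,000 quoted above for Manchester Metropolitan.
rates = {"white": 53, "asian": 102, "black": 59}

# Relative likelihood of an Asian student attending, compared with a white student.
ratio = rates["asian"] / rates["white"]
print(f"Asian students are {ratio:.2f}x as likely to attend as white students")
```

Note that, as the article stresses later, a ratio like this reflects applicant preferences, qualifications and geography rather than anything about the fairness of the institution’s offer-making.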

On Merseyside, black and Asian students are much more likely to attend the University of Liverpool than either Liverpool John Moores or Liverpool Hope. Interestingly, white students are nearly eighteen times more likely to attend Hope than Asian students.

Elsewhere in the North West, the University of Central Lancashire is close to parity for white versus black and mixed race while Asian students are again more likely to attend. For Lancaster University, the University of Cumbria and Edge Hill University the ratios are much more uneven: much smaller proportions of the BME student population are attending these institutions compared with their white peers.

Case study: The North East

At the Universities of Newcastle, Teesside and Sunderland, white students are between 1.5 and 4 times more likely to attend than BME students. Durham University is in line with this, except for mixed race students, who are on a par with white students. Northumbria University is also within the regional norm, although black students are less likely to apply than elsewhere locally, with over six times more white students than black students for every 10,000 in the population.

Understanding the numbers

It would be naive to assume that today fully begins or ends the ‘transparency revolution’. Open data releases always have two major faults: information overload, and risk of misinterpretation, whether accidental or deliberate. Surpluses of information can muddy a picture just as much as a lack of information.

This is a massive release, with over 200,000 individual lines of data to be spun many different ways. It will take a long time for comprehensive conclusions to be made about what it really tells us. But what we can say today with some confidence is that these further conclusions will not tell a simple story about bias in university admissions.

However, there are two vital measures that need to be understood:

Likelihood to attend – This is measured by the number of applicants that are placed in an institution per 10,000 of any given population (e.g. 20 placed applicants for every 10,000 white people). It shows how likely a certain group is to attend an institution. This measure tells us very little about the fairness of a particular institution’s admissions system, but it does say a lot about different groups of students’ preferences and also about the institutions that different groups of students are likely to have the grades to be able to attend. It is also related to an institution’s size: all things being equal, all applicants would be most likely to attend the institution that accepts the most students (i.e. the University of Manchester).

Difference in offer rate – This is the number we’ve all been waiting for. It shows whether a particular group of applicants is more or less likely to get an offer than the ‘average’ when controlling for important factors such as predicted grades, the amount of offers made by an institution, and the types of courses available. As UCAS put it, “a difference simply means that the offer rate is higher or lower than it is for all applicants who are similar in terms of the subject applied for and a summary measure of their predicted grades”.
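To illustrate what a “significant difference” in offer rate means in practice, here is a deliberately simplified sketch. UCAS’s published method controls for subject mix and predicted grades; this version skips those controls and just tests whether a group’s raw offer rate differs significantly from that of all other applicants (a two-proportion z-test, with hypothetical numbers).

```python
# Illustrative only: a raw two-proportion z-test, without UCAS's controls
# for subject applied to and predicted grades. All numbers are hypothetical.
from math import sqrt

def offer_rate_z(group_offers, group_apps, other_offers, other_apps):
    """z-statistic for the gap between a group's offer rate and everyone else's."""
    p_group = group_offers / group_apps
    p_other = other_offers / other_apps
    p_pool = (group_offers + other_offers) / (group_apps + other_apps)
    se = sqrt(p_pool * (1 - p_pool) * (1 / group_apps + 1 / other_apps))
    return (p_group - p_other) / se

# Hypothetical institution: 60% offer rate for the group vs 70% for everyone else.
z = offer_rate_z(300, 500, 7_000, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 would be significant at the 5% level
```

Even this crude version shows why sample size matters so much: the same ten-point gap at a small specialist institution, with a few dozen applicants, would fall well inside statistical noise.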

How these factors are taken into account is the source of disagreement between UCAS and other researchers. There are different ways of controlling for them. UCAS claim that their way is the most “precise”, but their refusal to release the underlying data that would allow researchers to devise their own methods is likely to irk some of them.

Furthermore, the Equality Challenge Unit have made a persuasive argument that small differences that fall within the margin of error can lead to much larger differences in graduate and postgraduate opportunities.

When a significant difference in offer rates occurs, there will thus be enough evidence to at least question whether it is a result of bias against disadvantaged groups. UCAS and universities will argue that there are any number of other factors that might be taken into account. Whether this is a fair defence will likely vary a lot between institutions. Caution must be urged, but the sector is rapidly running out of ways to easily explain away wide differences in offer rates. To give institutions the benefit of the statistical doubt, we’ve chosen to analyse results over the past three admissions cycles to identify any relatively consistent trends.

One final methodological point must be made: Scottish and English methods of measuring socio-economic disadvantage differ substantially, and so are not directly comparable. The POLAR 3 area measure used in England has come in for some criticism in the past, particularly on its applicability in London, and so must be interpreted carefully. POLAR data is broken into five categories, with 1 representing the least advantaged quintile, and 5 representing the most advantaged quintile.

Our analysis so far has only focused on the POLAR 1 quintile and black applicants. There are interesting trends to be uncovered in this data relating to sex, Asian applicants, and POLAR quintiles 2-5. There is also insight to be gained from looking at institutions’ overall admissions approaches by analysing offer rates, June acceptances, and total places offered. Keep an eye on Wonkhe for further analysis of the data in the coming days and weeks.

23 responses to “Transparency revolution: is there bias in university admissions?”

A good start given the time available with the data. One thing I would pick up on is the lazy use of ‘suitably qualified’ – with this data it’s only possible to talk about academic (i.e. level 3) qualifications; there are other types of qualification for entry. Arts institutions (and courses) are a prime example, with auditions and portfolios of work often given greater importance than predicted grades. Some subjects have a greater predominance of interviews; Law, for example. Looking at offers, instead of invitations to auditions or interviews, may be misleading (and if not, may help to identify any source of bias). Before we start looking at data at the institutional level, we need to check and address subject biases.

I believe this sentence is incorrect:
“POLAR data is broken into five categories, with 1 representing the least disadvantaged quintile, and 5 representing the most disadvantaged quintile”

POLAR 3 quintile 1 represents the areas where students are least likely to progress to HE at 18 or 19 i.e. the most disadvantaged.
POLAR 3 quintile 5 represents the areas where students are most likely to progress to HE at 18 or 19 i.e. the least disadvantaged.

UCAS have provided the last 6 years of data, going back to 2010. The choice of ‘two of the last three years’ is partly due to only having time to get through three of the six years provided by UCAS, but also because a lot has changed in university access policy and practice over the past few years; in my view at least, it doesn’t make much sense to analyse in detail what was happening over three years ago. Because the data shows quite a lot of up-and-down movement, I’ve used ‘two of the last three years outside the margin of statistical noise’ as a proxy for a ‘likely trend’.

The ideal analysis, to me, would be to aggregate the past three admissions cycles and then look at the difference in offer rates. There are a lot more institutions that consistently show a negative difference in offer rates towards POLAR 1 and black students *but* just fall within the ‘statistical noise’ year-on-year. That margin of statistical noise would be smaller if analysed in one, larger, aggregate sample. Unfortunately, as far as I can tell, UCAS have not provided sufficient underlying data to do this, or if they have, it’s outside the range of my skill set.

Thanks for a really sensible piece. Hearing John Humphreys on Radio 4 beating up an admissions officer from Bristol on Saturday made me weirdly grumpy, since this issue is so much more complex than such an approach allows. Your attention to application patterns is important in this context. And the next question should be: why do we get these patterns? It seems to me that there are issues of cultural diversity (or the lack thereof) that need to be addressed at many universities, if they are to be places that will be genuinely attractive to applicants from ethnic minorities and lower socio-economic groups. More at: https://headofdepartmentblog.wordpress.com/2016/06/13/the-widening-participation-debate-the-sound-the-fury-and-the-missing-terms/

Has anyone looked at the correlation between POLAR 4/5, BTECs and First Generation? Students who are the first in their family to go to university are more likely to be persuaded to take a BTEC which then makes them ineligible to apply to higher tariff universities.

‘Oxford and Cambridge are two institutions that do not appear to show systematic or consistent bias against black or less privileged applicants’ but it must be harder for women to get in there else how do they retain their 50:50 gender balance when everywhere else it’s 60:40? (They say their subject mix but I don’t buy it!) Patrick Ainley author ‘Betraying a Generation’ Bristol: Policy Press.

Gender imbalance across the sector as a whole appears to be mostly caused by subject differences. Institutions that teach large numbers of nursing and education students have the highest gender imbalance. Oxbridge don’t, which is probably why they are closer to 50:50. That said, I haven’t gone through the data on offer-rates yet – hopefully will get a chance this week.

With such a large amount of data, a ‘fishing expedition’ is bound to find some statistically significant results. For example, if there are 100 universities in the sample and you apply a standard 5% significance test to see if any of them are biased, you would expect an average of 5 false positives. I’m no expert, but I believe there are more advanced statistical methods which can take this into account – can you elaborate on how you came to your conclusions?

” Similarly, neither {Oxbridge} institution appears to be guilty of bias or iniquity in giving offers to black students. While the number of black students at these two institutions is still extremely low, it would appear that to improve the situation, the focus would need to be on ensuring more black students apply in the first place.”

Your analysis assumes the institutions are not responsible for the low number of black student applications. Not many black people apply to join the Metropolitan police, either. It is certainly the case for 6th formers I know that there is a clear understanding of what Oxbridge is ‘really like’ for disadvantaged students (and even those who are not disadvantaged at all except in comparison with a public school elite). This comes not least from social media accounts of those who have tried at interview, and/or have actually got in. So they don’t apply. And this is despite the encouragement of schools and teachers to apply, not least because the number of Oxbridge students is a metric that state schools use in their publicity.

So, not so easily off the hook. And otherwise, you are saying David Lammy is wrong?