Benchmark your school using Assembly’s free tool for state secondaries

School leaders need benchmarking data to analyse their relative strengths and weaknesses, and prioritise areas for improvement accordingly. Governors use it to support and challenge leadership about the school’s performance. Parents see benchmarking data as a way of assessing whether a school is doing a good job for their children. The problem is that all these groups struggle to make sense of the mounds of data that are out there.

We believe that public education data can yield valuable insights, but that right now it’s difficult to use it effectively. So to help, the Assembly team have developed a Secondary School Benchmarking Tool, which distills the available data into the measures we think matter the most. This free tool is intended to help schools, parents and governors understand the relative strengths and weaknesses of their school.

Over the next few months we’re building this out into a full benchmarking app on the Assembly platform, covering all key stages. But in the meantime, we’ve decided to make the main key stage 4 dashboards available through our website, powered by Tableau Public.

Our three key principles for designing school dashboards

Data exploration should be interactive and intuitive, so we make it easy to drill down from the high-level summary into the more detailed picture.

Public data should be free, so we’ll never charge for benchmarking resources.

Contextualising the data should be straightforward, so we have included explanations and colour coding to make percentiles easier to interpret.

The future of school benchmarking: progress and percentiles

Moving to the content of the tool, there are three main points to note:

5+ A*-C including English & Maths is dead; long live Progress 8 and Attainment 8!
Ask an education expert to rate a secondary school and they’ll still often quote the school’s 5+ A*-C including English & Maths (AC5EM) percentage. They shouldn’t. The measure is flawed, and is thankfully being phased out. From 2016 onwards it’s all about the four new headline measures: Progress 8, Attainment 8, EBacc and A*-C in English and Maths. So we’re leading with these, plus Best 8 Value Added and Best 8 Average Points Score per Pupil, since they’re the predecessors of Progress 8 and Attainment 8 and are therefore decent proxies for schools that didn’t opt in to the new measures this year. Unlike the two new measures, they can also be tracked over time.

The implications of this can be huge: take Ark Putney Academy for example.

At 58%, its AC5EM is roughly equal to the national average of 57.1%. However, its Best 8 Value Added score of 1,037 is well above 1,000 (the score that would indicate students on average made expected progress), putting it in the 95th percentile (the top 5% of schools for which we have data nationally). Given the low attainment of its students at entry, we think the progress measure is more representative of the quality of the school than the attainment metric in this instance. So to gain a proper picture of any school, you need to see progress and attainment metrics side by side.

Benchmarking dashboard for Ark Putney
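To make the percentile idea concrete, here is a minimal sketch (not Assembly’s actual code, and with made-up scores) of how a school’s Best 8 Value Added score translates into a percentile rank, where 1,000 represents expected progress:

```python
# Illustrative sketch only: the scores below are invented example values,
# not real school data. On this scale, 1,000 means pupils on average made
# expected progress.

def percentile_rank(scores, value):
    """Percentage of schools scoring strictly below `value`."""
    below = sum(1 for s in scores if s < value)
    return 100.0 * below / len(scores)

example_scores = [962, 981, 994, 1000, 1003, 1008, 1012, 1019, 1024, 1037]
print(percentile_rank(example_scores, 1037))  # 90.0: 9 of the 10 example scores are lower
```

With the full national dataset behind it, the same calculation is what lets a score like 1,037 be read as “95th percentile” rather than just “above average”.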

We’ve separated progress and attainment measures to emphasise the difference between them.
One drawback of attainment measures is that they give no indication of the starting point of the students they aggregate. AC5EM goes one step further and applies a threshold (i.e. simply asks whether or not you crossed the ‘C’ grade boundary), and then complicates things further by requiring you to have achieved a specific blend of subjects to qualify. This is problematic: for students with level 3 or below at Key Stage 2, meeting this threshold involves exceeding expected progress in 5 subjects. For students getting level 5s at KS2, C grades would be disappointing results.
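The threshold-plus-subject-blend rule described above can be sketched in a few lines. This is a hypothetical illustration of the AC5EM rule, not official DfE code; the subject names and grades are invented:

```python
# Hypothetical sketch of the AC5EM threshold rule: a pupil counts only if
# they have at least five GCSEs at grades A*-C, and that set includes both
# English and Maths. Subject names and grades below are illustrative.

A_STAR_TO_C = {"A*", "A", "B", "C"}

def meets_ac5em(grades):
    """grades: dict mapping subject name -> GCSE grade letter."""
    passes = {subject for subject, grade in grades.items() if grade in A_STAR_TO_C}
    return len(passes) >= 5 and "English" in passes and "Maths" in passes

pupil = {"English": "C", "Maths": "B", "Science": "C",
         "History": "D", "French": "A", "Art": "C"}
print(meets_ac5em(pupil))  # True: five A*-C passes, including English and Maths
```

Note how binary the rule is: a pupil with straight Cs counts exactly the same as one with straight A*s, which is precisely why the measure says so little about progress.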

Attainment measures still have their value - absolute attainment is of course important, and the laudable aspiration of AC5EM is that all students should leave school with at least a basic level of qualifications. The problem with them as a means of measuring school performance is that they can judge schools very harshly or leniently depending on their student population. So while we’ve included attainment measures, we’ve given priority to progress metrics by putting them to the left of the page (where your eye is first drawn), and more generally splitting them out from attainment so the difference between the two types of measure is clearly demarcated.

Comparison to the average isn’t enough… so we use percentiles too.
It can be difficult to contextualise your school’s performance when the only benchmark provided in the raw data is a national or local authority average. Simply knowing you’re above or below average isn’t enough to identify the areas in which you are truly excelling or lagging behind. And a fixation with averages hardly promotes excellence. So while we include averages in many places, we prefer to contextualise data at a more granular level by using percentiles.

What you see on our website still only scratches the surface of public data. It’s just Key Stage 4, and we haven’t yet included destinations data, subject data or Ofsted judgements. We’re working on these features for the full app on our Assembly Platform, and if you’d like to be kept informed of our progress then sign up and we’ll add you to our mailing list.

We hope you find our tool insightful and easy to use. We’d love to get your feedback - so please spare a moment to fill out our feedback form, or email me at rachel@assembly.education to offer your thoughts. We’ll be hugely grateful for any input you’re willing to give. And do join our mailing list if you’d like to be notified of future developments and updates.

Finally, a couple of quick caveats on the data used in the tool:

You will find null values in places where a school’s result has been suppressed for that metric due to low pupil numbers. This in turn affects the percentile calculations, which cannot take account of suppressed data. The effect should be fairly minimal, but it may skew some figures in places. There may also be some variation between the percentile and the distance from the national average, since the national average is calculated at student level while the percentile is calculated at school level. It’s also worth noting that the percentiles for Progress 8 and Attainment 8 are calculated from the 327 schools that opted in, and may therefore be skewed by an ‘opt-in’ bias.
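The suppression caveat can be sketched as follows. This is an illustrative example, not the tool’s actual implementation, and the figures are invented: schools with suppressed (null) values simply drop out of the percentile calculation, so the ranking covers fewer schools than exist nationally.

```python
# Sketch of the suppression caveat: suppressed schools appear as None and
# are excluded before ranking, so percentiles are computed over a smaller
# pool of schools. All values below are invented.

def percentile_rank(scores, value):
    known = [s for s in scores if s is not None]  # suppressed schools drop out
    below = sum(1 for s in known if s < value)
    return 100.0 * below / len(known)

scores = [48.2, None, 55.0, 57.1, None, 61.3, 66.8]
print(percentile_rank(scores, 61.3))  # 60.0: 3 of the 5 non-null scores are lower
```

If the suppressed schools were disproportionately high or low performers, the computed percentile would shift accordingly, which is the small skew described above.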

The tool only considers state secondary schools - we’ve removed independent schools from the dataset entirely. That’s partly because including them would mean restricting the tool in ways that offer no benefit to the state-school majority (e.g. by using the all-school average, which is gap-filled due to the inevitable lack of independent-school data in areas such as pupil premium, and skewed by the independent-school preference for non-standard qualifications). It’s also because we don’t think independent schools would find the tool that useful anyway: they often enter students for alternative or non-compliant blends of qualifications, which results in meaninglessly poor performance against the headline measures.