KS2 Writing: Moderated & Unmoderated Results

After the chaos of last year’s writing assessment arrangements, there have been many questions hanging over the results, one of which has been the difference between the results of schools which had their judgements moderated, and those which did not.

When the question was first raised, I was doubtful that it would show much difference. Indeed, back in July when questioned about it, I said as much:

@jevans_jill I'm not persuaded it will be that significant, because actually moderation had such vague guidelines

At the time, I was of the view that LAs each trained teachers in their own authorities about how to apply the interim frameworks, and so most teachers within an LA would be working to the same expectations. As a result, while variations between LAs were to be expected (and clearly emerged), the variation within each authority should be less.

At a national level, it seems that the difference is relatively small. Having submitted Freedom of Information requests to 151 Local Authorities in England, I now have responses from all but one of them. Among those results, the differences are around 3-4 percentage points.

Now, these results are not negligible, but it is worth bearing in mind that Local Authorities deliberately select schools for moderation based on their knowledge of them, so it may be reasonable to presume that a larger number of lower-attaining schools might form part of the moderated group.

The detail that has surprised me is the variation between authorities in the consistency of their results. Some Local Authority areas have substantial differences between the moderated and unmoderated schools. As Helen Ward has reported in her TES article this week, the large majority of authorities have results which were lower in moderated schools. Indeed, in 11 authorities, the difference is 10 or more percentage points for pupils working at the Expected Standard. By contrast, in a small number, it seems that moderated schools have ended up with higher results than their unmoderated neighbours.
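For readers who want to replicate this sort of comparison with their own LA's figures, the calculation is straightforward: for each authority, take the difference between the unmoderated and moderated percentages at the Expected Standard. The sketch below uses made-up authority names and figures purely for illustration; the real FOI dataset is in the linked table.

```python
# Illustrative sketch: per-LA percentage-point gaps between moderated and
# unmoderated schools. Authority names and figures are invented examples,
# NOT the actual FOI data.

# Each tuple: (LA name,
#              % at Expected Standard in moderated schools,
#              % at Expected Standard in unmoderated schools)
la_results = [
    ("Authority A", 62.0, 75.5),   # large gap: moderated well below
    ("Authority B", 71.0, 74.0),   # modest gap, near the national 3-4pp
    ("Authority C", 78.0, 76.5),   # moderated schools ended up higher
]

# Gap in percentage points: unmoderated minus moderated
gaps = {la: unmod - mod for la, mod, unmod in la_results}

# Authorities with a gap of 10 or more points
# (11 such authorities in the real data)
big_gaps = [la for la, g in gaps.items() if g >= 10]

# Authorities where moderated schools had the higher results
moderated_higher = [la for la, g in gaps.items() if g < 0]

print(big_gaps)          # ['Authority A']
print(moderated_higher)  # ['Authority C']
```

The same two-line comparison, run over the full table of 150 responding authorities, is all that lies behind the figures discussed above.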

What can we learn from this? Probably not a great deal that we didn’t already know. It’s hard to blame the Local Authorities: they can’t be responsible for the judgements made in schools they haven’t visited, and nor is it their fault that we were all left with such an unclear and unhelpful assessment system. All this data highlights is the chaos we all suffered – and may well suffer again in 2017.

To see how your Local Authority results compare, view the full table* of data here. It shows the proportions of pupils across the LA who were judged as working at the Expected and Greater Depth Standards in both moderated and unmoderated schools.

*Liverpool local authority claimed a right not to release their data on the grounds of commercial sensitivity, which I am appealing. I fully expect this to be released in due course and for it to be added here.

Reblogged this on Teacher Voice and commented:
Have to share this – expert analysis from Michael Tidd.
Hopefully change will start to happen, or at least moderators, LAs and schools across the country will endeavour to make some more sense of this chaos.

I don’t know why South Gloucestershire have not given you their 2016 Y6 writing data. It was released to the South Gloucestershire NAHT branch in the autumn of 2016 at our AGM, and then shared with the NAHT South West Executive by myself, as the South Gloucestershire representative, on the 4th November 2016, so it is in the public domain.

Thanks for sharing that data – that’s very interesting to see, and mirrors many similar authorities.
I wonder if the reason South Gloucestershire have denied my request is because of a misunderstanding. I have requested the list of schools who were moderated, which will presumably lead to the same overall data as has already been shared, but does open up an additional level of scrutiny. Interesting to see if they realise and overturn their decision.

I have been looking at the relationship between the LA moderated average and the LA English Grammar, Punctuation and Spelling average. The LAs with the biggest discrepancy are often those where the LA moderated average is above the LA unmoderated average. It is not a direct relationship on all occasions, but there does seem to be some relationship visible. Do you think there is anything interesting we can learn from that for 2017, or is it just more of the confused picture we are dealing with in the 2016 KS2 data?

What also needs to be checked, however, to test the impact (and value?) of moderation, is what the assessments were in the moderated schools before moderation took place. We also need to know how many LAs moderated substantially more than the required minimum.

Hi. I’d like to comment on your thought that, “At the time, I was of the view that LAs each trained teachers in their own authorities about how to apply the interim frameworks, and so most teachers within an LA would be working to the same expectations.”

I’d be interested to know if this view was based on what happens in the LAs you have experience of. In the LA where I work, the idea that the LA ‘trains its own teachers’ – in the sense that it has either the capacity or the reach to deliver training to a critical mass of the county’s professionals – is…well, somewhat different from reality. It’s true that the LA offered local briefings on the new curriculum, annually offers training on assessment and moderation, and provides informative handbooks on moderation requirements, but this is not the same as securing that most teachers work to the same expectations. I wish it were so, though I’m not convinced that what this implies – that LAs might have a stranglehold on what schools do and think – is necessarily a good idea.