Our Ofsted experience

I’m reliably assured that mentioning Ofsted is bound to get a spike in visits to one’s blog page, so let’s see.

About a month ago, we were thrilled to receive that lunchtime phone call that meant the wait was finally over. As any school with a ‘Requires Improvement’ label (or worse) will know, although perhaps never quite ‘welcome’, there comes a point where the Ofsted call is desired, if only to end the waiting. We wanted to get rid of the label, and so this was our chance.

We’d been “due” for a few months, but knew that it could be as late as the summer, so in the end, the second week after Easter didn’t seem so bad (particularly as it left us with a long weekend in the aftermath).

So how did it go? Well, for those of you interested in grades, I am now the deputy headteacher of an officially GOOD school. It’s funny how that matters. Six weeks ago, I was just deputy of an unofficially good one.

But those of you still awaiting the call will be more interested in the process than the outcome, so let me start by saying that having spent the past 18 months building up my collection of “But Sean Harford says…” comments, I didn’t have to call upon it once. The team who visited us were exemplary in their execution of the process according to the new guidance and myth-busting in the handbook.

In the conversation on the day of the phone call, we covered practicalities, and provided some additional details to the lead inspector: timetables, a copy of our latest SEF (4 pages of brief notes – not War and Peace) and the like. And then we set about preparing. We had only just that week been collating teachers’ judgements of children’s current attainment into a new MIS, so it was a good opportunity for us to find out how it worked in practice!

We don’t keep reams of data, we don’t use “points of progress”, and we’ve gone to some length to avoid recreating levels. All for good reasons, but always aware that a ‘rogue’ team could find it hard to make snap judgements, and so make bad ones. The data we provided to the team was simple: the proportion of children in each year group whom teachers considered “on track” to meet, or exceed, end-of-Key-Stage expectations. We compared some key groups (gender, Pupil Premium, SEN) and that’s it. It could all fit on a piece of A4. So when it came to the inspection itself, there was a risk.

Day One

It may be a cliché to say it, but the inspection was definitely done with rather than to us. The first day included joint observations and feedback with the headteacher, as well as separate observations (we had a 3-person team). An inspector met with the SENCo, and the lead also met with English and Maths subject leaders (the former of which happens to be me!) and our EYFS leader.

The main question we were asked as subject leaders was entirely sensible and reasonable: what had we done to improve our subjects in the school? I think we both managed to answer the “why?” and “what impact?” in our responses, so further detail wasn’t sought there, but it was clear that impact was key.

Book Scrutiny

The afternoon of the first day was given over to book scrutiny. We provided books from across the ability range in the core subjects, as well as ‘theme’ books for each team. The scrutiny focused most closely on Years 2, 4 and 6, which fits both with the way we structure our classes and our curriculum and assessment approach. Alongside books, we provided print-outs for some children that showed our judgements on our internal tracking system. I’m not sure whether the focus was set out as clearly as this, but my perception of the scrutiny (with which both my headteacher and I were involved) was that the team were looking at:

Was the work of an appropriate standard for the age of the children? (including content, presentation, etc.)

Was there marking that was in line with the school’s policy? (one inspector described our marking – positively – as “no frills”, which I quite liked)

Was there evidence that children were making progress at an appropriate rate for their starting points?

They asked for the feedback policy in advance, and referred to it briefly, but the focus on marking was mainly on checking that it matched what we said we did, and that where it was used, it helped lead to progress. Some pages in books were unmarked. Some comments were brief. Not all had direct responses – but there was evidence that feedback was supporting progression.

Being involved in the process meant that we could provide context (‘Yes, this piece does look amazing but was quite heavily structured; here’s the independent follow-up’; ‘Yes, there is a heavy focus on number, but that’s how our curriculum is deliberately structured’, etc.). But it also meant a lot of awkward watching and wondering – particularly when one inspector was looking closely at the books from my class!

The meeting at the end of the first day was a reasoned wander through the framework to identify where judgements were heading and what additional information might be needed. We were aware of one lower-attaining cohort, which was identified, so offered some further evidence from their peers to support our judgements. There was more teaching to be seen to complete the evidence needed for that. And there was one important question about assessment.

Assessment without levels

I had expected it. Assessment is so much more difficult for inspectors to keep on top of in the new world, and so I fully expected to have to explain things in more detail than in the past. But I was also slightly fearful of how it might be received. I needn’t have been this time. The question was perfectly sensible: our key metric is about children being “on track”, so how do we ensure that those who are not on-track (and not even close) are also making good progress?

That’s a good question; indeed it might even have been remiss not to have asked it! We were happily able to provide examples of books for specific children, along with our assessments recorded in our tracker to show exactly what they were able to do now that they couldn’t do at the end of last academic year. It gave a good opportunity to show how we focus classroom assessment on what children can and can’t do and adapt our teaching accordingly; far more important than the big picture figures.

Day Two

On the second day I observed a teacher alongside the lead inspector, and was again pleased by the experience. Like all lessons, not everything went perfectly to plan, but when I reported my thoughts afterwards, we had a sensible discussion about the intentions of the lesson and what had been achieved, recognising that the deviation from the initial plan was good and proper in the circumstances. There was no sense of inspectors trying to catch anyone out.

Many of the other activities were as you’d expect: conversations with children and listening to readers (neither of which we were involved in, but I presume they acquitted themselves well); meeting with a group of governors (which I also wasn’t involved in, but they seem to have acquitted themselves well too); a conversation about SMSC and British Values (with a brief tour to look at examples of evidence around the school); watching assembly, etc.

Then, on the afternoon of day two we sat with the inspection team as they went through their deliberation about the final judgements. In some ways it’s both fascinating and torturous to be a witness in the process – but surely better than the alternative of not being!

As with any good outcome, we got the result we felt we were due (and deserved), and areas for feedback that aligned with what was already identified on our development plan for the forthcoming year. The feedback was constructive, formative, and didn’t attempt to solve problems that didn’t exist.


18 thoughts on “Our Ofsted experience”

Judging from your Twitter and this post, my school received THE call at the same time. Although our inspection process was slightly different, the outcome was the same and the general feeling was of support. As a middle leader of LKS2, I met with the lead inspector alongside other middle leaders. It seemed to us that they wanted to know that we knew the data, the areas for focus, and what we were doing to address issues.

We are also now officially a GOOD school and feel we got the result we deserved!

Michael – first, really pleased for you (as also a GOOD school) that you now have this label; accuracy of these labels aside, it does make a difference to the general stress levels!

I think what still worries me is the focus on Ma and En above pretty much everything else. You report that “the lead also met with English and Maths subject leaders” and “We provided books from across the ability range in the core subjects” (though you also mention theme books) – did this include Science (the Cinderella of the core)?

I do feel that the primary sector is being pushed into being a “provider of secondary-ready children wrt their Ma and En” rather than a developer of young children into well-rounded individuals. I am sure your school, like most schools, will be working hard at the latter, but I feel the system is pushing towards the former – would welcome your thoughts.

Hello Paul,
Yes, it’s fair to say that there was a clear emphasis on English and Maths, although as you point out, Theme books were also included in the scrutiny.
I hope we do work towards developing rounded young people, but I am also of the belief that the best route to a broad curriculum overall is a very secure base of English and Maths. I’ve written somewhere on here before that I’d be happy if there were no national curriculum for foundation subjects before Y4/5. I don’t think that means narrowing children’s experience, but rather securing their foundations.

Hmm … there is certainly evidence that a focus on En (esp. Reading and Oracy) and Ma aids later learning (e.g. Rose, 2008), but this report suggests that learning happened best when incorporated within a wider curriculum where these skills can be practised. The importance of other areas of development is highlighted in a number of places (e.g. Alexander, 2009).

When you say “no NC”, do you mean that you would not want other things to be taught, or just that there would be no expectation in terms of monitoring / inspection?

Thanks for this link. Yes, I also like this model – and have argued both for En to be more embedded (or situated in the curriculum, as bods like Alexander and Mercer also argue for) and for the curriculum to be wider until 18 – though this might also have knock-on effects for the HEI sector.

Now time for a HT break – let’s hope the sun continues to shine (as it is in the East Midlands at the moment).

Thank you for sharing in such detail. You seem to have had a very similar visit to ours. I couldn’t have listed the details as I was too fearful at the time.
I really appreciate the details about book scrutiny and tracking, as my head is not convinced about our approach. Also, other schools we work with have a more traditional approach, putting more pressure on us to comply with their thinking. We are still developing our use of Classroom Monitor for tracking, so it’s great to read about similar approaches elsewhere.
Many thanks
Suzanne

Well done Michael. We were inspected in March and have been recognised as a “Good” school after many years of not being one. It is a great feeling to be the deputy head of a “Good” school. I too was dreading the conversation about assessment. I tried to demonstrate progress from KS1 results to our system (with support from my LA), but I wish I hadn’t, and had instead stuck to my guns on showing who was at expected, etc. I was honest and upfront, and they said we had honesty and integrity. I’ll take that!