Category Archives: Curriculum

As I look to timetabling in the new school year, I have been reflecting on the work Tom Sherrington did a few years ago about secondary timetables. Unfortunately, the primary curriculum timetable is not so easy to analyse, given that very few schools stick to a simple programme of x lessons of equal length per day, and few teach every lesson every week – or even every fortnight, as would be common in secondary.

Because of this, it’s much harder to get a sense of how much time schools are giving over to each subject, particularly given the changes of recent years and those on the horizon. So, I set out to try to find out as much as I could, through another of my Google surveys.

It’s impossible to present all of that information tidily, since every school’s situation is unique, but here I’ve tried to draw out some key things.

Weekly subject hours

Different schools take different approaches. Three different schools might each offer 36 hours of Art a year: one with a weekly one-hour lesson, another with two hours every other week, and a third with two-hour lessons every week but only every other half-term. Yet another might rely mainly on termly Art days to reach its quota. So we’re not comparing like with like here, but the table below attempts to show the average number of hours taught for each subject if evened out over 36 weeks of term (allowing a couple of weeks for being off-timetable) – all rounded to the nearest 5 minutes.
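The evening-out calculation is simple enough to sketch. The snippet below is purely illustrative (the function name and figures are mine, not the survey’s): it converts a subject’s annual hours into an average weekly figure over a 36-week year, rounded to the nearest 5 minutes.

```python
# Illustrative sketch only: evening out annual subject hours over a
# 36-week teaching year, rounded to the nearest 5 minutes.

def avg_weekly_minutes(hours_per_year: float, weeks: int = 36) -> int:
    """Average weekly minutes for a subject, rounded to the nearest 5."""
    minutes_per_week = hours_per_year * 60 / weeks
    return 5 * round(minutes_per_week / 5)

# All three hypothetical Art timetables even out identically:
print(avg_weekly_minutes(36))  # 36 hours/year -> 60 minutes/week
# The 2002 QCA suggestion of 55 minutes/week equates to 33 hours/year:
print(avg_weekly_minutes(33))  # -> 55 minutes/week
```

However a school distributes the lessons, the same annual total produces the same evened-out weekly figure, which is what makes the comparison possible at all.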

I don’t imagine anyone being massively surprised by any of those figures, but it certainly gives an indication of the narrowing of the primary curriculum. When the QCA last recommended teaching hours in 2002, it suggested an average of 55 minutes a week for the majority of foundation subjects. We’re now struggling to get above 30 for Geography!

Perhaps more surprising is the fact that although more time is given to the tested subjects in Year 6, the decline in ‘breadth’ is not huge. It seems that the curriculum is fairly limited across the whole of primary. The greatest breadth in curriculum, at least in timetable terms, appears to be in Year 3.

Regularity

Some years ago there was a clear government target for primary pupils to have at least 2 hours of timetabled PE each week. The target seems to have achieved something, as PE is the only foundation subject which has ended up with significantly more than its previously recommended amount (1 hour 15 minutes in 2002). It’s also one of the few subjects with regular weekly slots: 98% of responses said they taught PE every week, and nearly 90% reported more than 90 minutes of PE each week.

The only other subject that comes close to such regular weekly slots is Science, with around two-thirds of respondents saying they taught it every week.

At the other end of the scale, Design & Technology is very rarely taught on a weekly schedule. This is perhaps not surprising given the amount of resource required for the subject. Nearly a third of schools appear to use standalone days each term or half-term for the subject instead:

Exceptional Cases

I collected data only in categories rather than exact figures, so for those schools who said they had 7½ or more hours each week of English or Maths, I could only count 7½ hours. In the end, more than half of responses (52%) gave an answer of 7½ hours a week or more for English. If anything, then, the figures above are under-estimates of the time given over to English.

Only about 10% of schools gave a similarly high answer for Maths, but this is still quite a significant number. Those figures rise to 57% and 16% respectively for pupils in Year 6.

At the other end of the scale, approximately 5% of responses said that they gave over no time to PSHE. The subject is not yet statutory, so presumably that figure will fall over the coming year or two. Around 4% of responses said they taught no Computing at all; I wonder if that’s more a confidence issue than a planned decision. Who knows?

As ever with such things, it is important to point out that this data is not a scientific sample, has not been verified, and could be completely meaningless. However, in the absence of any comparative data from the DfE, it is an attempt to give some vague indication of the national picture from the schools that took part in the MTC pilot.

At the time of writing, some 211 sets of data had been submitted to the open spreadsheet online. Because it’s an open spreadsheet, there’s no guarantee that it doesn’t have errors, or that some data hasn’t been damaged, or even completely made up. With that in mind, I have completed some very simple calculations based on the data to give some idea of indicative figures.

Overall Averages

The mean average of all pupils’ results was 18.4.

The mean average of all schools’ averages was also 18.4.

The following table shows the approximate cut-off points when comparing schools’ averages, to place schools into bands.
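The “very simple calculations” described above can be sketched as follows. The scores here are invented placeholders, not the submitted data, and the layout of the open spreadsheet is my assumption.

```python
# Illustrative sketch only: the scores below are invented placeholders.
from statistics import mean

# Each inner list holds one school's pupils' MTC scores (out of 25).
schools = [
    [25, 22, 18, 14],
    [20, 19, 25, 17, 16],
    [15, 12, 21],
]

all_pupils = [score for school in schools for score in school]

pupil_mean = mean(all_pupils)                 # mean of all pupils' results
school_mean = mean(mean(s) for s in schools)  # mean of schools' averages

# Proportion of pupils scoring full marks (25/25)
pct_full_marks = 100 * sum(score == 25 for score in all_pupils) / len(all_pupils)

print(round(pupil_mean, 1), round(school_mean, 1), round(pct_full_marks, 1))
```

Note that the two means only coincide (as they did in the real data, both 18.4) when cohort sizes are reasonably balanced; with very uneven school sizes, the pupil-level mean and the mean of school averages can diverge.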

Perfect Scores

There was talk at one point of full marks being the expected threshold. It’s no longer clear that this is the case, or even that there will be a pass mark of any sort at all, but within the sample:

Overall proportion scoring 25/25: 17.4%

Bands for proportion scoring 25/25:

Pupil Scores

More pupils scored full marks than any other single score, with scores clearly skewed towards the top end of the scale.

School Averages

The majority of schools had an average score of between 16 and 20.

Does any of this mean anything? Not really… it’s a tiny sample from a voluntary pilot of a new test with no clear expectations, hastily compiled from questionable data. But some of it is at least slightly interesting.

Just a quick blog, inspired by this much more detailed and challenging one by Solomon Kingsnorth:

NEW BLOG: Please RT! The latest installment of my 'Small is Beautiful' series where I re-write the primary curriculum. This time on how the bloated curriculum is harming reading outcomes for children. Less fluff, more fluency. https://t.co/SzWIIQoDh1

I think he has a point about the importance of vocabulary, and it’s something we can easily underestimate. It’s also something we can worry that we’ll never be able to resolve, because there’s no way of knowing what vocabulary will come up in any given text or test.

So I took a look at this year’s KS2 Reading test paper and tried to identify some of the vocabulary required to answer each question. It’s not every word in the texts, but nor is it just the 10 marks theoretically set aside for vocabulary. In fact, I think there were 80 or more examples of vocabulary which might not have been met by pupils who don’t read regularly:

Q1

approximately, survive

Q2

disguise

Q3

razor-like, powerful

Q4

majority

Q5

develops, newborn

Q6

hibernate

Q7

captivity, territory

Q8

puzzling

Q9

vital, essential

Q10

extinction, survive, supplies, diminishing, poaching, territory

Q11

adopt, reserve

Q12

challenge

Q13

Q14

Q15

fascinating

Q16

protective, enfold

Q17

punished

Q18

mountainous, praised, lavishly

Q19

wounded, lame, circumstance

Q20

seized

Q21

Q22

vividly recall

Q23

frail, hobbled

Q24

hobbled, hesitate, peered

Q25

Q26

lit up

Q27

amusing, shocking, puzzling, comforting

Q28

arrives, injured

Q29

verses

Q30

suggests, bothered, basins, smelt

Q31

lifeless, ancestors

Q32

guardian

Q33

devices (left to my own devices)

Q34

recesses

Q35

dawned (dawned on me)

Q36

assorted, debris, network, grime

Q37

determination, thorough

Q38

impression, evidence, frightening, intensity, cautiously

Q39

justice, efforts

Q40

inspect, fashioned, ought

The only questions that are counted as vocabulary marks are the 10 written in italics. And all those ones in bold? They’re listed as inference questions in the mark schemes. The challenge of inference is often about interpreting complex language as much as it is about guessing what the writer intended.

Perhaps more importantly, very few of those words are technically specific to the texts they appeared in. Even in the case of the non-fiction text about pandas, much of the apparently technical vocabulary is applicable to plenty of other contexts that children meet in the course of the curriculum.

The link here to ‘tier two’ vocabulary is clear: there is plenty of vocabulary here that would come up in a number of different contexts, both through fiction and non-fiction reading.

Which rather makes me think that Solomon is on to something important: a significant part of teaching reading is about getting children reading, and reading to them.

I’ve long been clear that I think that the current system of assessing writing at KS2 (and at KS1 for that matter) is so flawed as to be completely useless. The guidance on independence is so vague and open to interpretation and abuse, the framework so strictly applied (at least in theory), and moderation so ineffective at identifying any poor practice, that frankly you could make up your results by playing lottery numbers and nobody would be any the wiser.

One clear sign of its flaws last year was the fact that, despite Writing having for years been the lowest-scoring area of attainment, and despite the new, very stringent criteria which almost all teachers seem to dislike, somehow we ended up with more children achieving the expected standard in Writing than in any other subject area.

My fear now is that we will see that odd situation continue, as teachers get wise to the flaws in the framework and exploit them. I’m not arguing that teachers are cheating (although I’m sure some are), but rather that the system is so hopelessly constructed that the best a teacher can do for their pupils is to teach to the framework and ensure that every opportunity is provided for children to show the few skills required to reach the standard. There is no merit now in focusing on high quality writing; only in meeting the criteria. Results will rise, with no corresponding increase in the quality of writing needed.

For that reason, I suspect that we will likely see a substantial increase in the number of schools having more pupils reaching the expected standard. At Greater Depth level I suspect the picture will be more varied, as different LAs give contradictory messages about how easy it should be to achieve, and different moderators appear to apply different expectations.

In an effort to get a sense of the direction of travel, I asked teachers – via social media – to share their writing data for last year, and their intended judgements for this year. Now, perhaps unsurprisingly, more teachers from schools with lower attainment last year have shared their data, so along with all the usual caveats of what a small sample this is, it’s worth noting that it’s certainly not representative. But it might be indicative.

Over 250 responses were given, of which just over 10 had to be ignored (because it seems that some teachers can’t grasp percentages, or can’t read questions!). Of the 240 responses used, the average figure for 2016 was 71% achieving EXS and 11% achieving GDS. Both of these figures are lower than last year’s national figures (74% / 15%) – which themselves seemed quite high, considering that just 5 years before, a similar percentage had managed to reach the old (apparently easier) Level 4 standard. Consequently, we might reasonably expect a greater increase in these schools’ results in 2017 – as the lower-attaining schools strive to get closer to last year’s averages.

Nevertheless, it does appear that the rise could be quite substantial. Across the group as a whole, the percentage of pupils achieving the expected standard rose by 4 percentage points (to just above last year’s national average), with the percentage achieving greater depth rising by a very similar amount (again, to just above last year’s national average).

We might expect some regression towards the mean, and that certainly seems evident. Among those schools which fell short of the 74% figure last year, the median increase in the percentage achieving the expected standard was 8 percentage points; by contrast, among those which exceeded 74% last year, the median change was a fall of 1 percentage point.
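The below/above-74% comparison is easy to reproduce. This sketch uses invented figures (chosen only to echo the medians quoted above); the pairing of 2016 and 2017 percentages per school is my assumption about how the responses were structured.

```python
# Illustrative sketch only: the response pairs below are invented, not survey data.
from statistics import median

# (2016 EXS %, 2017 intended EXS %) for each responding school
responses = [(60, 70), (68, 74), (72, 80), (80, 79), (85, 84), (78, 78)]

NATIONAL_2016 = 74  # last year's national EXS figure

below = [y17 - y16 for y16, y17 in responses if y16 < NATIONAL_2016]
above = [y17 - y16 for y16, y17 in responses if y16 > NATIONAL_2016]

print(median(below))  # median change for schools below 74% in 2016
print(median(above))  # median change for schools above 74% in 2016
```

Splitting by prior attainment like this is what exposes the regression-to-the-mean pattern that a single overall average would hide.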

Now again, let me emphasise the caveats. This isn’t a representative sample at all – just a self-selecting group. And maybe if you’re in a school which did poorly last year and has pulled out all the stops this year, you’d be more likely to have responded, so it’s perfectly possible that this overestimates the national increase.

But equally, it’s possible that we’ll see an increase in teacher assessment scores which outstrips the increases in tested subjects – even though it’s already starting from a higher (some might say inflated) base.

I’m making a stab in the dark and predicting that we might see the proportion of children – nationally – reaching the Expected Standard in Writing reach 79% this year. Which is surely bonkers?

When Jon Brunskill recently agreed to share his work on Knowledge Organisers in primary school, I was excited to see what he came up with. I wasn’t disappointed, and I’m sure many others have been looking with interest. I think there’s a lot of merit in the model, but inevitably I think there is some refining to do.

I say this not as an expert – far from it, I’ve cobbled together one Knowledge Organiser in my life and remain unhappy with it. However, having spoken briefly to Jon about his, I think we both agree that there is merit in unpicking the model further.

Firstly, with Jon’s permission, let me share an image of the organiser he shared (I highly recommend reading the accompanying blog before continuing further with mine!)

At first glance, it looks like a lot of content to learn. I think that’s partly because most of us have spent a good many years teaching broad ideas, and not expecting children to learn detail off by heart. I think there are also very few of us who could hand-on-heart say we know all this content to recall. But I think that represents the shift we need to make rather than something to fear.

That led me to question the purpose behind the Knowledge Organiser. I haven’t spent enough time thinking about them, and certainly not enough time using them, but when I have, I’ve usually considered it a vehicle for outlining the key information that I expect students to learn and retain for the longer term. Often over longer units of work these might include key ideas which are integral to later understanding, whether that’s later in the school year, or later in their education career.

By way of illustration of my thinking, let me share a knowledge organiser I constructed a couple of years ago for my Year 5/6 class:

My first attempt at a Knowledge Organiser in 2015

The differences are quickly obvious. For a start, mine is clearly based on a wider period of teaching, and perhaps more indicative of a basic revision guide, rather than providing content in advance of a unit. I think perhaps that’s also its biggest downfall. It’s worth noting that it’s something I tried and didn’t come back to.

But I think there is maybe a useful middle ground. In Jon’s case, much of the content set out – particularly on the timeline – is content that is useful for the purposes of writing an information text about the event itself (a task which Jon plans to do in his Y2 class). However, I don’t think he expects those students to secure that detail in the very long term. Arguably, this brings the organiser perhaps closer to the cramming model of revision than the more successful spaced practice approach.

Ruth Smith posted a comment on Jon’s blog saying she could imagine the organiser being used as a prompt during writing. While I can see the merits, I do think that the risk then – as Jon would rightly say – is that we replace the value of knowledge with the reliance on someone/something else to do the work for you. That’s not the aim here.

It leaves me wondering what the function of a Knowledge Organiser should be. I’m not persuaded of the value of knowing the date of leaving quarantine after the lunar landing. That said, the value of learning the word ‘quarantine’ is something I think is highly valuable.

The question for me becomes one of later testing (and let me be honest, I’m only at the very beginning of this journey; don’t for a second presume that I’m an expert. I’m a way behind Jon on this!) In a knowledge rich curriculum, I think one of the key functions of a Knowledge Organiser is to set out the key knowledge that I want students to retain and that I will test for.

We know of the great merit of spaced testing to aid learning, and it strikes me that a Knowledge Organiser should aim to set out that content which would likely later form part of such tests. In the context of Jon’s organiser, I could see merit in testing much of the vocabulary, the date of the landing, and perhaps the names of the crew. However, I’d also want to include some wider context – perhaps a bit more detail behind the Space Race, mention of JFK’s 1960 aim, etc. Might these replace some of the less significant dates of 1969?

Of course, we’re talking about 7-year-olds in Jon’s context. They will lack much of the wider historical knowledge to place events in context, and so there is a risk of expecting too much. But equally, if we train children that knowledge is to be learned, then ought we not be training them to learn it for the long term?

The content I think* I’d like to see on Knowledge Organisers is the detail that I would also expect to use in a brief pop quiz a week later, but also on a test mid-year drawing on prior units, and again at the end of the academic year, or in the first days of the following September. There is a risk that using Knowledge Organisers to aim for short-term recall of detail that is later lost will develop a cramming ethos, rather than one of long-term storage of information.

What does this mean for Jon’s example? I’m not sure. Maybe a separation of the content that he expects children to retain in the long term from information which would be useful in this context? There is certainly some merit in having this timeline clear in the child’s mind as they are writing – not least because it helps to build a narrative, which is a great learning technique – but is it necessary for it to be stored in long-term memory? Indeed, is a two-week unit even long enough for such a transfer to be made?

Yet there is unquestionably information here which would be re-used in future that would allow such a long-term retention.

More thinking to do… but well worth doing, I think.

*I say “I think” because I am not entirely sure that I won’t think completely differently in six months’ time.

If you haven’t already, I again recommend reading Jon’s original post here.

This has become something of a recurring refrain over my teaching career, and it always – always – frustrates me.

Nobody ever says it about Science: “Oh, you’re not still teaching solids, liquids and gases, are you?” Or music: “Oh, you’re not still teaching standard notation, are you?” And yet for some reason it seems to abound in other areas – especially English. (Even maths seemed to go through a phase where the standard basics were frowned upon!) But such decisions are often distinctly personal.

The first time I read Holes by Louis Sachar, I couldn’t wait to get planning for it, and was desperate to start teaching it. Now, having taught it too many times for my own liking, I’m tired of it. I suspect that this will be my last year of tackling it because I’ve lost my love for it. But for my class this year, it was their first time of approaching it. It was fresh for them. The only reason to abandon it is that my waning love for it risks coming through in the teaching.

But that won’t stop somebody somewhere from saying “Oh, but you’re not still teaching Holes, are you?”

It happens too often.

Tonight I’ve seen the same said of both The Highwayman and the animation The Piano. Now for sure they’ve both had more than their fair share of glory, but there was a reason why they were chosen in the first place. I’m all in favour of people moving away from them, finding better alternatives, mixing things up a bit. But they don’t cease to be excellent texts just because they’ve been done before. Every Year 5 child who comes to them does so for the first time.

I’ve heard the same said before of The Lighthouse Keeper’s Lunch at KS1 – as though somehow the fact that a topic has worked brilliantly in the past should be ignored simply because a consultant is over-familiar with it.

Of course, there are reasons to ditch texts. Sometimes they become outdated. Sometimes they cease to match the curriculum. Sometimes the ability of the children demands more stretch. Sometimes something much better comes along. Sometimes you’re just sick of them.

I’ve never cared for Street Child even though it’s wildly popular. I’ve always found Morpurgo’s work irritating. But if others find them thrilling, and get great results with their classes, then so be it. Who am I to prevent them teaching them?

As somebody also responded on Twitter this evening: the best “hook” is the teacher. If a teacher feels passionately about a poem, a book, or a topic, then it can be a great vehicle for the teaching that surrounds it. And if we make them all ditch those popular classics merely because they’re popular, then you’d better have a damned good replacement lined up to offer them!

For some time now I have been working on a model of teaching Writing built around the idea of longer blocks focusing on fewer things. Previously I have written about a model I used in my previous school, and since then have had many requests for more information.

This year I have finally produced some notes about the model I use, based on 4 Writing Purposes. My view is that rather than trying to teach children 10 or more different ‘genres’ or ‘text types’, as we used to do in the days of the Writing Test, it is better to focus on what those types have in common. It means that at my school we use 4 main types of writing across KS1 and KS2: Writing to entertain; to inform; to persuade; and to discuss.*

The 4 main writing purposes, and some of the ‘text types’ that could fall under each.

Importantly, by the end of KS2 I’d hope to see children recognise things like the fact that newspaper articles could actually fall under any or all of the 4 headings: they’re not a distinct type in themselves, really.

As a very rough rule, I’d expect around half of curriculum time to be taken up by “Writing to entertain”, with the remaining non-fiction elements sharing the remaining time. Notably in KS1 the non-fiction focus is only on Writing to inform.

Example guidance note

To support structuring the curriculum in this way, I have now compiled some guidance notes for each category. I say compiled, rather than written, because much of the legwork on these notes was done by my wife – @TemplarWilson – as she rolls out a similar model in her own school.

The guidance notes attempt to offer some indications of National Curriculum content that might be covered in each section. This includes some elements of whole-text ideas, suggestions for sentences and grammar, notes on punctuation to include, and also some examples of conjunctions and adverbials.

They’re not exhaustive, nothing radical, but as ever, if they’re of use to people, then I’m happy to share: 4 Writing Purposes – guidance (click to download)

Alongside the guidance sheets, I also have the large versions of the 4 main roadsign images, and an example text for each of the four purposes. The example texts are probably of more use at the upper end of KS2, and could almost certainly be improved, but they are a starting point for teaching and analysis by the children to draw out key features, etc. Both can be downloaded here:

As I plough through marking the 49 questions of the KS2 sample Grammar test, I find flicking back and forth in the booklet a nuisance, so I’ve condensed the markscheme into a single-page document.

You’ll still want the markscheme to hand for those fiddly queries, but it means a quicker race through for the majority of easy-to-mark questions. For each question, where there are tickboxes I’ve just indicated which number box should be ticked; where words should be circled/underlined I’ve noted the relevant words. For grid questions, I’ve copied a miniature grid into the markscheme.

As so many schools have evidently used the sample tests to help ascertain their pupils’ progress towards the expected standard (whatever that might be), I’m sure many will welcome the opportunity to analyse the outcomes.

Emily Hobson (@miss_hobson), of Oasis Academies, has kindly agreed to share the template she put together for analysing the KS2 tests.

The spreadsheet can be downloaded below, and then data entered to scrutinise your pupils’ progress in the main areas, and for each question.

After months of moaning about the delays to the delivery of exemplification for Writing Teacher Assessment, now that it has arrived I’m still not happy.

But then… it is a bloody mess!

The exemplification published today demonstrates what many of us feared about the new interim teacher assessment framework: expectations have rocketed. I appreciate (probably more than most) that direct comparisons are not ideal, but certainly having been told that the new expected standard would be broadly in line with an old Level 4b, I know I feel cheated.

The discussions in this household about the “expected standard” exemplification were not about whether or not the work was in line with a 4b, but whether or not it would have achieved a Level 5. That represents, of course, an additional 2 years of learning under the old system. We’re expecting 11-year-olds to write like 13-year-olds.

In fact, the only time 4b came into the conversation was in our browse through the new “Working towards” exemplification. It seems that a child who met the expected standard in 2015 would now be lucky even to reach ‘working towards’.

What this will mean for national data this year, who knows? If schools are honest, and moderation robust, could we see a new “expected standard” proportion somewhere in the mid-30% range, like we used to with Level 5s?

Among all this, though, is another confusing element. While the old exemplification materials for levels told us that “All writing is independent and is in first draft form” (my emphasis), it seems that message is now not so clear. Informal feedback from the meetings held at STA on Thursday and Friday last week brought up some surprises about what constitutes independent writing, including the scope for using dictionaries, following success criteria, and even responding to teacher feedback.

So now we have what looks like horrendously difficult expectations for a majority of pupils who have had barely two years of a new National Curriculum instead of six, and a lack of clarity, once again, about what is actually expected.

Is it really too much to ask?

For those who haven’t yet had the pleasure, the KS1 and KS2 Writing exemplification documents are available here: