Over in today’s Guardian column I discuss the surprising findings of the PISA 2012 survey. Though governments commonly report on the results of skills tests, the released data actually includes pupil answers to lots of survey questions. This allows cross-country comparison of all kinds of things – e.g. what possessions students have, what their parents do, how often they read, etc.

I first came across this additional information in a class I took over the past few months. In the class we each selected a country (I chose England), were given the complete 2009 PISA dataset (downloadable here), and were asked to perform a multi-level analysis showing how certain variables affect an outcome. E.g. we might look at how being female influences your math score. ('Badly' is the answer.)

At the time I was hearing lots of comments in the blogosphere about the lack of memorisation in teaching, and how lessons were now 'relatable' to students' lives but not very rigorous. So, I thought, let's see what impact that has on students' results.

Only, when I started comparing students' scores for memorisation with their scores for the supposedly 'vapid happy talk' activities, such as information being related to their lives, it turned out students reported much more of the former.

Hence, I created two variables: ProgValue and TradValue. ProgValue took the scores from items such as "We express our opinions" and "Our teacher relates activities to our everyday lives", whereas TradValue used items such as "We memorize everything" and "Our teacher checks we are concentrating". These are not purely 'progressive' and 'traditionalist' in the sense of curriculum (i.e. teaching traditional knowledge vs. skills) but they do reflect the teaching approaches commonly referred to by these groups in the blogs that I read and in various government speeches.
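For readers curious how composites like this are built: the sketch below shows one plausible construction, summing a handful of Likert-type items per scale. The item names, the number of items, and the 0–4 response coding are all my assumptions for illustration (chosen so the maximum is 20, as above), not the actual PISA labels or coding.

```python
# Hypothetical sketch of a composite-score construction.
# Item names and 0-4 coding ("never" to "in every lesson") are assumptions,
# not the real PISA variable names; five items per scale gives a 0-20 range.

PROG_ITEMS = ["express_opinions", "relate_to_lives", "group_discussion",
              "own_investigations", "apply_outside_school"]
TRAD_ITEMS = ["memorise_everything", "teacher_checks_concentration",
              "copy_notes", "drill_exercises", "recall_facts"]

def composite(responses, items):
    """Sum a pupil's 0-4 responses over the items in one scale (max 20)."""
    return sum(responses[item] for item in items)

# One made-up pupil's questionnaire responses:
pupil = {
    "express_opinions": 2, "relate_to_lives": 1, "group_discussion": 2,
    "own_investigations": 3, "apply_outside_school": 2,
    "memorise_everything": 3, "teacher_checks_concentration": 4,
    "copy_notes": 3, "drill_exercises": 2, "recall_facts": 3,
}

prog_value = composite(pupil, PROG_ITEMS)  # 10
trad_value = composite(pupil, TRAD_ITEMS)  # 15
```

This invented pupil, like the national means reported below, reports more traditional than progressive teaching.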

Out of a possible score of 20 (the higher the score, the more the student had that type of teaching), the mean for ProgValue was 11.29, whereas for TradValue it was 13.94. This was puzzling. Overall students were reporting more traditionalist teaching than progressivist.

Using MPlus, I separated out the effects of individual students' socioeconomic status, whether or not they spoke English, pupils' ratings of the quality of their relationship with their teachers, and the type of school (state or private).
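The MPlus model itself isn't reproduced here, and the real analysis was multi-level (pupils nested within schools). But the basic idea of "separating out" effects can be illustrated with a flat regression on made-up data: putting all the covariates into one model means each coefficient reflects that variable's association with the outcome while the others are held constant. Everything below (variable names, effect sizes, sample size) is simulated for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Made-up pupil-level data -- stand-ins, not the actual PISA variables:
escs = rng.normal(size=n)             # socioeconomic status
private = rng.integers(0, 2, size=n)  # 1 = private school
teacher_rel = rng.normal(size=n)      # relationship-with-teacher score

# Simulate a ProgValue-like outcome where wealth and teacher relationships
# matter, but school type, once wealth is held constant, does not:
prog = (11 + 0.8 * escs + 0.6 * teacher_rel + 0.0 * private
        + rng.normal(scale=2, size=n))

# Regress the outcome on all covariates at once; each coefficient is the
# association of that variable holding the others constant.
X = np.column_stack([np.ones(n), escs, private, teacher_rel])
coefs, *_ = np.linalg.lstsq(X, prog, rcond=None)
intercept, b_escs, b_private, b_teacher = coefs
```

In this simulation the estimated coefficient on `private` lands near zero even though private-school pupils in the sample may be wealthier, because wealth is in the model too. That is the same logic by which the multi-level analysis below can say a difference survives (or doesn't) after controls.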

Here is what I found:

State school pupils report a lower progressive score than private school pupils; however, the difference is not statistically significant (so it might be down to random variation).

State school pupils report a higher traditional score than private school pupils, and this is a statistically significant difference.

Wealthier pupils report more progressive teaching, regardless of what type of school they attend (though this is not what explains the difference between state and private schools, as it was controlled for in the previous test).

Pupils who say they have good relationships with their teachers also score more highly on the progressivist measure, even when accounting for school type and wealth (i.e. they report doing more of the 'express opinions' type activities).

These are not causal findings. It is not possible to say that teachers who use more progressivist activities have better relationships with their pupils because of those activities. Also, this particular run of statistics doesn't get into the impact on results: it could be that while results are not affected by progressivist teaching in private schools, it is nevertheless problematic in state schools. [There are things in the 2013 dataset, which I talk about in the Guardian piece, that suggest this is unlikely, but without running the model I cannot say so for certain.]

BUT! What can be inferred is that the picture of state schools systematically teaching in a more 'progressive' way than a 'traditionalist' private sector is highly unlikely to be accurate.