The danger of the ‘expert’

Back in 1998, Andrew Wakefield caused a stir when he produced a long-since discredited article suggesting a link between the MMR vaccine and autism. What resulted included a long-running debate, the retraction of the article by the publishing journal, and the striking off of the surgeon behind it.

But the damage was done. Striking off the doctor behind the report has not removed the problem from history. The reality is that when an ‘expert’ speaks, the average listener does not question their credentials or carry out their own research: we rely on those offered up as experts to do this work for us and to guide the rest of us.

Now, my point is far less serious than that posed by the MMR controversy, and the mention of it merely illustrative. But there is an issue with ‘experts’ in our profession offering solutions which may actually cause more harm than good.

This week I was attending a conference in London hosted by The Key, at which I was speaking about my approach to assessment without levels. There were other schools represented, also sharing their own models, some of which I thought brilliant, others of which were – to my mind – awful. If nothing else, such events serve to ground me and make clear that I am no oracle.

However, at the same event, David Driscoll – an education consultant who also works as an additional inspector and as an “associate education expert” for The Key – was asked to speak on the topic of “Inspection of Assessment”. He was listed in the brochure as an expert, and the intention was clearly to provide leaders with guidance on managing new assessment systems in an Ofsted-friendly way.

Now, the fact that such reassurance is needed suggests that Ofsted’s messages about not having a pre-determined view on what assessment systems should look like are not yet trusted by the profession; the presentation given by this particular lead inspector demonstrates exactly why that is the case!

To his credit, Mr Driscoll did at least once state the official Ofsted view. However, he then proceeded to explain to delegates that data ought to be presented in standard forms, and that schools would be best advised to keep levels and simply re-write the descriptors.

I was astounded.

He went on to explain that schools needed to choose a starting and end measure and define a fixed measure of expected progress, saying that “you need a number”. Now, perhaps this was evidence of his own limited concept of how assessment and progress work, but it certainly isn’t a message that fits with the direction of travel in education at the moment. Or at least, it oughtn’t to be. But, of course, the problem is that he is “the expert”. And every headteacher there will have had at the back of their mind the realisation that he could be the lead inspector on their next Ofsted visit.

This was different to the usual doubts I might have about another school’s approach to assessment without levels. This was not some practising school leader musing on his current thinking (in fact, it appears from his website that Mr Driscoll hasn’t taught for over 25 years). This was someone presented as an expert, offering guidance on how data ought to be managed and presented for the purposes of Ofsted. It was advice that was likely to take priority over much of the other content of the day (including excellent presentations from people such as Katharine Bailey of the CEM at Durham).

It’s true, the problem is not a patch on the risks of poor advice about vaccinations or such things. But the root of the problem is the same: ‘experts’ with a poor message can present more danger than no message at all.

I live in hope that the Assessment Commission set up before the election soon helps to bring some guidance to the profession that quashes the nonsense spouted by ‘experts’ such as this, and ensures that Ofsted is supported to keep its inspectors in line!

For teachers unsure of how best to move forward with assessment, I cannot recommend strongly enough the article by Dylan Wiliam in Teach Primary magazine from last autumn:



There is a significant difference between an expert in the sense of expertise based on empirical data, and an expert in the sense of someone with a largely political opinion in a context where they have a lot of experience. An example of the former would be me saying I’m an expert in digital technologies and that 8 bits all set to high are normally interpreted as the decimal number 255. An example of the latter would be me saying I’m an expert in digital technologies and that all schools should be adopting open source technologies. While the latter might be an informed opinion, it is not undisputed, whereas in the first example no experts would dispute it.

What is worse is when an expert like Roy Meadows, who was a medical doctor, was assumed to be an expert in mathematics, which he was not, and consequently made a convincing case out of bogus statistical “facts”. Unfortunately, no mathematics experts were on hand to shoot his arguments down. What really matters here is being able to distinguish between stuff backed by irrefutable scientific data and stuff that is politics backed by relevant experience. It’s a big problem when people in senior positions, like judges, are relatively scientifically and mathematically illiterate.

In fact, the public reception of expert advice is rather fickle. The MMR stir was caused by a single ‘expert’ with very flawed evidence, yet it had widespread and, in some cases, devastating repercussions. A large majority of the world’s climate experts, however, puts forth the idea, backed by considerable data, that we are affecting our climate with potentially catastrophic results, and you’d hardly think anyone had said anything!

With regard to Ofsted, it seems worryingly widespread. I was at some pains to insist that we stop using levels, since they were no longer relevant to the new curriculum and there were sound reasons why they had been abandoned, but now I hear from several schools that have recently been inspected that the inspectors have said such things as, ‘Oh, thank goodness you’re still using levels!’ I’m beginning to wish we were too, and to hell with the differences in the curriculum. As it is, we have the worst of all worlds, attempting to convert teaching objectives (as opposed to assessment criteria) into numbers (a la Focus Ed), losing all meaning and reliability in the process.
