“Madam, how like you this play?”
“The lady doth protest too much, methinks.” – Hamlet
A popular tactic in arguments is to attack the person bearing a displeasing message. This ad hominem response works by slapping emotive labels on opponents in an attempt to stop other people from thinking about what those opponents are actually saying. At times I have seen this tendency rise to feverish heights in politics and in education, and it currently seems to be growing in the Labour Party.
Even the academic world is fraught with ambition, rivalry and conceit. The same threat of vilification and ostracism can apply just as easily to researchers as it does to politicians, or to teachers who happen to espouse a view that is contrary to the progressive zeitgeist. For instance, in a speech to the Baltimore Curriculum Project in 2012, Shepard Barbash discussed the rejection of Direct Instruction by the educational establishment in the US, and related this anecdote. He had been corresponding with an academic researcher who had praised his book on Direct Instruction, 'Clear Teaching', but who gave this response when asked for a public endorsement:
I enjoy discussing these issues — that’s what I do. But there is no chance that I will end up writing a note to Amazon that endorses your book or Direct Instruction.
I worry that anything I write in praise of your book could be seen as an endorsement of DI, and that implication, however indirect, might diminish the impact of my research.1
I have certainly seen some questionable practices by academics at first and second hand. Those experiences, and engagement with academics less inclined to toe the party line than Barbash’s example, have developed in me a healthy element of skepticism when it comes to educational research.
I have critiqued two studies released by the Education Endowment Foundation (EEF) on this blog: Switch-On Reading and Accelerated Reader. Both report mediocre effect sizes as ‘positive effects’. In reality, the most favourable interpretation of the studies’ data was that the interventions led to three months’ gain over three months in the case of Switch-On Reading, and three months’ gain over four and a half months for Accelerated Reader.
Why is this inadequate? Because, for students who are reading significantly behind, such progress is ineffectual. Let’s take a student who is 36 months behind. At least 7% of secondary students fall into this category, so in a community with low socio-economic resources, it’s likely that in a school of 1000 pupils there would be at least 70 in this situation.2 And let us say that we ensure that they receive one of these interventions.
A student following Switch-On Reading for three months makes a gain of an additional three months. So now they are just 33 months behind. The deficit has been closed by one twelfth. At this rate, if they did Switch-On Reading every term, it would take another 11 terms for them to catch up: at three terms a year, nearly four more years. So this intervention would, if it had sufficient materials and teaching programmes, enable Year 7 students reading 36 months behind to catch up late in Year 10. Of course, it doesn’t have these resources – it is a ten-week programme. If it did have the wherewithal, it would cost not £672 per pupil, but 12 x £672, or £8,064. Per pupil. Even that would be something. But in fact, the intervention offers three months’ gains in ten weeks, and that’s it.

As for Accelerated Reader, students would gain about six additional months per school year, meaning it would take six years to catch up. A ten- or twenty-week intervention that moves struggling readers just three months further on is the educational equivalent of giving a course of aspirin to a TB patient.
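The catch-up arithmetic above can be checked in a few lines of Python. This is only a back-of-the-envelope sketch using the figures quoted in this post; the assumption of three roughly three-month terms per school year is mine, not the researchers’.

```python
# Back-of-the-envelope check of the catch-up arithmetic for Switch-On Reading.
# Assumption: a school year contains three terms of roughly three months each.

deficit_months = 36    # the student starts 36 months behind
gain_per_term = 3      # three additional months' progress per three-month term
cost_per_pupil = 672   # reported cost of one run of the programme, in pounds

terms_needed = deficit_months // gain_per_term   # terms to close the whole gap
years_needed = terms_needed / 3                  # at three terms per year
total_cost = terms_needed * cost_per_pupil       # cost if repeated every term

print(terms_needed, years_needed, total_cost)    # 12 4.0 8064
```

Twelve terms at £672 a time is £8,064 per pupil over four years, which matches the figures in the paragraph above.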
Yet according to the EEF studies, both programmes have ‘positive benefits’ with medium security for the reliability of their findings. The fact that these findings are absolutely no use in the real world, for students with real learning needs, seems to have escaped the researchers.
Recently the lead researcher of the Switch-On study, Professor Stephen Gorard, tweeted his objections to this post (which was about Reading Recovery research). There were three:
- That I did not make sufficient distinctions between the Switch-On study’s use of effect size and Hattie’s.
- That I had an agenda.
- That I was anonymous.
I had expected to be chastised on some technical grounds at some point: few issues are ever settled in the world of educational statistics. Professor Gorard is correct that Hattie uses different types of effect size calculation in his meta-analysis. How much this matters is another issue. Regardless, the argument still stands even if we remove all reference to Hattie and the ‘hinge point’. By the Switch-On researchers’ own reckoning, students gained three months – so the effect of the intervention was insufficient to enable a student to catch up if they were reading just four months behind, let alone four years. Would schools invest if they understood this?
This yawning gap between educational practice focused on students’ needs on the one hand, and academia focused more on the security of a research design than the utility of findings on the other, is a major concern. Not only does it call into question the government funding provided to the EEF, but it ultimately undermines confidence in the value of educational research.
In this context, Professor Gorard’s complaints about anonymity are perhaps best seen as an attempt to dismiss the entire discussion rather than venture into awkward territory: instead of engaging with the argument, he pursued the issue of identity in order to prove an unspecified ‘agenda’.
It was as if he had read my recent post on reasons to be anonymous on the internet, and was at pains to prove me right. When John Walker innocently tweeted a comment, even he was met with accusations.
Either an argument stands up or it doesn’t. Whether the author is a millionaire, a teacher or an academic, a supporter of free speech, rail nationalisation or fruitarianism, the argument and the evidence are separate issues from their identity. Of course the writer and their writing are related: but if the only way you can oppose an argument is to decry the author’s motivations, then you have lost the argument.
Crying ‘vested interests’ can also be a double-edged sword. Does receiving income from your educational work mean that you are precluded from expressing views on education? If so, no one employed as a teacher has a right to express a view on education. Indeed, no MP would have a right to express views on politics. Such a position is clearly ludicrous, but this does not prevent its frequent appearance in the modified form: ‘if your income comes from private sources rather than from the state, you have a vested interest.’ Piffle. The money smells the same regardless of how it was (legitimately) earned. There is no moral superiority attached to being funded by the taxpayer. In this sense, civil servants, union members and boards of directors all have vested interests. So, for that matter, do academics who rely on government-funded bodies for research grants.
Professor Gorard, for example, led the teams on the Switch-On Reading study, the Accelerated Reader study, and the P4C study (which has been criticised by others for its statistical weakness and poor interpretation). Defending that funding stream could be construed as a vested interest. Does this mean Professor Gorard has no right to comment? Of course it doesn’t. His views should stand or fall on the evidence and the logic. And it is reasonable that he should accord the same right to others, focusing on the debate rather than innuendo.
‘Vested interest’ is rarely an argument. It is a principle used to decide who should be part of a decision-making process, for example when allocating funds. It can’t be used to prove that something is or is not true.
Back to the nub of the issue:
Apparently looking for two months’ additional progress per month of intervention is ‘absurd’. When researchers base their expectations of what is ‘normal’ on statistical experience rather than on students’ needs, and write up their findings in this light, they do students a grave disservice. No amount of crying ‘vested interests’ makes a three-month gain ‘effective’ for struggling readers. Schools need to look beyond the bland headlines when researching interventions. The Education Endowment Foundation needs to think carefully about whether its current approach is providing the service to schools for which it is funded.
1. Barbash, S. (2012). Keynote address to the Baltimore Curriculum Project, Baltimore, USA.
2. Source: Department for Education, National Key Stage 2 Results, 2012–2015. Based on the percentage of students at or below Level 2 in English at the end of Year 6.