Eight Signs of Snake Oil

“ . . . it is a massy wheel,
Fix’d on the summit of the highest mount,
To whose huge spokes ten thousand lesser things
Are mortised and adjoin’d . . .” –
Hamlet

There are literally thousands of enterprises and organisations which service and support the education sector, and without them the sector would struggle to function. Their support ranges from supplying food for school dinners, stationery, printing and office supplies to providing cleaning and security services. Some provide a better service than others.

Then there are the publishers of textbooks and resources, constantly evolving their publishing lists to follow the changes in curriculum. Again, some publishers do this better than others.

Then there are those providing professional services, in the form of advice, training and intervention programmes. (To be clear, I work in this field myself.) While it is tempting to write off all such endeavours as rip-offs, the truth is, some make a valuable contribution. Unfortunately, there are also the snake oil sellers, those whose main focus is on maximising profit rather than delivering a quality service.

To navigate the complex ecosystem that services schools successfully, it is important to be able to recognise the danger signs of predatory, rather than productive, partners. Here are eight things I have noticed these providers do to sell into schools.

1. Present their product or service as ensuring a quick win

School managers are often persuaded by this strategy, particularly if they are short of evidence that they have met their KPIs, or are being monitored by Ofsted. The problem with ‘quick wins’ is the threat of homeostasis – the tendency of a system to return to its former state. Real changes tend to come incrementally from long-term initiatives.

2. Describe short-term, temporary gains as if they are substantial

Nothing hurts students more than low expectations and, unfortunately, many schools have them. For example, we frequently hear gains of two years in reading touted as ‘substantial’. They’re not. For a student who is five years behind, a two-year gain is woefully insufficient.

3. Make the programme sound cheap to deliver, and disguise hidden costs

For example, it may only cost £9 per student for a licence – but if you then have to restock your library, and the librarian has to spend August putting colour-coded stickers on every one of the new books, who pays for that? Clue: not your new ‘provider’.

4. Present a superficial programme as the solution to a deep problem

For example, a reading ‘promotion’ scheme might be presented as a reading intervention, but if it doesn’t actually include a teaching component, it can’t help the weakest readers – which is, surely, the point of a reading intervention.

5. Manipulate impact data through selection

Inferior programmes often use a simplistic selection system which allows students who don’t need the intervention to be included in the group. These students’ apparent ‘progress’ can then inflate the averaged outcomes. Quality interventions will use more than one pre-test to check that the students really do need the intervention.

6. Manipulate data to disguise who didn’t benefit

One way to do this is to cite only averages. For example, the students who make the biggest gains could be those who weren't as far behind and didn't need the intervention in the first place. Always look for the breakdown that shows the progress of the lowest-achieving students.
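The arithmetic here is easy to check for yourself. As a minimal sketch, using invented reading-age gain figures purely for illustration, here is how a respectable-looking mean can coexist with half the group making essentially no progress:

```python
# Hypothetical reading-age gains (in years) for a ten-student
# intervention group. These figures are invented for illustration.
gains = [2.5, 2.4, 2.2, 2.0, 1.9, 0.3, 0.2, 0.1, 0.0, -0.1]

# The headline figure a glossy brochure might quote:
average = sum(gains) / len(gains)
print(f"Average gain: {average:.2f} years")

# The breakdown the brochure omits: the five lowest gains,
# i.e. the students the intervention was presumably for.
bottom_half = sorted(gains)[: len(gains) // 2]
print(f"Average gain, lowest five: {sum(bottom_half) / len(bottom_half):.2f} years")
```

In this made-up example the overall average comes out above a year, while the five weakest results average barely a month. The same mean is consistent with wildly different stories underneath, which is why the per-student breakdown matters.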

7. Avoid any mention of follow-up gains

There is no point to an intervention if students go backwards later. Always check what the follow-up procedures are, and whether there is a system in place to help students if they do slip back.

8. Cite ‘research’ vaguely

What is meant by ‘research’ can range from scientific experiments to ruminations in the bathtub. Today, everyone claims that what they do is ‘research-based’. Always check the extent of a) the research conducted on the impact of the programme, and b) how well this matches the peer-reviewed body of research evidence in the field.

All this implies, of course, that as teachers, we have to be proficient evaluators of research ourselves. Caveat emptor!
