Using data properly is vital in developing and implementing school policies, as well as national ones

Cause and effect? Or just an association? It matters. Anyone in schools concerned with continuing professional development and learning (CPDL) needs to be sure which interventions are really working. Some help is available…

A big issue in evidence-based teaching and learning is understanding the difference between association and causation. The DfE, many respected publications, many teachers and school leaders, and even some researchers are weak on this. Dylan Wiliam refers to this logical fallacy with the Latin phrase ‘post hoc ergo propter hoc’ (after this, therefore because of this).

The post hoc fallacy, the reasoning that ‘since event Y followed event X, event Y must have been caused by event X’, is an easy mistake to make. It stems from drawing a conclusion based solely on the order of events, rather than taking into account other factors that might explain, or rule out, the connection.

A recent Stephen Tierney post on CPD and teaching quality gives a memorable example: “The way I explain this when talking is: there were no maternity leaves for the first two and a half years of my headship. However over time there were more and more and by the time I left headship we’d regularly have six to eight per annum. There is a positive correlation between the length of time I was in headship and the number of staff becoming pregnant; however, there was no causation. I was responsible for none of them.”
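His point is easy to reproduce. As a purely illustrative sketch (the figures below are invented, not Tierney’s), two quantities that both happen to drift upwards over time will correlate strongly even though neither causes the other:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two invented series that both drift upwards over a ten-year headship.
years_in_post = np.arange(1, 11)                            # 1..10 years of headship
leaves = 0.8 * years_in_post + rng.normal(0, 1, size=10)    # maternity leaves per year

# Pearson correlation between tenure and maternity leaves
r = np.corrcoef(years_in_post, leaves)[0, 1]
print(f"correlation: r = {r:.2f}")   # strongly positive -- yet no causation
```

Time is the lurking variable here: it drives both series, manufacturing a correlation where there is no causal link.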

Making the point that causation is very difficult to prove, he adds: “If we are going to invest tens of thousands of pounds per annum in reducing teachers’ teaching time, to enable them to undertake more professional development, instead of spending it on something else, it has to have impact. This is not about vanity projects. It’s about improving teaching, enhancing learning and making a positive difference to children’s life chances.”

Why it’s important
As Montrose42 blogged recently, “ministers endlessly quote the same one piece of research which they manage to fundamentally misunderstand and misuse about the importance of employer engagement” [as they do with schools funding, teacher workload…]. “To base policy on such a narrow and selective evidence base will lead to poor policy design and ultimately a hopeless strategy. Nobody suggests that it’s only about careers advice from professionals. This has to be combined with (quality assured) employer engagement, of course, careers education, ie equipping young people with the tools to make informed choices, and high quality work experience.”

Clarity about what the data does not show

As well as giving some guidance on the effectiveness of the Mindfulness in Schools Project (MiSP) programme, a 2013 study gives an idea of how vital it is to separate association from causation: you have to be clear about what your research shows, and what it does not show. This non-randomised controlled study, conducted by researchers from the universities of Exeter, Oxford, Cambridge and Western Australia, along with two co-founders of MiSP, was published in the British Journal of Psychiatry.

The paper gives the usual data about the intervention and control groups, and finds that, after adjusting for baseline imbalances, children who participated in the intervention reported fewer depressive symptoms, lower stress and greater wellbeing at follow-up. The degree to which students in the intervention group practised the mindfulness skills was associated with better wellbeing and less stress at three-month follow-up. Probability values are given for all of these findings.
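For readers wondering what “adjusting for baseline imbalances” looks like in practice, one common approach is an ANCOVA-style regression: the follow-up score is modelled on group membership while controlling for each pupil’s baseline score. This is a minimal sketch with invented numbers, not the study’s actual data or analysis:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented scores for eight pupils -- NOT the study's data or exact model.
df = pd.DataFrame({
    "group":    [1, 0, 1, 1, 0, 0, 1, 0],   # 1 = mindfulness, 0 = control
    "baseline": [60, 55, 64, 66, 54, 59, 61, 60],
    "followup": [68, 56, 70, 73, 53, 60, 69, 61],
})

# ANCOVA-style adjustment: outcome regressed on group,
# controlling for each pupil's baseline score.
fit = smf.ols("followup ~ group + baseline", data=df).fit()
print(fit.params["group"])    # adjusted difference between groups
print(fit.pvalues["group"])   # and its p-value
```

Statistical adjustment like this can only correct for imbalances that were measured, which is exactly the limitation the authors go on to acknowledge.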

Key point: the limitations

But most interesting in this context is the paper’s admission of the study’s limitations. These included: “we were not able to randomly assign schools or students and therefore some inevitable baseline imbalances were observed. Although we adjusted for these statistically, there may have been imbalance on other key prognostic factors. Our study used a small set of self-report measures. Future studies should broaden beyond self-report outcome measures to look at schools and/or classroom-based measures, observer measures, biobehavioural measures of stress reactivity and/or resilience and mental health outcomes.”

It concludes: “interventions that demonstrate acceptability, efficacy, cost-effectiveness and potential for implementation are most likely to be sustainable. This feasibility study… provides preliminary evidence of acceptability and efficacy” (our italics).

“To base policy on a narrow and selective evidence base will lead to poor policy design and ultimately a hopeless strategy”

Applying it in your school

This all sounds a bit daunting when such research, vital as it is, represents additional responsibilities for teachers and school leaders. But help is at hand, for example through the Education Endowment Foundation’s well-known Teaching & Learning Toolkit, which summarises some educational research on effective ways to improve student attainment. The 34 teaching approaches and interventions are each summarised in terms of their average impact on attainment, the strength of the evidence supporting them and their cost.

Of course, these do not cover all the interventions schools might find helpful. Many may also need support in designing and applying their own evaluations. EEF’s DIY evaluation guide, which is perhaps less well known, is designed for schools to use in evaluating their own projects. And a 2016 blog by Robin Hall, NCTL’s R&D manager, covers the design of randomised controlled trials in a school context.
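To make the contrast with the post hoc fallacy concrete: the essence of a randomised controlled trial is that chance, not circumstance, decides who receives the intervention, so unmeasured differences between groups tend to average out. A minimal sketch, using hypothetical pupil numbers and placeholder outcome scores rather than any real trial, might look like this:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical class of 60 pupils: randomise half to the intervention.
pupils = np.arange(60)
rng.shuffle(pupils)
intervention_ids, control_ids = pupils[:30], pupils[30:]

# ...deliver the intervention, then collect outcome scores...
scores_int = rng.normal(65, 10, size=30)   # placeholder outcome data
scores_ctl = rng.normal(60, 10, size=30)

# Welch's t-test: does the difference in means exceed chance?
t, p = stats.ttest_ind(scores_int, scores_ctl, equal_var=False)
print(f"mean difference = {scores_int.mean() - scores_ctl.mean():.1f}, p = {p:.3f}")
```

Because assignment was random, a convincing difference here supports a causal reading in a way the maternity-leave correlation never could.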

However you approach it, you need to be absolutely clear about the effects of an intervention, distinguishing them from changes in behaviour and results that have other causes, to avoid the post hoc fallacy. After all, you probably don’t want to be thought responsible for an increase in pregnancies among your colleagues.




