Concentrate on what really matters – high quality teaching and learning – writes Colin Logan, SSAT Senior Education Lead, Accountability…
When the provisional school key stage 4 results were released for checking last month, schools had the opportunity to compare projected Progress 8 scores with the real thing (or, at least, the real thing before re-marks and appeals). Many schools reported that their score (which shows by what proportion of a grade, in each subject, their pupils’ progress was above or below average) had gone down; a seemingly smaller number said that it had gone up.
In reality, no such revisions had occurred. No school had a Progress 8 score to start with, so there was nothing to revise. What they had been working from were calculations made by educational software companies, and sometimes by school data managers, based on last year’s figures. They were produced with all good intentions – to give school leaders an idea of what to expect – but there was no chance that they could be sufficiently accurate.
SSAT, along with the likes of FFT and even the DfE, has been advising schools for some time not to try to second-guess a Progress 8 score in advance: it is ultimately a futile exercise.
No predetermined benchmark
One major advantage of Progress 8 over the previous expected progress model is that it is calculated from scratch each year once all the results are in. There is no predetermined benchmark, unlike, for example, the notion of making a set number of ‘levels of progress’ (a concept which, incidentally, was fatally flawed from the start, but that’s another story).
Once the results are in, DfE statisticians pore over the data for every year 11 pupil in the country, look at each pupil’s detailed starting point at KS2, and then calculate the average progress made nationally from each starting point. These averages become the estimates against which the performance of pupils in each school is compared and, subject by subject, their differences from the average are worked out. The individual pupil scores are then totalled for each school to give an overall below-average, average or above-average figure.
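The logic of that calculation can be sketched in a few lines of code. This is a deliberately simplified illustration of the idea, not the DfE’s actual method: the real calculation uses fine-grained prior-attainment groups, weighted subject slots and confidence intervals, and the starting-point bands and point scores below are invented for the example.

```python
# Illustrative sketch of the Progress 8 idea (not the DfE's exact method):
# 1. compute the national average points score for each KS2 starting point;
# 2. score each pupil as the difference between their points and the average
#    for their starting point;
# 3. average those differences across a school's pupils.
from collections import defaultdict

def progress_score(national, school_pupils):
    """national: (ks2_band, points) pairs for all pupils nationally.
    school_pupils: (ks2_band, points) pairs for one school's pupils."""
    totals = defaultdict(lambda: [0.0, 0])          # band -> [sum, count]
    for band, pts in national:
        totals[band][0] += pts
        totals[band][1] += 1
    estimates = {band: s / n for band, (s, n) in totals.items()}
    diffs = [pts - estimates[band] for band, pts in school_pupils]
    return sum(diffs) / len(diffs)

# Hypothetical data: bands stand in for KS2 starting points.
national = [(4.5, 44), (4.5, 48), (5.0, 52), (5.0, 56), (4.0, 38), (4.0, 42)]
school = [(4.5, 50), (5.0, 52), (4.0, 44)]
print(progress_score(national, school))  # 2.0: above average overall
```

The key point the sketch makes concrete is the dependency in step 1: the estimates are rebuilt from the whole national cohort each year, so a school’s figure cannot be known until everyone else’s results are in.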
One major advantage of Progress 8 over the previous model: it is calculated from scratch each year once all the results are in
Each school’s Progress 8 figure, therefore, depends not only on their own pupils’ performance but also on that of all pupils nationally. Using last year’s calculations to work out this year’s figures is a bit like using last year’s weather reports (for somewhere with a particularly volatile climate) to prepare a forecast for this year – except worse.
In 2015, most schools hadn’t developed their curriculum to be more Attainment/Progress 8-friendly. Those that had were in the early days of teaching their revised curriculum and had yet to develop their expertise. In 2016, there will have been further changes to curriculum and entry patterns nationally, together with growing expertise from early-adopters.
That will have significantly altered the average progress from each starting point across a wide range of subjects. And it will continue into 2017, further complicated by the first examination of the new GCSEs in English and maths with number grades, not to mention the revised point score allocations for unamended qualifications, which predominantly affect lower attainers.
So if it was never going to be possible to predict Progress 8 in advance this year, it most certainly will not be possible next year.
What’s the solution? The simple answer is, don’t bother trying. Much heat but very little light is being generated in some schools by leaders trying to set targets that will result in an average or positive Progress 8 score.
Set suitable targets in each subject
That time would be far better spent making sure that pupils are set appropriate, challenging targets in their individual subjects (and, most importantly, provided with the essential guidance and feedback on what they need to do to achieve them). Senior and middle leaders can then focus on making sure that the quality of teaching and learning in each of those subjects is the best it can be. That’s the best way to improve progress and thereby to come out with a positive Progress 8 score – looking after the pennies allows the pounds to look after themselves.
But where can schools find appropriate targets if the national picture is so volatile? FFT Aspire is one source. It uses the latest pupil-level data, in a similar way to the DfE, to produce for every pupil – in just about every subject – a range of estimates pitched at average performance, the top 20% and the top 5%, to help with target-setting. It offers two models: one includes pupil and school context; the other takes account only of gender and month of birth, allowing schools to use whichever is the more challenging in their circumstances.
They’re estimates, not targets in themselves, but they are based on the most recent performance of real pupils in real schools. These, too, will vary year on year more than they have in the past as a result of the changing national picture already discussed. But they avoid the uncertainties and guesswork encountered when attempts are made to aggregate individual pupil targets into the elements and slots of the Attainment and Progress 8 silos.
Now that the new accountability measures and GCSEs are with us, we need to let go of some of the practices that have become almost second nature: expected progress, targets set in stone that focus on school benchmarks, and rigid flightpaths from years 7 to 11 are among them.
That will allow schools to concentrate their energies on what really matters – high quality teaching and learning.