Progress 8: don’t try to guess it

Concentrate on what really matters – high quality teaching and learning – writes Colin Logan, SSAT Senior Education Lead, Accountability…

When the provisional school key stage 4 results were released for checking last month, they gave schools the opportunity to compare projected Progress 8 scores with the real thing (or, at least, the real thing before re-marks and appeals). Many schools reported that their score (which shows by what fraction of a grade per subject their pupils’ progress was above or below the national average) had gone down; a seemingly smaller number said that it had gone up.

In reality, no such revisions had occurred. No school had a Progress 8 score to start with, so there was nothing to revise. What they had been working from were calculations made by educational software companies, and sometimes by school data managers, based on last year’s figures. They were produced with all good intentions – to give school leaders an idea of what to expect – but there was no chance that they could be sufficiently accurate.

SSAT, along with the likes of FFT and even the DfE, has been advising schools for some time now not to try to second-guess a Progress 8 score in advance, as it is ultimately a futile exercise.

No predetermined benchmark

One major advantage of Progress 8 over the previous expected progress model is that it is calculated from scratch each year once all the results are in. There is no predetermined benchmark, unlike, for example, the notion of making a set number of ‘levels of progress’ (a concept which, incidentally, was fatally flawed from the start, but that’s another story).

Once the results are in, DfE statisticians pore over the results of every year 11 pupil in the country, look at each pupil’s detailed KS2 starting point and then calculate the average progress made nationally from each starting point. These averages become the estimates against which the performance of pupils in each school is compared and, subject by subject, the differences from average are worked out. The individual pupil scores are then averaged across each school’s cohort to give an overall below-average, average or above-average figure for the school.
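
To make the mechanics concrete, here is a deliberately simplified sketch in Python of that calculation. It is illustrative only: the pupil data are invented, pupils are grouped by a single prior-attainment band rather than the DfE’s fine-graded KS2 scores, and the subject-by-subject slots and weightings are left out.

```python
from collections import defaultdict

# Illustrative records: (school, KS2 prior-attainment band, Attainment 8 points).
# The real calculation uses fine-graded KS2 scores and per-slot subject weightings.
pupils = [
    ("School A", "4.5", 52.0), ("School A", "5.0", 61.0),
    ("School B", "4.5", 47.0), ("School B", "5.0", 66.0),
]

# Step 1: the national average Attainment 8 for each starting point (the 'estimates').
totals, counts = defaultdict(float), defaultdict(int)
for _, band, a8 in pupils:
    totals[band] += a8
    counts[band] += 1
estimates = {band: totals[band] / counts[band] for band in totals}

# Step 2: each pupil's Progress 8 is the difference from the estimate for their
# starting point, divided by ten so it reads as a fraction of a grade per subject.
# Step 3: a school's Progress 8 is the average of its pupils' scores.
school_scores = defaultdict(list)
for school, band, a8 in pupils:
    school_scores[school].append((a8 - estimates[band]) / 10)

for school, scores in sorted(school_scores.items()):
    print(school, round(sum(scores) / len(scores), 2))
```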


Each school’s Progress 8 figure, therefore, depends not only on their own pupils’ performance but also on that of all pupils nationally. Using last year’s calculations to work out this year’s figures is a bit like using last year’s weather reports (for somewhere with a particularly volatile climate) to prepare a forecast for this year – except worse.

In 2015, most schools hadn’t developed their curriculum to be more Attainment/Progress 8-friendly. Those that had were in the early days of teaching their revised curriculum and had yet to develop their expertise. In 2016, there will have been further changes to curriculum and entry patterns nationally, together with growing expertise among early adopters.

That will have significantly altered the average progress from each starting point across a wide range of subjects. And it will continue into 2017, further complicated by the first examinations of the new GCSEs in English and maths with number grades, not to mention the revised point-score allocations for unreformed qualifications, which predominantly affect lower attainers.

So if it was never going to be possible to predict Progress 8 in advance for this year, it most certainly will not be possible for next year.

What’s the solution? The simple answer is, don’t bother trying. Much heat but very little light is being generated in some schools by leaders trying to set targets that will result in an average or positive Progress 8 score.

Set suitable targets in each subject

That time would be far better spent making sure that pupils are set appropriate, challenging targets in their individual subjects (and, most importantly, provided with the essential guidance and feedback on what they need to do to achieve them). Senior and middle leaders can then focus on making sure that the quality of teaching and learning in each of those subjects is the best it can be. That’s the best way to improve progress and thereby to come out with a positive Progress 8 score – looking after the pennies allows the pounds to look after themselves.

But where can schools find appropriate targets if the national picture is so volatile? FFT Aspire is one source. It uses the latest pupil-level data, in a similar way to the DfE, to produce for every pupil – in just about every subject – a range of estimates pitched at average performance, top 20% and top 5% to help with target-setting. It offers two models, one including pupil and school context, the other taking account only of gender and month of birth, allowing schools to use whichever is the more challenging in their circumstances.

They’re estimates, not targets in themselves, but they are based on the most recent performance of real pupils in real schools. These, too, will vary year on year more than they have in the past as a result of the changing national picture already discussed. But they avoid the uncertainties and guesswork encountered when attempts are made to aggregate individual pupil targets into the elements and slots of the Attainment and Progress 8 silos.

Now that the new accountability measures and GCSEs are with us, we need to let go of some of the practices that have become almost second nature: expected progress, targets that are set in stone and that focus on school benchmarks, and rigid flightpaths from years 7 to 11 are among them.

That will allow schools to concentrate their energies on what really matters – high quality teaching and learning.



2 thoughts on “Progress 8: don’t try to guess it”

  1. Hi Colin, I am a senior leader who first noted ASCL mentioning a year ago that schools should avoid calculating Progress 8. Leaders will remain hungry, though, to know where they stand, since P8 is a key performance measure. It’s very easy to obtain a P8 estimate for your current Y11, as companies such as SIMS and SISRA are willing to calculate these – and that’s a nasty problem, because they are doing so without taking into account the change in points attached to the legacy grades. I’m not sure many schools and academies are really aware of that!

    With LOP a thing of the past as well, I’ve been thinking about the best alternative for giving leaders enough data to know where they truly stand. I think you CAN get an idea of where your school stands with P8 if you substitute your own targets for the A8 estimates that have just come out. It then just relies on setting appropriate targets and comparing your students against the highest points total that their targets can equate to. To my knowledge no one has approached P8 from that angle and developed such a system, so I have! I invite you to visit my website and let me know your thoughts.

    • Colin Logan said:

      Hi Sandeep. It’s something I’ve been saying, along with others, for some time now. If schools set challenging subject targets and can show how they’ve arrived at them, and if they then track students’ progress towards achieving them, that in itself is evidence of the progress being made – there’s no need to guess at a P8 figure. Doing it this way is much more meaningful.

