Professor David Jesson, University of York; associate director, SSAT, 2003-2010
One of the defining roles of SSAT in its various incarnations has been to provide schools with the capacity to understand how well they contribute to their pupils’ progress and attainment, and the confidence to improve it. Alongside this essential ‘self-awareness’ is an insistence on challenging schools to explore whether their pupils’ performance across a whole range of subject areas and indicators is capable of improvement.
The early official method of comparing schools depended wholly on school league tables; against this, SSAT’s new emphasis required a context for such comparisons.
As the organisation worked with its growing number of schools, it soon became apparent that the next official context for such comparisons – the percentage of ‘disadvantaged’ pupils within each school – was, to say the least, of questionable value. Communities with around average levels of disadvantage might, on paper, appear similar; in reality they often included schools with huge differences in their pupils’ academic starting points.
The Trust, as it was commonly called in those days, was the very first agency in England (and possibly across the world) to develop and communicate useful value-added measures of schools’ performance. These provided effective ways of comparing both pupils’ and schools’ outcomes and offered powerful, yet straightforward, frameworks for allowing detailed interrogation of many aspects of pupils’ progress and performance.
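The general idea behind a value-added measure of this kind can be sketched in a few lines. The data, the simple one-variable regression model, and all names below are illustrative assumptions for the purpose of this sketch, not SSAT’s published method: each pupil’s outcome is predicted from prior attainment, and the residual (actual minus predicted) is read as the value added.

```python
# Illustrative value-added sketch (hypothetical data and model, not
# SSAT's actual methodology): predict each pupil's outcome from prior
# attainment by least-squares regression, then treat the residual
# (actual minus predicted) as that pupil's value added.
from statistics import mean

# Hypothetical pupils: (prior_attainment_score, final_outcome_score).
pupils = [(26, 44), (29, 52), (31, 55), (27, 49), (33, 60), (24, 40)]

prior = [p for p, _ in pupils]
outcome = [o for _, o in pupils]

# Least-squares fit: outcome ~ intercept + slope * prior.
p_bar, o_bar = mean(prior), mean(outcome)
slope = (sum((p - p_bar) * (o - o_bar) for p, o in pupils)
         / sum((p - p_bar) ** 2 for p in prior))
intercept = o_bar - slope * p_bar

# A pupil's value added is the gap between actual and predicted outcome.
residuals = [o - (intercept + slope * p) for p, o in pupils]

# A school's value added would be the mean residual of its own pupils,
# with the regression fitted on the whole national cohort. Fitted on the
# same pupils, as here, the mean residual is zero by construction.
school_va = mean(residuals)
```

In practice the regression would be fitted across all pupils nationally, so a school whose pupils consistently outperform their predicted scores shows a positive mean residual – a straightforward, pupil-level framework of the kind described above.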
Ultimately well over 2,500 (specialist, CTC and academy) schools were involved in this exercise, and their detailed school-by-school value-added measures began to be published annually. One consequence was convincing evidence of the value of the specialist schools project (and its successors) in boosting the outcomes of many thousands of pupils in the schools involved.
Interestingly, SSAT’s straightforward pupil-level frameworks contrasted sharply with the next official value-added reports. Based on ‘contextual value-added’, these often produced large numbers of opaque evaluations that staff, governors and parents sometimes found impossible to understand. Contextual value-added was finally replaced in 2013 by the current progress measures, which show remarkable similarities to those developed earlier by SSAT.
This is part of SSAT’s continuing contribution to ensuring that schools take responsibility for, and appropriate action in, reviewing and improving their pupils’ performance.