SSAT senior education lead Colin Logan explains how Ofsted’s new approach to data focuses more on data reliability and trends over time.
Most of us will be aware that RAISEonline will be replaced by Analyse School Performance (ASP) later this term; we’ll have more information on it in a few weeks’ time. What is less well known is that its sister resource, the Ofsted inspection dashboard, is also undergoing a major facelift.
It represents a significant shift in the way that data is used to inform inspection activity. It is also a recognition that, as all good users know, it is dangerous to set too much store by what data seems to be saying about small groups of pupils. This is a frequent risk with primary school data, and it applies particularly when not just groups of pupils but sub-groups within them are identified and analysed. For example, if you look not only at pupil premium (disadvantaged) pupils but drill down to low prior-attaining, pupil premium girls, the numbers involved make it unlikely that any hypotheses drawn from the data will be worth acting upon.
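The statistical point is easy to demonstrate with a short simulation. The sketch below is purely illustrative (it is not Ofsted's methodology, and the score distribution is invented): pupils' scores are drawn from the same underlying distribution, yet the average for a sub-group of four pupils swings far more from one equally plausible cohort to the next than the average for a class of 60.

```python
import random
import statistics

# Illustrative only: why averages for tiny sub-groups are unreliable.
# Scores are drawn from an invented distribution (mean 100, sd 15).
random.seed(42)

def spread_of_means(group_size, trials=1000):
    """Standard deviation of the group-average score across repeated draws."""
    means = [
        statistics.mean(random.gauss(100, 15) for _ in range(group_size))
        for _ in range(trials)
    ]
    return statistics.stdev(means)

print(f"Spread of averages, 4 pupils:  {spread_of_means(4):.1f}")
print(f"Spread of averages, 60 pupils: {spread_of_means(60):.1f}")
```

The four-pupil average varies several times more than the whole-class average, so an apparent gap for a tiny sub-group may be nothing but noise.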
As Sean Harford, Ofsted’s national director, says in the latest school inspection update, ‘Over focusing on the performance of groups when these intersections mean the data are less than robust, can result in schools taking actions with individual groups when effort would be better spent on approaches that have an impact for all pupils.’
Inspection Dashboard Summary Report (IDSR)
As a result, the new inspection dashboard – renamed the Inspection Dashboard Summary Report (IDSR) – contains fewer pupil groups than its predecessor, and places more emphasis on trends over time. This doesn’t mean that the focus on disadvantaged and other vulnerable pupils has been downgraded. It does mean that attention will no longer be paid to making judgements on small numbers of pupils within those main groups.
The IDSR will have a completely different look and feel from the old dashboard. Gone are the strengths and weaknesses on the front page; they are replaced by ‘areas to investigate’. It will contain information about a school’s context, pupil characteristics, year groups and prior attainment, and an indication of whether the school is meeting the floor standards or the coasting schools definition.
Also new is the use of percentile ranks to compare a school’s progress scores with those of all other schools in the country in each of the past three years. These will be shown in quintiles together with an indication of a score’s statistical significance for each year.
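As a rough sketch of how a percentile rank translates into a quintile (the scores, helper functions and cut-off logic below are my own illustration, not the IDSR's actual calculation):

```python
# Hypothetical illustration of percentile ranks and quintiles.
# All progress scores here are invented for the example.

def percentile_rank(score, all_scores):
    """Percentage of schools with a progress score at or below this one."""
    below = sum(1 for s in all_scores if s <= score)
    return 100 * below / len(all_scores)

def quintile(pr):
    """Map a percentile rank (0-100] to quintile 1 (top 20%) .. 5 (bottom 20%)."""
    return min(5, int((100 - pr) // 20) + 1)

all_scores = [-3.1, -1.2, -0.4, 0.0, 0.3, 0.8, 1.5, 2.2, 2.9, 3.6]
school_score = 2.9
pr = percentile_rank(school_score, all_scores)
print(f"Percentile rank: {pr:.0f}, quintile: {quintile(pr)}")
```

A school at or above 90% of others lands in the top quintile; the same calculation repeated for each of the past three years gives the trend the IDSR is designed to surface.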
Scatterplots, showing the performance of individual pupils compared with their starting points, will be included for key performance measures for the first time. Inspectors have been urged to exercise caution where a school’s progress measures are affected by outliers. And they have been reminded of some of the underlying reasons behind the low scores of individual pupils. Any ‘areas to investigate’ should already have taken account of these. However, a new team of regional data analysts will be on call to help inspectors interpret the impact of individual pupils on overall scores and to advise them on the reliability of any other data provided by a school.
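The effect of a single outlier is easy to see with invented numbers: one pupil with an extreme score (perhaps through prolonged absence during assessments) can flip a whole class's average progress from positive to negative.

```python
import statistics

# Illustrative sketch with invented progress scores: how one outlier pupil
# can drag a school's overall progress measure, echoing the caution urged
# of inspectors.
class_scores = [0.4, 0.1, -0.2, 0.3, 0.0, 0.2, -0.1, 0.5, 0.1, 0.2]
with_outlier = class_scores + [-12.0]  # one pupil with an extreme score

print(f"Without outlier: {statistics.mean(class_scores):+.2f}")
print(f"With outlier:    {statistics.mean(with_outlier):+.2f}")
```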
Absence, persistent absence and exclusions will continue to feature, although exclusions for SEN pupils have been removed as a standing measure; however, these could still figure in the areas to investigate if necessary.
Although there are some mock-ups of the new IDSR in limited circulation, there is nothing at the moment that can be shared publicly. We will let you know when anything is released.