SSAT has received a number of requests from schools about what they should publish when the KS4 results are released. This guide has been written in response to those queries, to help you present your own story – in a way, and to an audience, of your own choosing – and so retain some control over how it is told.
Many schools were particularly alarmed by the lack of planning time allowed by the then secretary of state’s announcement last autumn that only the first entry in EBacc subjects would be counted in the 2014 performance tables; this will be extended to cover all other subjects in 2015 – guidance from the DfE can be found here. Some schools that had already committed themselves to early entries chose to continue as planned as a matter of principle, while others cancelled their planned entries. The effect of this change is that the published data in RAISEonline and the performance tables will, in many cases, be unlikely to show what schools would consider a true picture. Students themselves will, of course, continue to have their final entry counted as normal for the purposes of further education or employment.
To counteract the risk that, when the official tables are published, there will be discrepancies between the official data and the ‘true’ picture, schools will largely need to provide the evidence themselves. As usual, RAISEonline will not appear until December, and in the meantime you will want to share your results with parents and the community, and to have the appropriate data ready in case of a phone call from Ofsted. Telling your own story is important. The recent move by headteachers to form their own alternative performance table is a welcome sign of the profession taking action to shape and lead our education system, but SSAT believes there is an inherent issue with any published ‘performance’ tables: they can quickly be turned into ‘league’ tables by newspapers and others. This can seriously compromise the original purpose of sharing information, and risks driving competition over collaboration.
In the past, SSAT collected early results from schools, but has not done so for several years. Not only was it impossible to validate the early analyses we received, but the logistical process of collecting and collating the results was extremely complex, and the use to which the often incomplete data could be put was limited. We do not therefore propose to undertake or participate in a collection this year of ‘reported’ and ‘actual’ results. Instead, this guide shares some ideas about what data schools might produce for themselves and share with different audiences as appropriate – we hope you find it useful.
What Ofsted have said about early entry, and changes to RAISEonline
At the end of July, Ofsted published a revised handbook for inspections from September 2014. The only reference to early entry is one bullet point attached to paragraph 152:
Inspectors should consider how well leadership and management ensure that the curriculum:
• does not compromise pupils’ achievement, success or progression by inappropriate early or multiple entry to public examinations, for example in mathematics
Provided that a school is not compromising achievement by inappropriate early or multiple entry, Ofsted inspectors will be just as interested in a ‘true’ figure for school performance as the school itself.
Although the ‘first entry only’ policy will apply to data in the performance tables, RAISEonline – which is Ofsted’s own point of reference for historical school data – will help to fill in the gaps between reported and ‘actual’ figures.
There are some significant changes to RAISEonline for 2014 which we will share with SSAT members shortly. However, for the purpose of this communication, the following points are particularly relevant:
• There will be no distinction between ‘GCSE only’ and ‘GCSE equivalent’ qualifications anywhere in either RAISEonline or the performance tables. There will be one category: qualifications approved for use in the performance tables. This means that no valid comparison will be possible between, for example, the 2014 5 ACEM figure and either the ‘GCSE only’ or ‘with equivalents’ figures from previous years, because the new category takes account of the implementation of the Wolf recommendations: vocational qualifications are included, but their value is restricted to a single GCSE equivalent.
• There will be a single report in the summary RAISE document which will show results for ‘first entry’ and ‘best entry’ for key attainment measures such as 5 ACEM, average point scores and C+ percentages in the EBacc subjects – but not for any progress measures.
• Schools will be able to use the online data management facility of RAISE to replace the first entry results used for individual students with their best entry. This will allow all the charts and tables from RAISE to be re-created using the ‘best entry’ data. Before we all get too excited, however, be warned that it is a clunky process requiring data to be entered for each student individually. Nonetheless, we believe that this will be an activity well worth undertaking as it will provide an invaluable resource if Ofsted pays a visit – they will not have any access to this data otherwise.
The three key threshold measures are 5+ A*-C passes including English and mathematics, EBacc and the basics measure (% A*-C in English and mathematics). All these could be affected to varying degrees in different schools by the application of the early entry rule and the published figures would not be directly comparable with previous years, even if the changes resulting from the Wolf Review had not been implemented. We are suggesting to schools that they publish two versions of each of these measures for 2014: one with first entry only counting and a second with the best entry counting. The most closely corresponding figures for 2013 could also be shown for comparison purposes, perhaps with a health warning.
It is our understanding that school management information systems ought to be able to produce both ‘reported’ and ‘actual’ figures by selecting either ‘first entry’ or ‘best entry’ as appropriate.
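If your MIS cannot produce both versions directly, the selection logic is straightforward to apply to an export of exam results. The sketch below is a minimal illustration of counting a threshold measure (5+ A*-C including English and mathematics) under each rule; the record layout and subject names are assumptions for illustration, not a real MIS export format.

```python
# Illustrative sketch: one threshold measure under both counting rules.
# A student's record is a dict of subject -> list of (sitting_number, grade),
# where sitting_number 1 is the earliest entry. This layout is an assumption.

GRADE_ORDER = ["U", "G", "F", "E", "D", "C", "B", "A", "A*"]

def grade_value(grade):
    return GRADE_ORDER.index(grade)

def counted_grade(sittings, rule):
    """Return the grade that counts for one subject under the given rule."""
    if rule == "first":
        return min(sittings)[1]                               # earliest sitting
    return max(sittings, key=lambda s: grade_value(s[1]))[1]  # best grade

def meets_5acem(subjects, rule):
    """True if a student has 5+ A*-C passes including English and maths."""
    grades = {subj: counted_grade(s, rule) for subj, s in subjects.items()}
    at_c_or_above = sum(grade_value(g) >= grade_value("C")
                        for g in grades.values())
    core = all(grade_value(grades.get(s, "U")) >= grade_value("C")
               for s in ("English", "Mathematics"))
    return core and at_c_or_above >= 5

def pct_5acem(cohort, rule):
    """cohort: list of per-student subject dicts; returns a percentage."""
    return 100 * sum(meets_5acem(s, rule) for s in cohort) / len(cohort)
```

A student who sat mathematics early for a D and resat for a B would count towards the ‘reported’ (first entry) figure only if the D meets the threshold, but towards the ‘actual’ (best entry) figure with the B.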
Average point scores are a more inclusive measure of performance compared with the threshold measures as every student’s result counts at all levels of attainment. We are suggesting that schools might like to publish three point score measures:
• Best 8 average point score – again showing both ‘reported’ and ‘actual’ figures. Using the current methodology, the Best 8 APS is calculated by aggregating students’ best 8 results in subjects which qualify for this year’s performance tables and then dividing the total by the number of students; any vocational qualification will count as a maximum of 1 GCSE towards the 8 and no more than 2 vocational qualifications can be included.
• Separate average point scores for English and mathematics – both ‘reported’ and ‘actual’.
Point scores, however, are not particularly meaningful in isolation, so we are suggesting that schools convert these point scores to grades using the DfE methodology found in the RAISE library. The relevant excerpt can be found here.
The Best 8 average point score is divided by 8 to give an APS per entry and this can then be converted to a fine grade using the look-up table.
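The calculation described above can be sketched as follows. The point values use the 2014 performance-table scale (G = 16 up to A* = 58, six points per whole grade); the vocational cap follows the rules quoted in the text, while the data layout and the grade-plus-fraction read-off are illustrative assumptions – the official fine-grade look-up table in the RAISE library should be used for published figures.

```python
# Sketch of the Best 8 APS calculation and the conversion to a fine grade.
# Point scale (G=16 ... A*=58) is the 2014 performance-table scale; the
# record format and output format are assumptions for illustration.

POINTS = {"A*": 58, "A": 52, "B": 46, "C": 40, "D": 34,
          "E": 28, "F": 22, "G": 16, "U": 0}

def best8_points(results):
    """results: list of (grade, is_vocational) for one student."""
    academic = sorted((POINTS[g] for g, voc in results if not voc),
                      reverse=True)
    vocational = sorted((POINTS[g] for g, voc in results if voc),
                        reverse=True)
    # No more than 2 vocational qualifications may count, each as 1 GCSE.
    pool = sorted(academic + vocational[:2], reverse=True)
    return sum(pool[:8])

def best8_aps(cohort):
    """cohort: list of per-student results lists; average over all students."""
    return sum(best8_points(r) for r in cohort) / len(cohort)

def fine_grade(aps):
    """Divide the Best 8 APS by 8 and express it as grade plus fraction
    (e.g. 43 points per entry reads as half a grade above a C)."""
    per_entry = aps / 8
    for grade, pts in sorted(POINTS.items(), key=lambda kv: -kv[1]):
        if per_entry >= pts and grade != "U":
            return f"{grade} + {(per_entry - pts) / 6:.2f}"
    return "below G"
```

Running this twice – once on ‘first entry’ grades and once on ‘best entry’ grades – produces the ‘reported’ and ‘actual’ versions of the measure.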
Progress measures will also be affected by the early entry policy. The two key progress measures are ‘expected’ progress (i.e. levels of progress from KS2) and ‘value-added’ progress (a comparison of the progress students have made in a school across 8 subjects compared with the average progression from KS2 nationally).
It is possible for schools to estimate their own value-added score for 2014 by using the published coefficients from 2013. However, because of the changes to schools’ curricula resulting from the Wolf review and the reduction in the equivalent value of vocational qualifications, the movement in many schools towards an Attainment 8/EBacc-type offer, and the late impact of the early entry announcement, it is quite possible that this year’s value-added figures will show a much greater variation from year to year than normal.

Suppose a school’s estimated value-added score using the 2013 coefficients came out at, say, 990. That figure assumes the progress achieved by all students nationally under the old criteria remained constant. In fact, the value-added score in one school depends on the scores achieved in all other schools nationally, and the national average benchmark of 1000 is established from that. If the national rate of progress in 2014 appears to be less than in 2013 – as a result of the changes to qualifications – the national average will be lower and the 1000 benchmark will therefore be pitched at a lower level. So the figure of 990 – 10 points below the national average in 2013 – might actually be, say, 5 points above the average in 2014. We are therefore suggesting that schools confine themselves for the time being to calculating an expected progress measure which limits itself to progress made in English and mathematics and at least provides some comparability with previous years.
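The rebasing effect in that example reduces to simple arithmetic. Every number below is hypothetical – real coefficients and averages are published by the DfE alongside RAISEonline – but it shows why a below-average score on the old basis can become an above-average score once the benchmark is re-centred.

```python
# Toy arithmetic for the rebasing effect described above.
# All figures are hypothetical illustrations, not published data.

school_score_2013_basis = 990    # school's estimate using 2013 coefficients
national_mean_2013_basis = 985   # if the 2014 cohort averages lower nationally

# Published value-added is always re-centred so the national average is 1000:
rebased_score = school_score_2013_basis - national_mean_2013_basis + 1000
# A score 10 points below the 2013 average becomes 5 points above the
# re-centred 2014 average.
```

This is why an in-house estimate built on last year’s coefficients should be treated as indicative only.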
Our suggestion is that schools publish the following data for each of English and mathematics:
• the percentage of students making at least expected progress (i.e. 3 or more levels) since KS2 for both first entry only and using the final result
• the percentage of students making greater than expected progress (i.e. 4 or more levels) since KS2 for both first entry only and using the final result.
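Both percentages can be derived from KS2 levels and GCSE grades. The sketch below assumes the usual ‘one level = one grade’ equivalence (e.g. a pupil at KS2 level 4 reaching a grade C has made exactly 3 levels of progress); this convention should be checked against the DfE transition matrices before anything is published.

```python
# Sketch of the two suggested progress measures. The KS2-level to GCSE-grade
# mapping below is the common convention and is an assumption to verify
# against the DfE transition matrices.

GRADE_ORDER = ["U", "G", "F", "E", "D", "C", "B", "A", "A*"]
TARGET_3LOP = {2: "E", 3: "D", 4: "C", 5: "B"}  # grade equating to 3 levels

def levels_of_progress(ks2_level, gcse_grade):
    """Levels of progress made since KS2, on the one-grade-per-level scale."""
    target = TARGET_3LOP[ks2_level]
    return 3 + GRADE_ORDER.index(gcse_grade) - GRADE_ORDER.index(target)

def progress_percentages(cohort):
    """cohort: list of (ks2_level, gcse_grade) pairs for one subject.
    Returns (% making 3+ levels, % making 4+ levels)."""
    lops = [levels_of_progress(k, g) for k, g in cohort]
    n = len(lops)
    return (100 * sum(l >= 3 for l in lops) / n,
            100 * sum(l >= 4 for l in lops) / n)
```

Running the calculation once with first-entry grades and once with final grades gives the ‘reported’ and ‘actual’ versions of both bullet points above.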
An additional exercise which schools might like to undertake for their own purposes, though not necessarily publish, is to assess the impact of concurrent entry on student results (for example, double entry of both GCSE and iGCSE in the same subject), and the impact of the school’s multiple-entry policy on vulnerable groups, such as those who attract pupil premium funding. If a positive impact were identified, this would provide robust evidence to governors and Ofsted that the policy is working and is not ‘inappropriate’. As with teaching, Ofsted does not have a view of what is right or wrong practice: what matters is the impact.
Sharing and explaining the above data with governors, parents and the community will help schools to tell their story more accurately in advance of the publication of the official performance tables next January, and will also form the basis of some key additional evidence for any school expecting an Ofsted inspection.
This template page is provided as an example of how the suggested data might be presented. This could be accompanied by a short commentary to highlight the key information revealed by the data.
SSAT provides a wide range of support to schools with data, self-evaluation and preparing for Ofsted – in a way that doesn’t conflict with your wider school improvement goals – through publications and courses, including training for senior and middle leaders in the use of the new RAISEonline and of the new Fischer Family Trust ‘FFT Aspire’ resource. Download our full secondary Accountability and Data CPD programme for 2014/15.