Ian Stokes Education
10/10/19 New EIF inspections - are Ofsted still 'judging by numbers'?

As I've been sat waiting for the Ofsted IDSRs to be released so I can start writing another batch of reports, I decided to spend some time trawling through the Ofsted website, looking at the first batch of reports that have just been published for schools inspected under the new framework. I really know how to have a good time, don't I??

Actually, I'm glad I did, because what I've found out is really quite interesting. The new framework clearly states that inspectors won’t be considering schools’ own internal data when they form judgements, but it does say that they will be looking at historic, published data – and from what I’ve seen so far, there is a very clear link between schools’ historic progress figures and their inspection outcomes under the new framework.

Of course, we can debate “cause and effect” at length: does the performance data dictate the outcome of the inspection, or are the inspectors’ judgements about the quality of education merely validated by the performance data? Either way, if the pattern in this (admittedly very small) sample is representative of what is to come, then we can be confident that schools’ progress figures will continue to be very closely linked to their inspection outcomes. It could be argued that this is ‘as it should be’, but current progress judgements are already very flawed, and will probably become increasingly unreliable from 2020 onwards, when progress will have to be calculated from the very broad KS1 assessments produced by the new national curriculum assessment framework. I wonder how much inspectors take these flaws into account when they look at a school's IDSR?

Here’s some detail:


  • I found 40* primary schools that had been inspected between 10/09/19 and 24/09/19, and whose report had been published by 04/10/19.
  • None were judged to be ‘Outstanding’
  • 28 (70%) were judged to be ‘Good’
  • 12 (30%) were judged to ‘Require Improvement’
  • None were judged to be ‘Inadequate’
  • 13 were in the North West
  • 13 were in the Midlands

 * I didn’t include infant and junior schools, independent schools, or schools with no published 2018 KS2 data.


  • None of the ‘RI’ schools had an ‘above average’ or ‘well above average’ KS2 progress score in any subject in 2018.
  • 10 (83%) of the ‘RI’ schools had ‘below average’ or ‘well below average’ progress in at least one subject in 2018.
  • Only 2 schools were judged as ‘RI’ despite having ‘average’ progress in all subjects. One of these schools had been judged as ‘RI’ at the previous inspection, and the other had been judged as ‘Inadequate’.
  • 20 (71%) of the ‘Good’ schools didn’t have any ‘below average’ or ‘well below average’ progress scores in any subject in 2018.
  • 8 (29%) of the ‘Good’ schools had ‘above average’ or ‘well above average’ progress scores in at least one subject.
  • 6 of the ‘Good’ schools had ‘below average’ or ‘well below average’ progress scores in Maths, 5 in Writing, but only 2 in Reading.
  • Only 1 (4%) of the ‘Good’ schools had ‘below average’ progress in all subjects in 2018.
  • Only 2 (7%) of the ‘Good’ schools had ‘well below average’ progress in one or more subjects in 2018.
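As a quick sanity check, the percentages in the two lists above can be reproduced from the raw counts (40 schools in the sample, 28 ‘Good’, 12 ‘RI’):

```python
# Cross-checking the quoted percentages against the raw counts.

def pct(count, total):
    """Percentage rounded to the nearest whole number."""
    return round(100 * count / total)

total, good, ri = 40, 28, 12
print(pct(good, total), pct(ri, total))  # 70 30
print(pct(10, ri))                       # 83 (RI schools with below-average progress)
print(pct(20, good), pct(8, good))       # 71 29
print(pct(1, good), pct(2, good))        # 4 7
```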


Reading through the reports, there are some very strong and consistent themes coming out:

  • The strength of teaching in Phonics is mentioned in every report I’ve read so far, and seems to be a key determinant of the quality of teaching judgement. It would be interesting to compare schools’ Screening Check results with the inspectors’ judgements about the strength of Phonics teaching but, alas, this isn’t possible as the relevant data isn’t published in the performance tables.
  • Unsurprisingly, curriculum planning and the sequencing of the curriculum are also a key focus in almost every report.
  • The outcomes of children with SEND are a key focus, and seem to be mentioned more frequently than those of Disadvantaged children.
  • Quality of provision, outcomes and progress in the Early Years is a focus of nearly every report and is specifically mentioned more than any other key stage.

Are DfE adjustments to extremely low progress scores unfair?

FoI request shows some pupils are 30 times more likely to have their scores adjusted than others

As soon as the DfE published the technical guidance relating to their new methodology for adjusting the "extremely low progress scores" that some children generate at KS2, the alarm bells started ringing. On the face of it, it seems a reasonable and positive attempt to ameliorate the impact that individual children's very low scores can have on the overall score for the school. We've all seen examples where one child's very low progress score has meant that a school's overall progress is described as below average. The DfE's guidance states that "about 1%" of children nationally will have had their scores adjusted, but they also state that rather than simply adjusting the scores of the lowest 1% of children nationally, they used standard deviation calculations for each of the prior attainment groups (PAGs) at KS1 to identify which children's scores to adjust.

The effect of these calculations is that NO children with prior attainment of less than 6pts at KS1 could have their scores adjusted, because the theoretical thresholds set by these calculations were, in fact, lower than the lowest possible score a child could achieve! Not only this, but the thresholds for the prior attainment groups covering children with a KS1 APS of 6-12pts were much lower than those of children with above-average prior attainment at KS1. So, instead of getting a straight-line threshold, we get a very strange-looking curve, as illustrated in the chart on this page. For example, the threshold in Reading for a child with a KS1 APS of 7pts is -27.06, while the threshold for a child with a KS1 APS of 21pts is only -11.55.
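To see why the per-PAG thresholds matter, here is a minimal sketch of a threshold-based adjustment. The two threshold values are the Reading examples quoted above; the clamping rule (raising any score below the threshold up to the threshold) is my assumption about how such an adjustment would work, not the DfE's published algorithm:

```python
# Hypothetical sketch of a per-PAG score adjustment. Threshold values are the
# two Reading examples from the text; the clamping behaviour is an assumption.

READING_THRESHOLDS = {
    7: -27.06,   # KS1 APS 7pts (low prior attainment)
    21: -11.55,  # KS1 APS 21pts (above-average prior attainment)
}

def adjust_progress(score, ks1_aps):
    """Raise any score below the PAG's threshold up to that threshold."""
    threshold = READING_THRESHOLDS[ks1_aps]
    return max(score, threshold)

# A score of -20 is untouched for the low-PAG child (threshold -27.06)
# but raised to -11.55 for the high-PAG child.
print(adjust_progress(-20.0, 7))   # -20.0
print(adjust_progress(-20.0, 21))  # -11.55
```

Under this reading, the low-PAG child only benefits from an adjustment if their score falls below -27.06, and even then the score is only lifted to -27.06, while the high-PAG child's score is lifted all the way to -11.55.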


This apparently bizarre and unnecessarily complicated approach to setting the thresholds made me wonder whether it had been done in order to ensure that roughly equal numbers of children from across the prior attainment spectrum benefited from the adjustment. The DfE hadn't provided any figures relating to the numbers of children in each PAG who have had their scores adjusted as a result of this methodology, so the only way to find out was to submit a Freedom of Information request...


... after a few weeks of waiting, the response arrived, and the information provided is rather shocking. The total numbers of children across the country in the first 8 PAGs are relatively small, so let's take the 22,118 children in the PAG which covers >=9-<10 pts at KS1 as our first example. In Reading, only 52 of these children had their scores adjusted: that's just 0.2% of the group. In contrast, 527 of the 40,591 (1.3%) children in the PAG that covers >=15.5 to <16pts at KS1 had their scores adjusted. That means that a child in this higher PAG is six and a half times more likely to have their score adjusted than a child in the lower PAG.


The situation is similar in Maths, but the really extreme variations occur in Writing. Only 17 (0.1%) of the children from the PAG which covers >=9-<10 pts at KS1 had their scores adjusted, compared to 1,777 (3.3%) from the PAG which covers >=17-<18 pts at KS1. That means that a child in this higher PAG is 33 times more likely to have their score adjusted than a child in the lower PAG!
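The "times more likely" figures can be reproduced from the quoted counts. The PAG sizes for Reading are given above; for Writing the higher PAG's size isn't quoted, so its stated 3.3% is used directly:

```python
# Recomputing the likelihood ratios from the FoI figures quoted above.

def pct(adjusted, total):
    """Percentage of a PAG whose scores were adjusted, to 1 decimal place."""
    return round(100 * adjusted / total, 1)

# Reading: PAG >=9-<10 pts vs PAG >=15.5-<16 pts
reading_low = pct(52, 22118)    # 0.2
reading_high = pct(527, 40591)  # 1.3
print(round(reading_high / reading_low, 1))  # 6.5

# Writing: PAG >=9-<10 pts vs PAG >=17-<18 pts (size not quoted; % as stated)
writing_low = pct(17, 22118)  # 0.1
writing_high = 3.3
print(round(writing_high / writing_low, 1))  # 33.0
```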


On top of this, we also need to consider that the adjustments for children in lower PAGs can only ever be very minor because their thresholds are so low, while a child in a higher PAG can receive a very considerable adjustment because their thresholds are much closer to zero.


This all points to a system which is weighted against children with lower prior attainment. We can only speculate as to whether this has been done intentionally. It could also be argued that this approach is unfair to schools which have lots of children with low prior attainment. We need to recognise that some children with low prior attainment make excellent progress, but these children often have English as an additional language: my fear is that the schools which will suffer most from this approach are those with high proportions of non-EAL children from deprived backgrounds, many of whom have SEN and low prior attainment at KS1.


If you're interested, the spreadsheet containing the detailed figures from the FoI request can be downloaded below.

Downloads

Low Progress FoI DfE Response (xlsx) - Download

Copyright © Ian Stokes Education  Ltd - All Rights Reserved.
