The IPPR's "Making the Difference" report was full of incorrect statistics. Part 1
In 2017, the IPPR published false and misleading information about exclusions. It has distorted the debate ever since.
Where do exclusion myths come from?
Over the past few months, I’ve noticed that many dubious statistics about exclusions from schools in England first appeared in a single report. Making the Difference was published by the IPPR[1] think tank in 2017. Many false or misleading claims seem to have their origins in this report, and I think it deserves a closer examination. In particular, the claims in the “60-second summary” document, published alongside the report, have been widely circulated despite many inaccuracies.
This post will look at what that summary claimed about the risk factors for exclusion and the impact of exclusions. Part 2 will discuss other incorrect and misleading claims in the summary. Part 3 will examine the influence of the report.
A cornucopia of backward statistics
According to the summary of the Making the Difference report:
Excluded children are the most vulnerable: twice as likely to be in the care of the state, four times more likely to have grown up in poverty, seven times more likely to have a special educational need and 10 times more likely to suffer recognised mental health problems. Yet our education system is profoundly ill-equipped to break a cycle of disadvantage for these young people.
I have addressed the general point about excluded pupils being the most vulnerable before:
Blog post: What do we know about excluded pupils?[2]
As for the statistics in that paragraph, they seem to be incorrect. If you look at the main body of the report, you discover that these statements about probabilities are backwards (page 16[3]).
The main report claims: “Children who have been taken into care are twice as likely to be excluded as those who have not”. The summary claims: “Excluded children are … twice as likely to be in the care of the state”.
The main report claims: “On average, poorer young people are four times more likely to be excluded than their wealthier peers”. The summary claims: “Excluded children are … four times more likely to have grown up in poverty”.
The main report claims: “Those with a recognised need are seven times more likely to be excluded than their peers without SEND[4]”. The summary claims: “Excluded children are … seven times more likely to have a special educational need”.
These should not have been difficult errors to spot. In particular, the claim about SEND is a mathematical impossibility. At the time of the report, 14.4% of pupils were identified as having Special Educational Needs. If excluded pupils were “seven times more likely to have a special educational need” then that would mean that 100.8% of excluded pupils would have a special educational need. This is good evidence that nobody was checking this report for false claims.
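The impossibility, and the difference made by reversing the direction of the claim, can be checked with a few lines of arithmetic. This is a sketch using only the figures already cited above (a 14.4% SEN base rate and a sevenfold relative risk); the variable names are mine, and the exclusion rate for pupils without SEN cancels out of the calculation.

```python
# Figures cited in the post: 14.4% of all pupils had an identified
# special educational need at the time of the report.
p_sen = 0.144

# The summary's claim: excluded pupils are "seven times more likely to
# have a special educational need". That would make the SEN rate among
# excluded pupils:
implied_sen_rate = 7 * p_sen
print(f"implied SEN rate among excluded pupils: {implied_sen_rate:.1%}")
# 100.8% -- impossible, since a proportion cannot exceed 100%

# The main report's (reversed) claim is at least arithmetically possible:
# pupils WITH SEN are seven times more likely to be excluded. If the
# exclusion rate for pupils without SEN is r, the rate for pupils with
# SEN is 7r, and Bayes' theorem gives the SEN share among excluded
# pupils (r cancels from numerator and denominator):
sen_share_of_excluded = (p_sen * 7) / (p_sen * 7 + (1 - p_sen) * 1)
print(f"SEN share of excluded pupils: {sen_share_of_excluded:.1%}")
# about 54% -- nowhere near "seven times the base rate"
```

The second calculation shows why reversing a conditional probability is not a harmless paraphrase: “seven times more likely to be excluded” and “seven times more likely to have SEN” describe very different worlds.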
A mental health myth
I can find no mention in the main report of the claim, which appears in the summary, that excluded pupils are “10 times more likely to suffer recognised mental health problems”. However, this is not because the main report is more careful than the summary. I have looked at the report’s claims about mental health before:
Blog post: Another look at exclusions and SEND
According to Making the Difference (main report, page 16):
In 2015/16, one in fifty children in the general population was recognised as having a social, emotional and mental health need (SEMH). In schools for excluded pupils this rose to one in two. Yet the incidence of mental ill health among excluded pupils is likely to be much higher than these figures suggest. Only half of children with clinically diagnosed conduct disorders and a third of children with similarly diagnosed emotional disorders are recognised in their schools as having special educational needs. This means the proportion of excluded children with mental health problems is likely closer to 100 per cent.
This is ridiculously tenuous. As I pointed out in that previous blog post:
The errors of reasoning in this are incredible. SEMH is not synonymous with “mental health problems”; it’s a category that can include those whose difficulty is that they are badly behaved. “Schools for excluded pupils” here appears to be Pupil Referral Units (PRUs) which, while they are often attended by excluded pupils, are actually institutions for any students who are unable to attend school, including those who are unable to attend due to SEMH. Therefore, their SEMH figures tell us nothing about the rate of SEMH among excluded children.

It is, of course, possible to find out the actual proportion of excluded students with a label of SEMH that year by looking at the figures. In 2015/2016 the number of excluded children labelled as having SEMH was 1 860 out of 6 685 or 27.8% (which is surprisingly low given that poor behaviour is a common reason to label a child with SEMH).

The “clinically diagnosed conduct disorders” and “similarly diagnosed emotional disorders” were diagnosed from survey data (collected from parents, teachers and children themselves) by a method that found 6% of young people to have a conduct disorder and 4% to have an emotional disorder and not from direct assessments by clinicians. While the survey did find that a large minority of the former category, and almost two thirds of the latter category, did not have officially recognised Special Educational Needs at that time, this was not referring specifically to either permanently excluded children or children in PRUs which may be wildly different. Any one of these errors (assuming this is just an extremely unlikely series of mistakes, rather than a deliberate intention to deceive) would invalidate the argument; so many errors in one paragraph suggests the IPPR was not too bothered about factual accuracy.
Three claims that are probably true, but the evidence used to support them is terrible
The summary continues:
Excluded young people are more likely to be unemployed, develop severe mental health problems and go to prison.
I suspect these claims are true as a matter of correlation. However, it is misleading to imply that poor outcomes for excluded pupils are a result of their exclusion. The poor outcomes could be a result of their behaviour, or of the many risk factors for poor outcomes that are also risk factors for permanent exclusion. Even though I assume these correlations exist, I cannot ignore the fact that Making the Difference finds little evidence to support them. I have already observed that the claims about mental health were baseless. What about the other two claims?
Did the report find evidence that excluded pupils were more likely to be unemployed?
The claim about unemployment in the summary is based on the following part of the main report:
Excluded young people are very likely to experience long-term unemployment. The Youth Cohort Study showed that more than one in four (27 per cent) excluded young people were not in education, employment or training (NEET) for between one and two years by the time they were 19, compared to one in 10 young people who had never been excluded. Fifteen per cent were NEET for more than two years, compared with only 3 per cent of those who had never been excluded (DfE 2011).
You may have already noticed that while the first sentence is about “long-term unemployment”, the statistics used to support this claim are based on teenagers. The source dates from 2011 and is based on data for 19-year-olds in 2010. However, in 2015 the age for compulsory education or training was raised to 18, making data from when the school-leaving age was 16 outdated and irrelevant. It seems the report’s writers started from the conclusion and then grasped for any statistic that seemed superficially relevant. I don’t think they even looked particularly closely at their source, which claimed that permanently excluded pupils were more likely to be in employment than 19-year-olds who hadn’t been permanently excluded (45% rather than 36%). Somehow, this was used to support the opposite claim: that excluded pupils were less likely to be in employment. Unsurprisingly, permanently excluded pupils were more likely to be NEET only because they were far less likely to continue into post-compulsory education.
Did the report find evidence that excluded pupils were more likely to go to prison?
As for the claim that excluded pupils were more likely to go to prison, this is, of course, likely to be true given that the behaviour that leads to exclusion is often criminal. However, the main report bases this claim on the following statistics:
The majority of UK prisoners were excluded from school. A longitudinal study of prisoners found that 63 per cent of prisoners reported being temporarily excluded when at school (MoJ 2012). Forty-two per cent had been permanently excluded, and these excluded prisoners were more likely to be repeat offenders than other prisoners (ibid).
The report to which this refers is a longitudinal study of a cohort of prisoners who had been sentenced in 2005-6. These statistics come from survey evidence that cannot be considered reliable, and it is striking that while the source refers to being “suspended or temporarily excluded”, Making the Difference avoids using the word “suspended”, allowing the misleading first sentence in the above passage[5]. More up-to-date and relevant information about exclusions and offending could have been found in this 2016 report about young offenders, but it would have shown fewer than one in four young people in custody had ever been permanently excluded.
In Part 2, I will look at some other inaccuracies in the summary.
2. The typical excluded pupil is an older, white British boy without SEND. While excluded pupils are more likely to be disadvantaged, 48% do not qualify for Free School Meals. While they are more likely to be on the SEND register, most are not. The majority of those who are on the SEND register are listed as “SEMH”, a category that is often used for the badly behaved. Excluded pupils are also more likely to be among the older pupils in their year group. There is some overlap between the vulnerable and the excluded (for instance, children in care are over-represented), but it seems ridiculous to suggest that older white boys without physical disabilities are “the most vulnerable” in our education system.
3. The pages of the PDF and the page numbers don’t match up. All the references I have given use the page numbers.
4. Special Educational Needs and Disabilities.
5. The IPPR also exaggerated this statistic further in a 2018 press release, claiming: “One in two prison inmates were excluded when at school”.