KS4 2015 and the 2010 KS2 boycott

The background:

In 2010, many primary schools made the decision to boycott the Key Stage 2 (KS2) tests over concerns about the tests themselves and how their results were used, or misused, in league tables and so forth. Approximately 27% of tests were subject to the boycott, and this, coupled with children missing tests through absence and other ‘usual’ factors, meant that around 30% of the 2010 cohort did not have KS2 results based on a KS2 test.

Five years later, the 2010 KS2 cohort became the 2015 KS4 cohort, and secondary schools have had to try to untangle the impact of the boycott on their results, particularly progress outcomes. The boycott potentially affects all schools in the KS2-KS4 value-added measure because value added is a zero-sum game: every gain (positive value-added score) has to be offset by a loss somewhere else. This means that even a school in which every student sat the KS2 test still has its value-added score influenced by others in the system. The boycott also potentially affects the expected progress threshold measures in English and maths for schools with students who were affected by it.
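
The zero-sum point can be seen in a tiny sketch: if every student's estimate is simply the mean outcome of students with the same prior attainment, the positive and negative residuals must cancel exactly. (A minimal illustration with made-up numbers, not real data.)

```python
from statistics import mean

# Toy "national" cohort: (prior attainment group, KS4 points) - made-up numbers
students = [("L4", 310), ("L4", 330), ("L4", 350),
            ("L5", 360), ("L5", 400)]

# Each group's estimate is the mean outcome of that group
groups = {g for g, _ in students}
estimates = {g: mean(o for grp, o in students if grp == g) for g in groups}

# A student's value added is their outcome minus their group's estimate
residuals = [o - estimates[g] for g, o in students]

print(sum(residuals))  # 0: every positive residual is offset by a negative one
```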

The scale of the boycott:

So circa 27% of children across England did not have their KS2 test results reported in 2010, although the decision was made school by school. Therefore the majority of primary/middle schools would have reported either a 100% test completion rate or 0%. Often these patterns were localised, leaving some areas with hardly any year 6 students sitting the test and others where secondary schools will be wondering what all the fuss is about.

Individual secondary schools would therefore know what proportion of their 2015 cohort was affected by the boycott, and some, if they dug around, would have an idea whether they were more or less affected than the national average. When the DfE produced the 2015 KS4 performance tables they did give a nod to the 2010 boycott by including a crude threshold indicator: “50% of prior attainment results are based on teacher assessments”. The DfE have not (to my knowledge) explained why 50% was deemed to be the cut-off (when the national figure is c. 27%). Nor is it evident why they used a Yes/No indicator when almost every other data item in the performance tables is a number or a percentage. I would argue that an opportunity to provide greater transparency has been missed.

However, in the RAISEonline document each school has been given its actual percentage of students whose prior attainment is based on a teacher assessment rather than a test. This is in table 3.1.1, which is on page 20 of my RAISEonline report but may be slightly different in other schools. RAISEonline, however, is only for the eyes of individual schools.

So, to further investigate the 2010 boycott, I gathered this percentage from colleagues in other schools who shared an interest in its effect. Many thanks to the 102 colleagues who provided information.

So, what does the spread of the boycott look like in these schools?

[Chart: distribution of the 103 sample schools by percentage of students affected by the boycott]

So, for example, in 16 (out of 103) schools, between 30% and 40% of students had no test outcomes on which to base progress measures.

What does the boycott mean for an individual school’s data?

For KS4 progress measures, when the KS2 test result is missing, the teacher assessment for the child is used as the baseline instead. There is no graduated scale for this teacher assessment; it simply comes as a whole level, e.g. Level 5. The DfE treat teacher assessments as being at the middle of the level: for example, a level 4 teacher assessment is a 4b, or in fine levels a 4.5. The same is true for average point scores, with a level 4 attracting 27 points, a level 5 33 points, and so on.
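
As a concrete sketch of those conventions (the function names are my own; the 6-points-per-level relationship is implied by the 27 and 33 figures above):

```python
def ta_to_fine_level(level: int) -> float:
    """A whole-level teacher assessment is treated as the middle of the level,
    e.g. level 4 -> 4b -> fine level 4.5."""
    return level + 0.5

def fine_level_to_aps(fine_level: float) -> float:
    """KS2 average point score: 6 points per whole level, i.e. fine level x 6."""
    return fine_level * 6

print(fine_level_to_aps(ta_to_fine_level(4)))  # 27.0 points for a level 4 TA
print(fine_level_to_aps(ta_to_fine_level(5)))  # 33.0 points for a level 5 TA
```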

The chart below shows three years’ worth of data for a secondary school where 95% of students were affected by the KS2 boycott in 2010. As can be seen, for years based upon KS2 test outcomes there is a fairly consistent distribution between fine levels. (Fine levels are derived from test marks, and thus allow schools to see which students only just achieved a test level 4 and which ones almost achieved a test level 5.)

[Chart: fine-level distribution of KS2 prior attainment across the three cohorts]

There are four major spikes on this chart, caused by teacher assessments (TA) in English and maths: TA 3 & TA 4 = 4.0; TA 4 & TA 4 = 4.5; TA 4 & TA 5 = 5.0; TA 5 & TA 5 = 5.5.
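
My reading of those combinations: the overall fine level is the average of the English and maths fine levels, with each teacher assessment sitting at mid-level. A quick sketch:

```python
def combined_fine_level(english_ta: int, maths_ta: int) -> float:
    """Average of the two subjects' fine levels, each TA taken as mid-level."""
    return ((english_ta + 0.5) + (maths_ta + 0.5)) / 2

for en, ma in [(3, 4), (4, 4), (4, 5), (5, 5)]:
    print(f"TA {en} & TA {ma} -> {combined_fine_level(en, ma)}")
# TA 3 & TA 4 -> 4.0
# TA 4 & TA 4 -> 4.5
# TA 4 & TA 5 -> 5.0
# TA 5 & TA 5 -> 5.5
```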

The spiked distribution in 2014-15 is clearly not typical, and it causes many data and accountability problems for the secondary school.

Firstly, the overall APS for the cohort is unduly influenced by the fine level 5.5. It is evident that a typical level 5 student in this school carries a fine level of about 5.2 on average. A fine level of 5.2 is approximately 31 APS, whilst 5.5 is 33 points. Therefore, when the school’s KS2 APS is given as 26.7, the cohort looks on average similar to the 2014 students. However, the impact of over-inflating the APS for 15%+ of level 5 students means that this cohort is actually likely to be closer to the 2016 cohort in terms of average prior ability.
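
The rough arithmetic behind that claim, using the illustrative figures above (a usual 5.2 fine level and roughly 15% of the cohort at level 5):

```python
typical_l5_fine = 5.2  # this school's usual average fine level for level 5 students
ta_l5_fine = 5.5       # what every boycott-year level 5 TA is recorded as
share_l5 = 0.15        # approximate proportion of the cohort at level 5

# APS runs at 6 points per whole level, so the per-student inflation is:
inflation_per_student = (ta_l5_fine - typical_l5_fine) * 6
cohort_aps_inflation = inflation_per_student * share_l5

print(round(inflation_per_student, 2))  # 1.8 APS per level 5 student
print(round(cohort_aps_inflation, 2))   # 0.27 APS added to the cohort average
```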

Secondly, in terms of accountability, the students at teacher assessment 5.5 are being treated the same as genuine test fine-level 5.5s. This means they have to achieve the same number of points at KS4 to be considered to have made a national rate of progress. In essence, they are being compared to peers whose prior attainment they are more likely to be below than to match or exceed. This is why the spike at 5.5 is the most important: the average of all the level 4s probably comes out somewhere near 4.5, so over the whole level the test-based prior attainment averages out close to the teacher assessment, in a way it does not at level 5. Therefore schools that usually have a low average fine level for their level 5 students are affected more severely.

Thirdly, teacher assessments tend to be a little more generous than test outcomes. Think of a year 6 child who consistently scores close to the Level 4/Level 5 border on tests. On any given test that child might come out with a fine level of 4.9 or 5.0. The teacher assessment, however, could assume the best-case scenario, so a student who is borderline level 5 suddenly becomes a solid 5.5 in secondary accountability and is expected to progress at a rate way above their natural “test” peers.

The impact of the boycott:

It would be sensible for me to start this section with a giant disclaimer: there have been so many changes (to accountability policy, the impact of the Wolf Report, the early entry policy, exam boards, grade boundaries and so forth) that it is difficult to ascribe any effect on outcomes to any single cause.

With that said and in true education analysis style, I’ll press on regardless.

Let’s look at how the test boycott affected the Attainment 8 estimates between 2014 (based on KS2 test results from 2009) and 2015 (the year with a mixture of test and boycott results).


[Chart: Attainment 8 estimates by prior attainment fine level, 2014 vs 2015]

I’ve highlighted the main point of interest. In 2014 the Attainment 8 estimates increment steadily, whilst in 2015 the 5.5 estimate is below that of 5.4 (remember that the only way to get a fine level of 5.4 is via test outcomes, whilst 5.5 is a mixture of test outcomes and teacher assessments).

This means a number of things. Firstly, imagine a school with two students, one in 2014 and one in 2015, both with a prior attainment of fine level 5.5 based on KS2 test outcomes, i.e. both unaffected by the boycott. In both years the students attain the same: they both get 8 A grades, and in both years this equates to 70 points. Therefore in 2014 the student adds a slight positive to the overall Progress 8, but in 2015 this is a much larger positive for the school, equivalent to over 4 grades across the 8 subjects (10 slots, as English and maths are doubled).

Therefore in 2015 the school appears to have made much better progress with its student(s), but actually it has benefited from the boycott effect in other schools.

Secondly, and this is where secondary schools with large proportions affected by the boycott come in: students with a teacher assessment of level 5 whose true prior attainment would have been lower are compared to the much higher benchmark of 65.67, rather than, say, 59.32. To put it another way, they need to achieve over half a grade more on average per subject just to make the same progress as their peers than they would have if their primary school had not boycotted the test.
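
Using the two estimates quoted above, the size of that penalty can be checked directly (Progress 8 spreads the Attainment 8 gap over 10 slots, since English and maths count twice):

```python
benchmark_55 = 65.67     # Attainment 8 estimate applied to a fine level 5.5
benchmark_lower = 59.32  # estimate nearer the student's likely true prior attainment

extra_a8_needed = benchmark_55 - benchmark_lower
extra_per_slot = extra_a8_needed / 10  # 10 slots: 8 subjects, E&M double-counted

print(round(extra_a8_needed, 2))  # 6.35 extra Attainment 8 points
print(round(extra_per_slot, 2))   # 0.64, i.e. over half a grade per slot
```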

I have used Progress 8 as the example because it is easier to understand than value added; the effect is very similar, and you can see those estimates on page 19 of this document.

Of course, some of this depends on the number of level 5 students you have and their natural average, and this is for schools to know. If your usual average for level 5 test students is a fine level of 5.1, all of this has a greater impact than if the usual average is 5.4.

The crux of everything is that level 5 teacher-assessed (boycott) students are being treated as something they are more likely not to be.

Could the DfE have smoothed the effect better? I believe so. Why not produce attainment/value-added estimates for tests and teacher assessments separately?

For example:

5.5 test = 69.44, 5.5 teacher assessment = 62.88 – then assess progress from these points.
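
In other words, the estimate lookup would be keyed on both the fine level and the basis of the prior attainment. A sketch (the 5.5 figures are those above; the structure itself is hypothetical):

```python
# Hypothetical table of Attainment 8 estimates keyed by (fine level, basis)
estimates = {
    (5.5, "test"): 69.44,
    (5.5, "teacher_assessment"): 62.88,
}

def a8_estimate(fine_level: float, basis: str) -> float:
    """Look up the estimate for a student's prior attainment and its basis."""
    return estimates[(fine_level, basis)]

print(a8_estimate(5.5, "test"))                # 69.44
print(a8_estimate(5.5, "teacher_assessment"))  # 62.88
```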

This could have happened, but the boat has sailed.

I think it would be a sensible hypothesis that the greater the level of boycott affecting your students, the greater the chance of volatility in the school’s results. However, it would also be fair to assume that the boycott affects schools in different ways: some may benefit from it, and some may suffer because of it.

This leads me back to looking at the national picture and some basic data analysis:

As I mentioned earlier, 102 colleagues kindly provided their actual boycott percentage, whilst the DfE simply made public a Yes/No against the 50% threshold. So let’s have a look at what that tells us.

I am using KS2-KS4 value added (best 8) as the measure of progress, because Progress 8 figures were only officially published for schools that opted in to the new accountability measures. All schools have value added (VA) published.

For all mainstream secondary schools:

[Chart: average value added for schools above/below the DfE’s 50% boycott indicator, all mainstream secondary schools]

So, according to the DfE data, schools that had more than 50% of their students affected by the boycott scored a lower VA score than their counterparts.

How does it look for the 103-school sample?

[Chart: average value added by boycott percentage band, 103-school sample]

Because we have actual figures for these schools we can divide them into smaller groups (although not too small, as I only have 103 schools). In this sample, it appears that once more than 25% of students are affected by the boycott, value-added outcomes fall away. Remember that VA and its significance are plastered all over RAISEonline reports, inspection dashboards and so on.
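
For anyone repeating this with their own sample, the banding can be done in a few lines (the bands and school data here are illustrative, not the real 103-school figures):

```python
from statistics import mean

def band(pct: float) -> str:
    """Assign a school to a boycott-percentage band."""
    if pct == 0:
        return "0%"
    edges = [(25, "1-25%"), (50, "26-50%"), (75, "51-75%"), (100, "76-100%")]
    return next(label for edge, label in edges if pct <= edge)

# (boycott percentage, KS2-KS4 value added) per school - made-up numbers
schools = [(0, 1001.2), (10, 1000.4), (30, 999.1), (60, 998.7), (95, 997.9)]

by_band: dict[str, list[float]] = {}
for pct, va in schools:
    by_band.setdefault(band(pct), []).append(va)

# Average VA per band
for label, scores in sorted(by_band.items()):
    print(label, round(mean(scores), 1))
```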

Let’s have a look at the overall spread of school value added scores against KS2 test boycott:

boycott scatter va

Again, I stress that it is important to think about the whole picture: this is only 103 schools, and all of them will have different prior attainment profiles, different proportions of disadvantaged and non-disadvantaged students, and multiple other factors that could affect value-added scores. Any relationship is not black and white.

My personal feeling is that the KS2 test boycott matters most in schools whose usual level 5 profile is towards the lower end of the level: because of the boycott, these students are being treated as 5.5 and therefore find it difficult to provide the school with positive value-added scores. However, greater clarity is required as to the circumstances under which this effect is magnified. Equally, it is important to note that, as value added and Progress 8 (shadow or published data) are zero-sum games, the boycott factor affects everyone in the system. One school’s loss is, in theory, offset somewhere else by a gain. It is a complex arena and probably requires a proper academic examination using multi-level modelling techniques.

However, I would make a plea to those organisations using 2015 results to make decisions about schools: consider the implications of the 2010 KS2 test boycott in discussions with and about schools, and on visits to them.

From a school perspective, if you think you may have been adversely affected by the 2010 KS2 boycott, I believe it would be useful for you to know, or at least have a handle on, the following:

- the actual percentage of your 2015 cohort whose prior attainment was based on a teacher assessment (RAISEonline table 3.1.1);
- the usual average fine level of your level 5 (and level 4) students in test-based years;
- how many of your level 5 students are teacher-assessed and therefore sitting at the 5.5 benchmark.

I hope this blog post has been helpful and provides food for thought.

Peter Atherton 22/04/2016



