
School data is often flawed, useless or even harmful

With inaccurate measures of progress and percentages, much of our data is at best useless and often actually harmful. So what should we do instead, asks Clare Sealy…

by Clare Sealy

I blame Excel. Its seductive power to turn numbers into charts of many colours has lured us down the data dead end.

It makes learning seem like something that can be corralled, tamed, measured. But learning is invisible, unpredictable and elusive.

Progress isn’t linear; children move forward in fits and starts. No assessment is ever perfectly reliable, and the quest for the holy grail of accuracy is futile.

Yet we treat assessments as if we have measured children with some sort of ruler that can give us a definitive, replicable and accurate reading on how things really are.

But it’s more like we’ve tried to measure children with a ruler made from elastic; the benchmarks stretch and contract, and the readings you get depend on how tightly you hold the ends.

This is why standardised assessments come with confidence intervals, and with small cohorts the interval can be really large.
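The effect of cohort size on those confidence intervals can be sketched in a few lines. This is an illustrative normal approximation with made-up scores, not any test provider's actual method (providers publish their own standard errors of measurement):

```python
import math

def cohort_mean_ci(scores, z=1.96):
    """Approximate 95% confidence interval for a cohort's mean score.

    Illustrative normal approximation only; real standardised tests
    come with published confidence intervals of their own.
    """
    n = len(scores)
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)  # sample variance
    half_width = z * math.sqrt(var / n)
    return mean - half_width, mean + half_width

# Hypothetical small cohort of 8 vs a larger cohort of 32 with the same spread:
small = [92, 105, 88, 110, 101, 97, 115, 84]
large = small * 4
lo_s, hi_s = cohort_mean_ci(small)
lo_l, hi_l = cohort_mean_ci(large)
# The small cohort's interval is markedly wider: its average is a far
# less certain estimate, so small-cohort averages deserve extra caution.
```

The point the sketch makes is the one in the text: with a one-form-entry cohort, the interval around the average can be so wide that apparent year-on-year "progress" is just noise.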

All sorts of factors can send a score soaring upwards or plunging downwards – a good night’s sleep, breakfast or the lack thereof, an argument with friends, the particular questions on that test on that day.

The attitude of the teacher and the school culture around assessment also influence results. Some schools are experts at reading the rules around tests and enable children to do as well as they can without crossing the line. Others perceive this as quasi-cheating.

Teacher assessment is no more accurate and even more prone to variations. What is even worse is that we take this flawed data and then use it to choose who gets extra help and what they get extra help with.

However, because the data is not accurate, some children who need help won’t get it and others who probably don’t, will.

Not all learning objectives are created equal. Some are much more crucial than others.

In a system that averages out results, it is possible to look like you are doing OK because you’ve scored a few marks in easier areas, while not being secure in the basics without which later learning will surely founder.

So data as we have previously known it, with its ‘measures’ of progress and percentages ‘on track’, is at best useless and often actually harmful.

So what should we do instead?

First of all, rid yourself of any notion that we can measure anything accurately, particularly progress. Remind yourself that any chart or graph, however impressive, is only as good as the data behind it, which, as we can now see, is not very good.

Learning is messy – embrace it!

Think about the key things you are actually going to teach your children. What are the high-dividend, high-leverage things that form the building blocks on which more sophisticated learning depends?

Track those:

- number bonds
- phonics acquisition
- fluency in reading
- times tables
- ability to use full stops and capital letters consistently
- consistent use of tense
- spellings
- amount of time spent reading independently
- basic geographical locational knowledge
- ability to put previous historical learning on a timeline
- understanding of key scientific vocabulary

Learning about the Romans? Devise a simple multiple-choice quiz out of 10 and use this at the end of a unit to see what has actually been learnt. Revisit this quiz, or a close variant of it, a couple of months later, six months later, a year later; do they still know it?

When children have secure knowledge of these basics, then they are properly equipped to put these to use in more sophisticated tasks.

Standardised assessments have their place as long as we realise that no assessment can be perfect.

They are ‘standardised’ because a large number of pupils have taken them, allowing statistical processes to be applied that enable us to compare how pupils are doing relative to a large data set of others.
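The mechanics behind this are simple to sketch. A raw score is placed relative to the mean and spread of the large norming sample, then rescaled onto the familiar mean-100, SD-15 scale. The numbers below are invented, and real providers use age-adjusted norm tables rather than a single formula:

```python
def standardised_score(raw, norm_mean, norm_sd, target_mean=100, target_sd=15):
    """Convert a raw score to a standardised score on the common 100/15 scale.

    Illustrative sketch: real test providers derive scores from
    age-adjusted norm tables, not one global mean and SD.
    """
    z = (raw - norm_mean) / norm_sd          # position relative to the norming sample
    return round(target_mean + z * target_sd)

# Hypothetical norming sample: mean raw score 34, SD 8.
standardised_score(34, norm_mean=34, norm_sd=8)  # a pupil exactly at the mean → 100
standardised_score(42, norm_mean=34, norm_sd=8)  # one SD above the mean → 115
```

Seen this way, a standardised score is only ever a relative statement about where a pupil sits in a distribution, which is why it says more about groups than about any individual child.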

This kind of comparison is much more useful for whole groups of children than for individuals; the results of any one child should not be taken too seriously.

It is good to benchmark the performance of a whole year group against others nationally from time to time. Once a year is probably plenty. And make sure that the test you use matches the curriculum you teach!

There is no point whatsoever in doing an assessment that tests stuff you haven’t yet taught. This is an expensive waste of time that will only depress you. Don’t do it!

Governors may have grown used to being able to hold leaders to account via charts and graphs. However, we need to come clean and confess that the statistical significance of this kind of information is pretty suspect.

Instead, spend your valuable time collecting information about how children are doing that actually helps you teach them the things that they should know but don’t.

Find the bottlenecks getting in the way of them making progress and teach those. Data should be a prelude to action.


Clare Sealy is head of curriculum and standards for the States of Guernsey. Follow her on Twitter at @ClareSealy.
