When we make judgements about whether pupils and students are making progress, there can be a tendency to think that data on a spreadsheet indicates something absolute about standards.
I think the reason this happens is that the data generated through public tests and examinations at the end of primary and at Key Stages 4 and 5 can tell us something about standards.
It’s never 100 per cent, but we can talk with some confidence about pupil and student outcomes.
We are able to do this because pupils have taken the same tests and papers as hundreds of thousands of other pupils. The papers have been marked and moderated at a national scale, and while not perfect they do give an indication of performance.
The same cannot be said for internally generated data. This is because the same conditions don’t apply in terms of the scale and moderation which sit behind the national tests.
I think we have got to stop making out that internal data has the same currency as externally validated data.
I have been a governor in three schools and a trustee in a MAT and I was never provided with information that gave a true indication of how pupils were getting on.
The numbers were there on the spreadsheets, but no-one was able to tell me what they meant other than they had gone up, so pupils must have made progress!
How do we get round this?
Well, schools that have recognised the difficulties of internally generated numbers have realised the power of sampling.
During staff meetings in primary, or faculty and subject meetings in secondary, samples of pupils’ work are discussed and critiqued.
Are these ‘products’, as Tim Oates calls them, such as writing or low-stakes quizzes, reasonable for pupils and students of this age?
This sort of joint discussion and moderation is powerful.
First, it supports colleagues to come to an agreement about what quality work looks like.
Second, it provides the language and terminology to help pupils and students improve.
And third, it provides pretty reliable information for senior leaders, governors, trustees and any external accountability processes.
It’s a healthier way of going about gathering evidence of how pupils are getting on.
The only issue is that it doesn’t sit neatly on a spreadsheet!
I have run two webinars to unpick this in relation to Key Stage 3. The first webinar covers:
Some of the paradoxes of assessment
The difference between external and internally generated data
Why the data on a spreadsheet only tells part of the story
The role and impact of formative and summative assessment
Why we are better at making judgements than we think we are
The second session considers alternative options for making judgements about what students achieve at KS3.
Tim Oates, who led on the review of the national curriculum, had some helpful insights about assessment.
He made the case that we can infer how well students are accessing and learning the curriculum through the things they ‘produce’.
I identify a number of ‘products’ that help give us insights.
Spoiler alert: they don’t sit neatly on a spreadsheet!
If you’d like to access the webinars they are part of the annual subscription to Myatt & Co.
If you or your school don’t already have an annual subscription, it’s easy to sign up for a free trial, and you get access to the popular subject networks and ongoing professional development.
Until next time
Mary