The Value of Value-Added Data
With CIE results released last week, and Edexcel soon to release theirs, schools are busily analysing their IGCSE and A Level results and preparing statistics & infographics to share with the wider parent community.
As parents, we like numbers. They are concrete, black and white, easy to understand. And in a competitive international schooling market like ours here in Malaysia, these statistics give us something solid with which to compare schools.
However, the numbers of A*-A grades and A*-C grades are just a starting point. There is another very important type of data that schools use - and that we, as parents, should be asking to see when choosing between schools. It's called value-added (or 'residual') data.
What is Value-Added, or 'Residual' data?
In a nutshell, value-added data shows how much (or how little) progress a student has made during their schooling.
One website explains it nicely as follows: "Value Added is the actual difference a school makes to the educational attainment of students between Years 7 and 11 and is all about comparing start and end points, student by student, and then measuring the difference travelled."
Basically, value-added/residual data shows how much 'extra' achievement a child has made once you take their natural ability into account. In short, it can be one way of determining how effective the teaching has been. That's why almost all international schools use residual data as an indicator of faculty achievement, rather than the number of A/A* grades alone.
Where does the data come from?
Most British international schools use YELLIS testing to determine a child's 'starting point', or baseline ability. Each student sits their YELLIS test at the start of Year 10, and they can't really study or prepare for it. The online test has three sections and lasts around an hour. It's adaptive: the questions get easier or harder depending on the student's performance throughout.
CEM, the organisation that designs and conducts YELLIS testing, explains value-added extremely clearly in this document, if you would like to learn more. The testing is used by thousands of schools globally and is considered very reliable.
From this test, each child is given a 'band' of predicted achievement for each of the test's sections and an overall band: D, C, B or A.
Schools then compare this data with what the students actually achieve in their final IGCSE examinations. Without getting too complicated: the difference between these is the value-added score. The score will be either positive, neutral or negative, and is a strong indicator of the impact the school has had on the student's outcomes.
A value-added score of 0 indicates that students achieved what was expected of them, based on that YELLIS baseline testing.
A value-added score of 0.5 indicates that students achieved approximately half a grade higher than what was expected.
A value-added score of 1.0 indicates that students achieved a whole grade higher than what was expected.
A value-added score of 2.0 indicates that students achieved an average of two grades higher than what was expected.
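The arithmetic behind those scores can be sketched very simply. Below is a minimal, illustrative example - not CEM's actual methodology - which assumes a hypothetical numeric mapping of grades (the `GRADE_POINTS` table and `value_added` function are both invented for illustration): each student's achieved grade is compared with their predicted grade, and the differences are averaged across the cohort.

```python
# Hypothetical mapping of letter grades to points (one point = one grade).
# This is an illustrative assumption, not the scale CEM actually uses.
GRADE_POINTS = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3}

def value_added(predicted, achieved):
    """Average difference between achieved and predicted grades,
    in whole grades. Positive = cohort beat expectations."""
    diffs = [GRADE_POINTS[a] - GRADE_POINTS[p]
             for p, a in zip(predicted, achieved)]
    return sum(diffs) / len(diffs)

# Three students predicted B, B, C who actually achieve A, B, B:
score = value_added(["B", "B", "C"], ["A", "B", "B"])
print(round(score, 2))  # roughly 0.67: two-thirds of a grade above expectation
```

On this toy scale, a cohort that achieves exactly its predicted grades scores 0, matching the neutral case described above.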
International schools in Malaysia (and most countries) love to proclaim the percentage of their students that achieved A/A*s. While these figures are often impressive, we have to remember that a high percentage of As and A*s is not in itself impressive if those students were very capable and expected to achieve those results anyway. In that case, the statistic simply shows a normal outcome for those students - nothing surprising. The school hasn't really 'added' anything to their natural ability and expected achievement.
However, it would be an incredibly impressive statistic if actually most of those students were targeted B grades. In that case, the statistic would demonstrate that the teaching had had a profoundly positive impact on learning, and that many students had achieved amazing progress.
It would demonstrate that the school had really added value to those students (hence the name 'value-added') and would, in fact, suggest it was a much better school.
Examples of Value-Added Scores (2018)
Eton College: +0.17
Westminster School: +0.06
St Paul's School: +0.24
St Paul's Girls' School: +0.23
Wycombe Abbey School: +0.11
Finding Out Value-Added Data
In the UK, schools are required to publish their value-added scores in school performance tables (like this one) for parents to easily see.
However, there is no requirement to do this for international schools and thus many do not publish their value-added figures. It's therefore incredibly important that we ask questions!
Ask schools for their IGCSE and/or A Level value-added or residual scores. You can ask for the overall cohort's results, or for specific subjects.
So, you might ask a school, "What was the Science Faculty's residual score for IGCSE Biology last year?" (And, by the way, they should have this data to hand).
If they say -0.4, it would suggest that students underperformed by nearly half a grade (bad news!).
If they say +0.6, it would suggest that students exceeded expectations by just over half a grade (good news!).
Some Final Notes
If an international school is selective with the students it admits - if it only accepts high-achieving pupils anyway - then of course the exam results should be impressive. Some international schools here in Malaysia are far more inclusive than others. That makes their strong academic results all the more impressive.
While value-added/residual data isn't the 'be all and end all', it is a very important piece of the puzzle when trying to evaluate the effectiveness of teaching and learning in a school.