Published at The Conversation, Tuesday 15 May
NAPLAN, the National Assessment Program – Literacy and Numeracy, has been a prominent part of Australia’s education landscape since 2008, when it was introduced by then Education Minister Julia Gillard.
It’s a controversial test, lauded by some but disliked by many.
Ten years on, the role of NAPLAN is in question, with some arguing it should be dropped entirely. Here’s why it’s a vital navigation tool for policy makers and researchers.
What is NAPLAN?
Every May, Australian school students in years three, five, seven and nine sit standardised tests in reading, writing, numeracy, spelling and grammar.
A great virtue of NAPLAN is that each domain is scored on a single scale, built on a common learning progression that spans every level of the tests. This means achievement can be compared across different school year levels, and lets us analyse the learning growth of specific groups of students as they move through school.
I have consistently argued the best way to lift achievement is to maximise individual learning progress. The same theme underpins the Gonski 2.0 report. And if we want to lift learning progress at scale, we must be able to measure it.
What is NAPLAN used for?
There are many claims about the benefits of NAPLAN, each of which deserves scrutiny on its merits. For example, using NAPLAN:
- policy makers and researchers can better understand student performance, informing system-wide policies and the allocation of support and resources to schools
- teachers can use the data as a diagnostic tool to improve teaching in the classroom
- parents can make more informed choices about where to send their children, via the My School website, which publishes school-level results
- parents have more information about how their child is progressing relative to others.
Focusing just on the first point, here are five things we know a lot more about because of NAPLAN.
1. Achievement gaps for Indigenous students
Indigenous students don’t achieve at the same level as their non-Indigenous peers. While this has been known for decades, we would not know just how large some of these gaps are without NAPLAN, or how the gaps have changed over time.
At a national level, year nine Indigenous students are on average three years behind non-Indigenous students in numeracy, 3.4 years behind in reading, and 4.2 years behind in writing.
[Chart: Indigenous students are three to four years behind by year nine]
Translating NAPLAN scores into equivalent year levels makes it much easier to understand and compare performance across student groups. But the Indigenous gap is so large that no fancy mathematics is needed: year nine Indigenous students scored on average 465 in NAPLAN writing in 2017, below the 480 non-Indigenous students scored in year five.
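To make the conversion concrete, here is a minimal sketch of how a scale score can be mapped to an equivalent year level by interpolating between average scores at each tested year. The calibration points below are hypothetical placeholders, not real NAPLAN averages, and the published analyses use a more careful statistical mapping.

```python
# Sketch: mapping a NAPLAN scale score to an "equivalent year level" by
# linear interpolation between average scores at each tested year level.
# The calibration points are HYPOTHETICAL, not actual NAPLAN averages.

CALIBRATION = [(3, 400), (5, 480), (7, 540), (9, 580)]  # (year, avg score)

def equivalent_year_level(score: float) -> float:
    """Interpolate a scale score onto the year-level axis."""
    points = sorted(CALIBRATION)
    if score <= points[0][1]:   # clamp below the calibrated range
        return float(points[0][0])
    if score >= points[-1][1]:  # clamp above the calibrated range
        return float(points[-1][0])
    for (y0, s0), (y1, s1) in zip(points, points[1:]):
        if s0 <= score <= s1:
            return y0 + (y1 - y0) * (score - s0) / (s1 - s0)

# The article's example: 465 (year nine Indigenous average in writing)
# sits below 480 (year five non-Indigenous average) on this toy scale.
print(f"465 maps to year level {equivalent_year_level(465):.1f}")
print(f"480 maps to year level {equivalent_year_level(480):.1f}")
```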
The gaps are even larger in very remote areas where Indigenous students are more than seven years behind in writing.
2. Progress gaps for students in disadvantaged schools
Students in disadvantaged schools perform worse. Again, not news.
What’s more of a surprise is that, when we tracked a cohort of Victorian students across all four of their NAPLAN tests, the size of the gap nearly tripled, from one year and three months in year three to three years and eight months in year nine.
Even more concerning was what we found when we compared students with comparable capabilities early in their schooling. From the same year three starting score, students in disadvantaged schools fall more than two years behind by year nine, with potentially high-achieving students missing out the most.
[Chart: Students with similar early potential do worse in disadvantaged schools, especially high-achieving students]
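A stylised sketch of this like-for-like comparison is below, assuming a hypothetical table of linked student records; the actual analysis tracked individual Victorian students across their four NAPLAN tests and applied more careful statistical controls.

```python
# Sketch: a like-for-like progress comparison. Band students by their
# year three starting score, then compare average year nine scores across
# school types within each band. All records here are HYPOTHETICAL.
import pandas as pd

students = pd.DataFrame({
    "year3_score": [410, 415, 412, 408, 500, 505, 498, 502],
    "year9_score": [560, 575, 550, 545, 600, 640, 590, 585],
    "school_type": ["advantaged", "advantaged", "disadvantaged", "disadvantaged",
                    "advantaged", "advantaged", "disadvantaged", "disadvantaged"],
})

# Same starting band, different school types: compare year nine outcomes.
students["start_band"] = pd.cut(students["year3_score"], bins=[400, 450, 520])
like_for_like = (students
                 .groupby(["start_band", "school_type"], observed=True)["year9_score"]
                 .mean())
print(like_for_like)
```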
3. Comparisons among states and territories
The states and territories are responsible for running school education in Australia. Different states and territories take different approaches. In theory, this means jurisdictions can learn from each other. But this requires accurate comparisons, which take account of socio-economic differences. For example, parents in some states have higher levels of education than in others.
On a like-for-like basis, comparable students are achieving at very different levels depending on where they live in Australia.
[Chart: States and territories have very different levels of achievement when compared on a like-for-like basis]
Looking at the next level of detail makes it clear no state or territory can afford to be complacent.
For example, New South Wales has the highest levels of achievement of any state for students whose parents have a university degree, but its disadvantaged students make less progress than the national average. By contrast, Victoria has the highest achievement levels for students whose parents didn’t finish school, but is not stretching its most advantaged students in the same way.
4. Changes over time
NAPLAN has now been running for long enough to identify trends over time. Too often, the story is one of stagnation. But there are bright spots, including the early years in Queensland.
[Chart: Relative to the rest of Australia, Queensland has increased its year three numeracy and reading scores by three to four months since 2010]
It’s interesting to note that 2010 was the first NAPLAN cohort in which Queensland students had started school with a Prep year. This probably accounts for some of the improvement. But it’s also notable that the relative levels of achievement have improved over time, not just in a single step, suggesting Queensland’s education system is getting some other things right.
The richness of NAPLAN data allows us to spot much more subtle patterns as well. For example, while very remote Indigenous students are doing very poorly in writing, there are signs of improvement in this cohort in NSW. This level of granular analysis would not be possible without the NAPLAN tests being sat every year by all schools.
5. Identifying high-growth schools
The “holy grail” for many advocates of NAPLAN is to use it to identify the schools that are most effective in maximising student learning growth, and to apply lessons from those schools to others not adding as much value.
This is easier said than done, not least because the socioeconomic mix in each school affects the rate of learning growth as well as the students’ achievement.
New analysis, taking socioeconomic factors into account, shows that about 8% of schools have “beaten their odds” in all five cohorts for which we have reliable NAPLAN progress data. If beating the odds in each cohort were a coin toss, only about 3% of schools (0.5 to the power of five, or 3.1%) would manage it five times in a row. We can therefore confidently say that at least 5% of Australia’s schools are routinely outperforming.
[Chart: About 5% of schools are routinely doing better than we would expect given their student population mix]
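The arithmetic behind that claim is simple enough to check directly; the 8% figure comes from the analysis above, and the rest is basic probability:

```python
# The coin-toss benchmark behind the "at least 5%" claim. If beating the
# odds in any single cohort were pure chance (probability 0.5), the share
# of schools doing it in all five cohorts would be 0.5 ** 5.

p_chance = 0.5 ** 5           # ~0.031, i.e. about 3%
observed = 0.08               # share of schools beating their odds in all five cohorts
excess = observed - p_chance  # ~0.049, about 5 percentage points beyond chance

print(f"Chance rate:  {p_chance:.1%}")  # 3.1%
print(f"Observed:     {observed:.0%}")  # 8%
print(f"Excess:       {excess:.1%}")    # 4.9%
```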
Of course, NAPLAN can’t tell us why these schools are different. Maybe it’s what the schools and their teachers are doing. Maybe it’s the nature of their incoming cohort. Whatever it is, we need to know.
Where to from here
NAPLAN is an imperfect navigation tool. It certainly doesn’t have GPS-like precision. But giving up on NAPLAN would be like 19th-century sailors dumping their sextants and chronometers in favour of navigating by the stars, wind and currents.
Maybe we need to rethink how NAPLAN is used, but overall, it should be kept.