Published in The Australian, Wednesday 27 January 2010
The My School website is the greatest increase in transparency in decades. Parents and families can use the website to compare the average score of students at their school with those at other, similar schools. Educators can assess their school against others. Policymakers and education administrators can learn which initiatives make a difference. School improvement initiatives can be targeted at schools with low-performing students.
The My School website is a step in the right direction, but overseas experience shows these school performance indicators can be biased against schools in lower socioeconomic communities. The solution is not to give up on transparency altogether, or to abandon the National Assessment Program – Literacy and Numeracy, as the Australian Education Union is advocating. The benefits to students are too great. We need to improve the My School website and publish value-added measures of school performance that focus on student progress.
We must improve performance in school education. Substantial increases in expenditure have not delivered better outcomes. The performance of Australian students in international testing has stagnated, and has actually declined in reading literacy, indicating we are investing in the wrong areas. We have a moral imperative to lift the progress made by the 30 per cent of Australian year 9 students who perform at only the basic minimum standard of writing literacy.
The Grattan Institute report Measuring What Matters: Student Progress, released today, argues for replacing the school performance indicators on the My School website with more accurate value-added measures. These measure the progress made by students, calculating each school's contribution more accurately by accounting for differences in students' backgrounds. For these reasons, teachers, school associations and unions in other countries have advocated the introduction of value-added school performance measures.
These measures assess the contribution schools make to student progress over time, in contrast to the present focus on student performance at a single point in time. So, instead of measuring a school's performance by the average NAPLAN score of its students in, say, year 5 numeracy, value-added measures focus on student progress between years 3 and 5 – the increase from a student's score in year 3 numeracy to their score in year 5 numeracy. Value-added measures calculate the progress made by each student and compare the progress made by students at different schools. The focus on student progress emphasises what schools can influence, and avoids the problem with the present measures, which are too strongly influenced by factors outside the school's control, such as students' backgrounds.
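To make the arithmetic concrete, here is a minimal sketch of how such a calculation might work. The schools, scores and the simple cohort-average benchmark are all invented for illustration; real value-added models adjust for student background and use far richer statistical machinery.

```python
# Minimal sketch of a value-added calculation (illustrative only).
# All schools and scores are invented; real models adjust for student
# background and much more.
from collections import defaultdict
from statistics import mean

# (school, year 3 numeracy score, year 5 numeracy score)
students = [
    ("School A", 380, 470), ("School A", 400, 495), ("School A", 420, 500),
    ("School B", 500, 560), ("School B", 520, 575), ("School B", 540, 600),
]

# Step 1: a student's raw progress is the year 5 score minus the year 3 score.
gains = [(school, y5 - y3) for school, y3, y5 in students]

# Step 2: the cohort-wide average gain serves as the expected-progress benchmark.
expected_gain = mean(g for _, g in gains)

# Step 3: a school's value-added is its students' average gain relative to
# that benchmark, rather than its average raw score.
by_school = defaultdict(list)
for school, gain in gains:
    by_school[school].append(gain)

for school, school_gains in sorted(by_school.items()):
    print(f"{school}: value-added = {mean(school_gains) - expected_gain:+.1f}")
```

On these invented numbers, School B tops a raw-score league table, yet School A adds more to its students' progress: precisely the distinction that value-added measures are designed to capture.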
Opponents of publishing measures of school performance raise two concerns: that schools in poorer communities will be unfairly stigmatised, and that the data will be turned into ill-conceived league tables. Our approach addresses both of these concerns.
In the absence of any other information, people will continue to produce league tables based on year 12 results, which are dominated by the quality of the students when they started at the school. The raw NAPLAN results could widen these comparisons beyond year 12, but would still mainly reflect student backgrounds. Like-school measures are a step in the right direction, but more accurate measures are needed. It would be much better to publish data on the progress made by students at different schools, comparing the value that each school adds to its students.
We hear a lot from the government about improving school choice for parents. We hear a lot from unions and others about unfairness to school principals and teachers, and calls to ban student assessments. But the focus should be on the students.
By itself, even the most accurate measure cannot make a difference. We also need to ensure that measures are used to improve instruction and learning. Overseas, where assessments were not followed up with action, results did not improve, leading some to claim, erroneously, that assessment itself does not lift results. In many other countries, however, assessment measures were accompanied by programs requiring teachers, principals and education administrators to follow up on the results. Not surprisingly, results improved.
Value-added measures of student progress empower school principals and teachers, who have the greatest impact on student learning.
Schools need to be able to identify the students, subject areas and grade levels in which they are effectively contributing to student progress. Effective programs and instruction can then be expanded, and less effective areas developed. In the present system, the focus is on what students' NAPLAN scores were last year. Measuring student progress shifts the focus to the student: how they learn, and their personal progress. This is essential given that differences in student performance within schools are large in Australia compared with other countries.
Significant improvements come from building individualised instruction and lesson planning around multiple assessments that identify each student’s learning trajectory.
We need to support effective student assessment and value-added measures of how schools contribute to student progress. The benefits are too great and the problems in Australian school education too large to ignore. Enough of the debate has focused on parents, teachers and school principals. It should all be about lifting student progress.