Tuesday 22 December 2020.
NAPLAN seems to raise the temperature of many people, from teachers to academics to parents. Sometimes the response is visceral, and in some cases NAPLAN puts undue pressure on students and teachers. This includes, for example, commercial providers capitalising on the test by selling preparation resources and exploiting parents’ anxieties, making them feel they should be doing more to ensure their child’s NAPLAN success. This is not what NAPLAN is about.
Australia needs a positive national testing approach, and the recommendations from the NAPLAN Review provide many ideas to help us move in the right direction. NAPLAN, as a national assessment, is useful for benchmarking performance in numeracy and mathematics at a system level. Such assessment is a hallmark of high-quality education systems globally when it is used as a tool to help deliver improved educational outcomes. Results can be used for multiple purposes, providing opportunities for system and school improvement and guiding individual student learning. Without such a program, the value of mathematics in society would be diminished and we would lose the only national census of numeracy ability.
No assessment is perfect, but The Mathematical Association of Victoria (MAV) agrees with the recommendation to move the assessment to the start of the year, as this would reduce the focus on ‘teaching to the test’. Pressure needs to be reduced on parents and students, and on teachers, who at times feel compelled to teach to the test despite official advice against the practice.
It is important to remember that NAPLAN is a snapshot in time and is subject to many variables. When used well it can help identify areas for improvement. Results should not be used to compare students or schools (as currently occurs on the My School website), but to enhance opportunities, including the development of targeted interventions for students with identified gaps in learning. The right timing may help reinforce NAPLAN’s important status as a low-stakes national assessment program (formative assessment that can help identify where students need support), while reducing pressure on teachers to prepare students during otherwise useful teaching time.
NAPLAN data should be made available to teachers quickly, so they can carry out targeted teaching while the data is current. In particular, if the assessment is moved as recommended from Year 9 to the start of Year 10, at-risk students have only that one final year to improve their numeracy and literacy before they move on to various work and education pathways. Delays in data availability will mean less time to help Year 10 students achieve their potential.
MAV believes in continuing the move towards online assessment. This is required, initially, for the fast turnaround of results recommended in the NAPLAN Review. Further, adaptive testing, with questions selected based on a student’s earlier responses, can ensure the question items are neither too easy nor too difficult, but appropriately challenging. This allows assessments to go deeper into concepts and genuinely identify students’ strengths and areas for support. The opportunity exists to build tools that better connect students’ results to curriculum resources and pedagogical strategies, directly supporting teachers in responding to assessment outcomes. Of course, equitable access to technology must be a first priority.
The data set provided to teachers is a highly important consideration. The data supplied must be useful for them, including sufficient detail to enable teachers to identify where to target support for mathematical thinking and knowledge. The item (question) level analysis is a valuable report that enables teachers to use the data effectively at the classroom level. As the recommendations are to retain the assessment for all students, the data will be available and should therefore be given to teachers at the student level. The recommended focus on student growth data is thereby supported.
MAV wants to see consideration given to how the data is reported publicly, to reduce the focus on comparing school performance. National assessments should not be about public comparisons, league tables and competition that result in undue pressure. However, having system-level data to compare jurisdictions and schools is important for gauging and learning from what is working, allowing the development of evidence-based improvement initiatives.
MAV also supports the proposed expansion of NAPLAN to include STEM, and capabilities such as critical and creative thinking. Mathematics – the M in STEM – underpins the sciences, and related critical thinking skills, like problem solving, underpin many aspects of careers and life. How to successfully evaluate these capabilities is a challenge that needs deeper investigation, as it is not yet well understood. There is a risk that this will not be done well through multiple-choice questions or whole-of-population testing; other countries assess such capabilities using PISA-style questions administered to a sample of students.
As we move forward in improving the national assessment system, perhaps a change of name for the program to ANSA (Australian National Standardised Assessment) could help leave behind some of the old baggage. After all, the assessment should be about continuous improvement, both of the assessment itself and of our students’ abilities, through appropriate use of data and teacher time.
By Peter Saffin, CEO, on behalf of The Mathematical Association of Victoria.
A shortened version of this article was published in the following publications in November 2020. The MAV Board wanted to provide the full version to support educators in understanding its views.