The first robust cross-national evidence on the impact of COVID-19 on the learning outcomes of primary school-age children was released yesterday. The 2021 Progress in International Reading Literacy Study, or PIRLS, was the fifth round of a survey organized by the IEA since 2001. Conducted every five years, PIRLS assesses the reading skills of grade 4 students, who are on average about 10 years old. In total, students from 57 mostly upper-middle- and high-income countries took part. Progress relative to 2016 can be assessed for 32 of these countries.
This was a highly complex survey carried out under the most challenging conditions, in the middle of the unprecedented crisis of the COVID-19 pandemic. Unlike previous rounds, in which countries were neatly assigned to one of two 3-month periods for data collection, school closures and the uncertainty brought about by successive pandemic waves made it necessary to spread the data collection over a longer period. New Zealand and Singapore were the first to begin collecting data in October 2020, while England only finished its data collection in June 2022. Some countries administered the assessment in schools that had been open for months, while others had to overcome many problems, not knowing until the last minute whether schools would be open or closed. The fact that 14 countries needed to delay the administration of the assessment poses further obstacles to comparison.
Overall, at first glance, the 2021 PIRLS findings could be seen as confirming the common belief that COVID-19 had a substantial negative impact on learning: 21 of 32 countries performed worse than in 2016, while 8 retained the same level and 3 improved. In the IEA's words, these findings show 'at least some widespread negative impact from the pandemic on reading achievement at the fourth grade' (p. 42).
But another way to interpret the results is that they are not as bad as might have been expected given the severity of the disruption. First, in 10 of the 21 countries whose achievement scores fell between 2016 and 2021, the score had also fallen between 2011 and 2016 – for instance, in Denmark, Germany and New Zealand. Second, in absolute terms, the average decline in PIRLS scores between 2016 and 2021 was 8 points, approximately equivalent to one fifth of what children learn in a school year.
The countries that took part in PIRLS were mostly among those best prepared to continue learning online; a small negative impact may therefore not be surprising after all – and it could also be a testament to the resilience of these education systems.
But it is certainly too early to tell the full COVID-19 story even for these countries. A presumed defining characteristic of the pandemic is that its impact on learning has been unequal. Whether this was the case is one of the questions the PIRLS results cannot yet answer. And as the PIRLS team went to great lengths to emphasize during the launch, country-specific studies will be needed to assess the impact of COVID-19. Such work can only begin after the dataset has been released, which is currently scheduled for next month. Results will need to be interpreted in the context of the length of school closures and the coverage of distance learning programmes.
The data also show that quite high numbers of students were excluded from the study in some countries for various reasons, including remoteness, disability and language. While no data set is ever perfect, it would be wrong to overlook the fact that the overall percentage of students excluded reached 9% in countries such as Albania, Denmark and Türkiye.
From our global monitoring perspective, we welcome the participation of countries that do not tend to join cross-national assessments, such as Uzbekistan, where 70% of children achieved the PIRLS low international benchmark, which is equivalent to the minimum proficiency level for SDG 4 reporting.
Another issue to take into account is that 26 countries used a new and innovative interface in which students' reading was assessed digitally, allowing them to navigate through texts and activate a panel presenting the questions. The other 31 countries administered the assessment on paper. The benefits and challenges of technology in assessment are among the issues covered in the forthcoming 2023 GEM Report on technology in education, due on July 26. This is clearly the direction many assessments are taking, with many benefits for tracking learning and providing feedback, but the shift is also worth bearing in mind when comparing one year's results with another's and drawing conclusions. The panellists at the 2021 PIRLS launch questioned what concept of reading we should be evaluating, as children are increasingly faced with more than just text.
For the time being, it is important to congratulate the IEA and country teams that worked hard under adverse circumstances to deliver the PIRLS findings. The debates sparked by the release, and what we can learn from them for our future work, are a testament to the importance of such surveys for global education monitoring and progress. Beyond the headlines of yesterday, we look forward to the analyses that will come out of the data.
Explore GEM Report content on learning here