Making Sense of the Nation’s Report Card

I knew the freakout over the release of this year’s scores for the National Assessment of Educational Progress (NAEP) was going to be big, but I didn’t know it would be quite so mindlessly silly and counterproductive.

“The Pandemic Erased Two Decades of Progress in Math and Reading,” said the New York Times in a headline.

Not to be outdone, The 74 declared, “Two Decades of Growth Wiped Out by Two Years of Pandemic.”

The news these headlines are meant to report is that the NAEP reading and math scores of 9-year-olds who took the tests in early 2022 are significantly lower than those of the 9-year-olds who took them in early 2020, prior to the onset of the coronavirus pandemic.

This framing of the lower scores as a “loss of two decades of growth” is predicated on the fact that NAEP scores have been gradually rising over the last twenty years, so this year’s scores are roughly equivalent to those of students who took the exam in 2002.

If there is a more alarmist way to report these scores, I can’t think of it, but needless to say, the nation’s 9-year-olds have not regressed to a minus-11-year-old level of proficiency in reading and math.

Lots of public figures have seized on these results to advance their favored educational beliefs: that schools should never have closed, that remote learning is a disaster, that teachers unions are a scourge, that standardized tests are meaningless, and so on.

But rather than wade into those arguments, I think it’s important to examine these scores with a more nuanced perspective, to see what they might be telling us, what they might not be telling us, and where we need to wait for more data.

Some propositions I’m confident in:

  1. School was definitely disrupted.

There’s no doubt that student schooling was significantly altered by the disruption of the pandemic. Not all of these challenges were felt equally, but every student experienced some level of disruption.


  2. The test was taken while the disruption was ongoing, and this had a negative effect on scores.

Even under normal circumstances, the NAEP is more useful as a snapshot of the atmosphere and conditions students are working under than as an absolute measurement of how much students are “learning.” This is even more true of the most recent test, which was administered under unusual and challenging circumstances that disrupted the actual taking of the test. Imagine the distraction and disengagement students were dealing with while being asked to sit for this exam. It’s like judging an athlete who is just starting to practice after coming back from an injury, rather than one who has had a chance to reacclimate and recondition.


  3. Factors other than what students may or may not have “learned” contributed to lower scores.

It’s entirely possible that there was a lack of explicit test preparation, which left students unfamiliar with the specifics of how to do well on the exam.

Some students were experiencing the grief of having lost a caregiver to the pandemic.

More students than usual may have disengaged entirely from schooling, and not mustered even a token effort at performing well.


  4. It will not take 20 years for scores to return to the pre-pandemic status quo.

This is perhaps the most frustrating part of the media’s framing. Consider a different metric disrupted by the pandemic: air travel. The number of domestic flights went from over 800,000 in 2019 to 335,000 in 2020. In 2021, as travel started to return to normal, that number increased to 605,000.

That 605,000 number is on par with the number of flights in 2003. How many stories about the collapse in air travel are you reading?

In fact, the coverage is the opposite: airlines are having a hard time getting back up to speed after the disruption of the pandemic.

Why would the lives of 9-year-olds be any different?

Or, to think about it another way, this year’s 9-year-olds scored at the same level that adults now in their late 20s did when they were 9. Those adults seem to have survived going to school at a time when students were apparently much less proficient in reading and math.


  5. There’s nothing in this data that tells us anything about the specific impact of schools being closed to in-person instruction.

Reading scores of students in city schools — those most likely to be closed the longest — were flat, while suburban and rural students saw declines.

All regions saw declines in scores, with no difference between the South where students returned to school earlier, and the Northeast, where closures tended to last much longer.

The much bigger story appears to be about disruption in general, something unavoidable no matter what policy choices were made.

Don’t get me wrong, I’m not saying we should ignore findings like these — though we always overinterpret the meaning of scores on a single standardized test. I am saying that we need to consider this data for what it’s really telling us.

One clear message from below the topline numbers is that students who were already struggling struggled even more on the exam, with students in the lower percentiles seeing greater score declines than those in the higher percentiles.

Access to resources also seems to have had an impact on scores. Of the students who scored at or above the 75th percentile, 83% reported having access to their own computer, versus only 58% of students scoring at the 25th percentile.

Rates of having a quiet place to work and consistent access to a teacher were also significantly higher among those scoring at or above the 75th percentile.

These scores tell us something we already knew: a global pandemic disrupted our lives and threw us off-kilter.

The students who always need more help need even more help. 

That story is as old as time.