National Assessment: What the Latest Results Should Tell Us

NAEP provokes important questions but doesn't provide the answers -- and that's a good thing as long as we're prepared to keep the discussion alive and not stifle it with premature, extravagant, and politicized rhetoric.

The Nation's Report Card came home last week, and the results weren't pretty: American 4th and 8th graders seem to know less math, and 8th graders seem to read less well, than they did two years ago. Kids in only three jurisdictions -- the District of Columbia, Mississippi and Department of Defense schools -- showed gains in math. But the results are complicated. Although the performance of black, white and Hispanic 8th-graders slipped in both subjects, the decline seems to have been more pronounced for girls and white students than for black, Hispanic, poor and disabled students, and kids in cities seem to have fared better than their peers in the suburbs. Thirteen states showed gains in 4th grade reading, and one showed a gain in 8th grade reading. Everyone seems to be better off than 25 years ago, and although different groups have fluctuated over the years, this is the first time that the overall national average has dropped.

These findings come from the National Assessment of Educational Progress (NAEP), which has been providing biennial reports on 4th and 8th grade students in math and reading since the early 1990s. NAEP is different from most standardized tests used for accountability and admissions, and, because of its unique sampling and reporting methodologies, it provides the most trusted data on the academic performance of the nation's students.

Nothing is more precious to the American people than the education of their children, and it's a good thing we have NAEP to show us how we are doing. Especially in an era of turbulence in the structure, governance, and content of schooling, having a reliable source of data is well worth the substantial investment we've made in the design and implementation of the program, which goes back to the mid-1960s. So when the NAEP results come out, we should pay attention.

Inevitably though, people will be tempted to rush to all kinds of judgments to explain the findings -- those who are worried about the reform movement, for example, might attribute the score decline to the "Common Core," while others who support the reforms will say it's time to dump NAEP. To put it bluntly, neither of those reactions -- to panic or to scoff -- is helpful in understanding and improving our educational condition.

Why not panic? For one thing, a drop of two or three points during a two-year period, though "statistically significant," doesn't exactly constitute a trend. Indeed, one of the advantages of NAEP is that it enables a longer-term view: yes, we should wonder -- and a little worrying is fine -- why the average 4th grade math score dropped from 242 to 240, and why the average 8th grade reading score dropped from 268 to 265. But we should also keep in mind that the 2015 averages, in both math and reading and at both grade levels, are significantly higher than in the early 1990s.

For perspective, it helps to remember that during the early 1970s American economic productivity growth dropped, which led to widespread panic about our international competitiveness, the quality of our workforce, and our standard of living. Only when the short-term data were put in their proper temporal perspective did we regain our senses and realize that while we had economic problems, the sky hadn't really fallen. Learning from that lesson, let's use this month's NAEP data not as a reason for panic, but rather as an invitation to reflect on what they mean and what we should or could be doing differently in our vast and complex school system.

To those who scoff at NAEP because it seems out of touch with the current reforms, again I would suggest we cool it before inferring too much (or too little) from the latest results. There's no question that the 2015 assessments took place during a time of upheaval in schools and schooling: most states have been changing their academic standards (even those that have rejected the "Common Core"), and many states introduced new curricula and tests in just the most recent school year. Studies by our nonpartisan Center on Education Policy reveal how much things are in flux: 60 percent of the school district leaders we surveyed did not expect to implement curricula aligned to the Common Core in all of their schools, or to adequately prepare all their teachers for the revised curriculum, until school year 2014-15 or later.

Thus, the degree to which NAEP is "aligned" with the content and format of the new Common Core curricula and assessments is an important and complicating factor in understanding the latest results. The only reliable study to address that question so far covers math only and paints a mixed picture. So, if we're looking for a bottom line here about the validity of the NAEP results during this time of change, we will have to tolerate some squiggle. Most importantly, we need to ask whether we'd be better off without an assessment program that keeps a steady hand on the helm while our educational ship steams through choppy seas.

NAEP was invented not as a tool of "accountability" but rather as a broad gauge of the condition of education. Its content and statistical quality have evolved over the years and the program continues to provide a trove of data, broken down into just about every conceivable demographic and geographic category. The results should be the centerpiece for rational debate over how we're doing and how we could do better. NAEP provides the data, not the explanations: for that we need good research and objective discussion.

As a proud resident of DC I'm of course delighted to see NAEP results that show we're improving locally. But as a social scientist with years of experience in the world of testing and assessment, I know I have to contain my exuberance while educators and statisticians dig deeper into the data and emerge with some scientifically defensible answers. NAEP provokes important questions but doesn't provide the answers -- and that's a good thing as long as we're prepared to keep the discussion alive and not stifle it with premature, extravagant and politicized rhetoric.
