Distilling the Nation's Report Card on Reading

Are we to believe that in a short, two-year span, America's poorest readers dropped a full year in their ability to comprehend text?

When the National Endowment for the Arts reported in To Read or Not to Read earlier this month that the bottom 10% of 12th-graders had dropped 14 points on NAEP between 1992 and 2005, literacy expert Stephen Krashen observed that 10 points of the 14-point decline came between 1992 and 1994. Scores dropped only 4 points over the following 11 years.

I have a different take on this 10-point decline: I don't believe it. Ten points is almost a full year on NAEP. Are we to believe that in a short, two-year span, America's poorest readers dropped a full year in their ability to comprehend text? That our teachers suddenly failed this group so badly? In this same two-year span, all ethnic groups fell. Asian-Pacific Islanders dropped a whopping 12 points, blacks 8 and Hispanics 9. All of these groups bounced back in 1998 to virtually their 1992 levels.

Large declines also appeared for students at every NAEP level of parental education: did not graduate high school, high school graduate, some college, and college graduate.

This kind of consistent decline followed by a bounce-back strongly suggests to me a technical glitch. I don't know if it was in the items, in the sampling or what, but I think a glitch is more likely than that these plummeting scores were "real." The Nation's Report Card, as advocates call NAEP, is not rocket science.

Perhaps more important, the data from the "regular NAEP"--the one that shows the declines--are not corroborated by the NAEP trend data (the graph on page 12 of the report is a great example of how to make small changes look big). Now, the regular and trend data are not directly comparable. Trend data come from administering the same items over the years, while regular data are open to changes with changing conceptions of the subject matter. The fall from 1992 to 1994 in the trend data for the bottom 10% of 17-year-olds is three points. OK, the two are not directly comparable, but shouldn't they be similar?

I regret that Dana Gioia, Chairman of the National Endowment for the Arts, has put his imprimatur on this report. At a conference held with the hope of re-instilling the liberal arts into K-12 schooling, Gioia gave a bravura performance on the impact of literature, art and music on his life (in the process forcing Checker Finn into the role of Juliet). The NEA report strikes me as just another in a long line of fear-mongering documents, a line starting with the hysteria that followed Sputnik (a little orb that came along over a year after the U.S. had a satellite-capable rocket in the air--a rocket from which Eisenhower forbade Wernher von Braun to launch anything into orbit).

Even if fear-mongering is not the NEA's intent, it is, indeed, what happened in many quarters. In his preface, Gioia says, "Strictly understood, the data in this report do not necessarily show cause and effect. The statistics merely indicate correlation." This is emphatically true. But the report hints at causation, and many people have not been as strict in their interpretation of the correlations as Gioia.

It might be that some groups the NEA compares are not, in fact, comparable. The report finds that people who did not complete high school read less well in 2003 than in 1992. Does that reflect a decline in reading? Or a shrinking of that group, leaving only extreme non-readers? Or an increase in immigration? Similarly, with the press for more and more people to get college degrees, we might expect some change in the talent pool that wouldn't necessarily reflect a decline in reading. The data are much too squishy for the kinds of conclusions drawn.

But it might yet be true that, as the NEA says, people read less today for fun. When I was a teaching assistant for Psychology 1 at Stanford, one of the professors would hold up a (required) programmed textbook from B. F. Skinner and tell the students, "Read this and you will know everything there is to know about Skinnerian psychology." This was true, but it was also true that the students would never again go near a programmed textbook or Skinnerian psychology. These days, kids get to read real books only after the tests are given in the spring. The rest of "reading instruction" too often looks to me like a great method for ensuring that kids never, ever, pick up a book again.
