Is PISA Data Worthless?

I have never been a huge fan of PISA data, the goulash of test results released internationally by the Organization for Economic Co-operation and Development (OECD), the data responsible for a thousand Chicken Little articles about how we’re getting smoked by the students of Estonia.

But in April this story dropped. Folks had begun a mild-to-medium freakout because the East Asian PISA math superpowers (South Korea, Taiwan, Hong Kong, Singapore, etc.), the people whose programs everyone else was trying to imitate, had seen their scores start to drop.

But now Andreas Schleicher, the official in charge of Pisa, has said that this fall may not be due to a drop in the performance of these Asian powerhouses. He said he was looking into whether the decline could be explained by the fact that Pisa used computers for the main tests for the first time in 2015.

In other words, data that is clearly presented as “comparable” in the study may not be comparable at all.

Which means the whole longitudinal game of charting PISA scores over time could be ruined, all those nifty charts now meaningless. There's another implication here as well. The Testocrats have been quietly assuming that taking a Big Standardized Test on a computer is exactly like taking it on paper. But what if that's not true? What if taking a math test involves not only math skills, but also test-taking skills? And what if computer test-taking skills are not the same set of skills as pencil-and-paper test-taking skills?

What if the Big Standardized Tests aren't really measuring what they purport to measure at all, and the whole test-centered education model is built on a sham?

Originally posted at Curmudgucation
