Computer Analysis Overturns the Common Core: Educational Research Strikes Again!


In the 1940s, American newspapers began to simplify their prose by means of quantitative measures called "readability formulas." In an effort to preserve their circulation in the age of television, the wire services and newspaper and magazine editors set down rules for simplifying prose which all reporters were required to follow. Publishers of mass market books and magazines began following suit.
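
To make "readability formula" concrete, here is a rough sketch -- not any wire service's or publisher's actual procedure -- of one widely cited formula from that era, the Flesch Reading Ease score, which rewards short sentences and short words. The syllable counter is a crude approximation added purely for illustration.

```python
# A rough illustrative sketch -- not any wire service's or publisher's actual
# procedure. The Flesch Reading Ease score rewards short sentences and short
# words; higher scores mean "easier" prose. The syllable counter is a crude
# approximation (it just counts runs of consecutive vowels).

import re

def count_syllables(word):
    """Approximate syllables by counting runs of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """206.835 - 1.015 * (words per sentence) - 84.6 * (syllables per word)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

wire_copy = "The fire broke out at dawn. Crews put it out fast. No one was hurt."
old_style = ("Notwithstanding the conflagration's inauspicious commencement at daybreak, "
             "municipal personnel extinguished it with commendable expedition.")

print(round(flesch_reading_ease(wire_copy), 1))  # high score: short sentences, short words
print(round(flesch_reading_ease(old_style), 1))  # far lower score: one long, latinate sentence
```

Editors enforcing such a formula push every sentence toward the first style and away from the second.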

Not long after, these readability formulas came into use in our school books -- as the great reading researcher Jeanne Chall (herself a co-developer of a readability formula) showed. The story of verbal simplification in the schools was outlined in depressing detail by Harriet Tyson-Bernstein in her book A Conspiracy of Good Intentions: America's Textbook Fiasco (1988). She showed that, by following the mandatory readability formulas required by state education agencies, publishers had not only simplified the language of our schoolbooks but, by necessity, had also dumbed down their content -- often to the point of incoherence. Further, the economist John Bishop showed that student verbal scores had declined sharply: not just, famously, on the verbal SAT, but -- more tellingly for gauging widespread effects -- on an Iowa test given to every student in the state. Later, the correlations between the language simplification in schoolbooks and declining verbal scores were analyzed in detail by a team of Cornell sociologists headed by Donald P. Hayes.

Okay. So why am I rehearsing all this well-established research about the dumbing down of our schoolbooks and the decline of verbal scores?

Well, it could be important to your coming life as a teacher or student. The scholars who developed the Common Core standards in English language arts were keenly aware of this history. And they were determined to reverse the downward linguistic trend. So they gave the language formulas a new name: "complexity," readability formulas having fallen into scientific disrepute. Complexity standards are meant to help reverse the decline induced by schoolbook simplification. This was a good idea, in principle. Complexity formulas can be applied intelligently -- or just as mindlessly as readability formulas were.

But now comes -- hot off the presses of the Educational Researcher, a journal at the heart of the educational establishment -- a supposed refutation of all this past research by Chall, Bishop, Tyson-Bernstein, Hayes, and others. Here's the abstract:

"Challenging the Research Base of the Common Core State Standards
A Historical Reanalysis of Text Complexity"
1. David A. Gamson
2. Xiaofei Lu
3. Sarah Anne Eckert

Abstract: "The widely adopted Common Core State Standards (CCSS) call for raising the level of text complexity in textbooks and reading materials used by students across all grade levels in the United States; the authors of the English Language Arts component of the CCSS build their case for higher complexity in part upon a research base they say shows a steady decline in the difficulty of student reading textbooks over the past half century. In this interdisciplinary study, we offer our own independent analysis of third- and sixth-grade reading textbooks used throughout the past century. Our data set consists of books from 117 textbook series issued by 30 publishers between 1905 and 2004, resulting in a linguistic corpus of roughly 10 million words. Contrary to previous reports, we find that text complexity has either risen or stabilized over the past half century; these findings have significant implications for the justification of the CCSS as well as for our understanding of a "decline" within American schooling more generally."

This abstract is silent on the most valuable portion of the paper -- its warning that complexity formulas have not been adequately tested and can distract from more productive educational emphases. Instead, it focuses on defending the status quo in schooling (Our schools are fine. They did not decline!). Defending that status quo has been a thriving activity in educational research for many decades -- the most celebrated instance being the book The Manufactured Crisis (1995), co-authored by David Berliner, a writer who is cited in this article.

These defenses of existing schooling are problematic for the nation, not least because they contribute to the polarization of educational reform. One side shouts that the public schools are okay, while free-market reformers shout that they are anything but. Subtleties get drowned out in the furious side-taking about everything -- including the new Common Core standards, which could turn out to be good or bad, depending on how intelligently they are put into effect.

In the midst of this raging Common Core battle now appears this new article on "complexity," calling into question the linguistic research behind the Common Core standards. It will be used as another weapon as the two armies struggle by night. Few will actually read the article itself. Just citing the claims of the abstract will be enough.

So, what's the fact of the matter? After 1940, did schoolbook language get dumbed down and cause a verbal decline -- or didn't it? If you peruse the article, as I did, you will find that the self-described "independent," "interdisciplinary" method used by the researchers is as follows: Complexity is usually defined by average sentence length and average word rarity as determined by a stable word-frequency list. But, the authors say, let's be more historical. Let's analyze the language of schoolbooks against a changing baseline of word frequencies contemporaneous with the schoolbooks. The results will show (Surprise!) that while books and the popular press were getting deliberately simplified under the reign of readability formulas, so were schoolbooks, under those same formulas.
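
To see what hangs on that choice of baseline, here is a toy sketch -- not the researchers' actual procedure -- of a complexity score built from those two usual ingredients: average sentence length plus average word rarity read off a word-frequency list. All the frequency numbers are invented; the only point is that the very same passage earns a different score depending on whether the baseline list is held fixed or allowed to drift along with an era's simplified print culture.

```python
# A toy sketch of the measurement choice at issue -- not the procedure used in
# the Educational Researcher article. Complexity is scored as average sentence
# length plus average word rarity, where rarity is the negative log of a word's
# relative frequency in a baseline word-frequency list. All frequencies below
# are invented for illustration.

import math
import re

def complexity(text, freq_list, floor=1e-7):
    """Average sentence length plus average word rarity against freq_list."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    avg_sentence_len = len(words) / len(sentences)
    avg_rarity = sum(-math.log(freq_list.get(w, floor)) for w in words) / len(words)
    return avg_sentence_len + avg_rarity

# In the "contemporaneous" list the harder words (shore, observed, tide) have
# become rarer, because the surrounding print culture simplified too.
fixed_baseline = {"the": 0.06, "and": 0.03, "man": 0.003, "walked": 0.001,
                  "along": 0.002, "shore": 0.0005, "observed": 0.0002, "tide": 0.0003}
contemporaneous_baseline = dict(fixed_baseline,
                                shore=0.0002, observed=0.00004, tide=0.00008)

passage = "The man walked along the shore and observed the tide."

# The same passage scores as simpler against the fixed list than against the
# drifted one: relativize the yardstick and the measured decline shrinks.
print(round(complexity(passage, fixed_baseline), 2))
print(round(complexity(passage, contemporaneous_baseline), 2))
```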

So when the authors ran texts from schoolbooks through a computer, they found that even though the schoolbooks got simplified in absolute terms, they didn't get simplified by the relativized historical measures. When popular books, newspapers, and magazines got dumbed down, schoolbooks did too. Hence the relationship between schoolbooks and other books stayed pretty constant. Schoolbooks did not get dumbed down after all -- even though they got dumbed down!

Hence things are really just fine. There has been no "decline," and the research basis of the CCSS has just been refuted.

Maybe not.

If you keep moving the goalposts up or down the field, you can score a touchdown any time.

Intellectually and scientifically this article, the fruit of much hard and earnest work that I am sorry on a personal level to disparage, does absolutely nothing to change the interpretations of verbal decline offered by Chall, Bishop, Tyson-Bernstein, and Hayes. In fact, looked at from the right angle, the analysis justifies the alarm of those scholars, because it shows that between 1950 and the present the whole literate culture has been diminished. As Christopher Jencks, the great liberal sociologist, said in 1978 regarding the fall in the Iowa verbal scores: "Iowa students' total vocabulary has almost certainly contracted. It is hard to see how that can be a good thing either in Iowa or elsewhere."
