Most U.S. Students Lack Writing Proficiency, National Assessment Of Educational Progress Finds

In this Tuesday, May 8, 2012 photo, second-grader Jessica Levy, left, works on math problems at Moreland Hills Elementary School in Pepper Pike, Ohio. As Ohio prepares by 2014 to join other states that deliver building-by-building percentages on classroom spending to parents and politicians, school treasurers and state funding experts are struggling to shove expenses as varied as guidance counseling, teacher pensions, school buses, furnace ducts, and playground equipment into a single two-category system. (AP Photo/Tony Dejak)

Only about one quarter of eighth and 12th graders are proficient in writing, according to results from the National Assessment of Educational Progress' first-ever computer-based writing assessment. The new framework represents a move away from the paper-and-pencil format that has dominated standardized testing for nearly four decades.

NAEP's exams are considered the gold-standard measure of student achievement. In May, results showed that about a third of eighth graders who took its science exam were proficient, a statistic Gerry Wheeler, interim director of the National Science Teachers Association, slammed as "unacceptable." Similarly, only 32 percent of students performed at the proficient level on NAEP's 2007 math exam, and the U.S. ranked 32nd out of 65 countries on the math portion of the 2009 Programme for International Student Assessment (PISA), an international counterpart to NAEP. That pattern of low proficiency appears to hold for writing as well, though the change in format makes direct comparisons difficult.

Drawing from a sample of 24,100 eighth graders and 28,100 12th graders representing both public and private schools, the 2011 writing assessment asked students to complete two 30-minute tasks, each designed to measure one of three communicative purposes: to persuade, explain or convey experience. The prompts were presented in multimedia formats that included video or audio segments, newspaper articles, real-world data and other materials around which students could formulate a response. Students recorded their answers on laptops that featured commonly used word-processing tools such as spell check and a thesaurus.

“[Those who developed the framework] felt it was definitely time that we start assessing our students using computers,” Dr. Mary Crovo, deputy executive director of the National Assessment Governing Board, said in a statement. “This is becoming more the norm than the exception in our nation’s schools, and it is certainly the way that students write and communicate in higher education and in the workplace. So we feel very strongly that this is a solid assessment for 21st century skills.”

Results showed 24 percent of students at both grade levels scored at the proficient level on the writing assessment, while 54 percent of eighth graders and 52 percent of 12th graders met the benchmark for "basic." Around 20 percent of both grades performed below basic, while only 3 percent scored at the advanced level.

Among eighth graders, Asian students outperformed other racial/ethnic groups, averaging a score of 165 on a 300-point scale. A mean of 150 was set for both grades. At the 12th-grade level, however, white students, Asian students and students of two or more races performed comparably. In both grades, African American and Hispanic students had lower average scores than the other groups.

In addition to assessing students’ writing ability, the new computer-based format of the exam allowed test administrators to collect extensive information on 24 separate student “actions,” including keystrokes, backspacing, deletions and their use of spell-checking programs. Results found that at both grade levels, students who used the backspace key and thesaurus tool more frequently scored higher than those who did not routinely engage in these practices. Furthermore, English language learners were less likely to use the thesaurus tool than non-English language learners.

Dr. Jack Buckley, commissioner of the National Center for Education Statistics, said in a press call that the standards of proficiency were tailored to reflect the computer-based nature of the assessment, and that students' writing was evaluated holistically -- taking into account development of ideas, organization, and language facility and conventions.

Thus, while the spell check tool might have given students an advantage they did not have on the old paper-and-pencil tests, spelling was evaluated only under the category of "use of conventions," and only to the degree that errors might interfere with what the student was trying to say.

“The raters who are scoring the students’ results were asked to consider these as first drafts. They don’t expect to see a polished final report; they’re expected to see first-draft quality,” Buckley said, later pointing out that the word processor tool is not going to result in significantly better writing if the student is not already fluent in expressing his or her ideas.

While the new computerized framework makes it difficult to directly compare results to the past, Buckley acknowledged that "there was not a lot of difference in levels of proficiency" from 2007, when the most recent prior writing assessment was administered.

On the 2007 pencil-and-paper tests, 35 percent of eighth graders and 25 percent of 12th graders scored at or above proficient -- roughly on par with 2011's 12th-grade results, though markedly higher for eighth graders.

Additionally, female students in both grades scored higher than their male counterparts on the 2011 writing assessment -- a pattern that is consistent with previous results, according to Buckley.

Crovo said that NAEP hopes to add fourth graders to the sample in the near future.

Said Crovo, “We’re hopeful this new 2011 computer-based assessment can serve as a baseline for looking at trends over time.”

CORRECTION: A previous version of this piece incorrectly identified Gerry Wheeler. We regret the error.