The New York City Department of Education today released a list of individual ratings for thousands of the city's schoolteachers, concluding a lengthy legal battle between the local teachers' union and media organizations.
The Teacher Data Reports rate more than 12,000 teachers who taught fourth- through eighth-grade English or math between 2007 and 2010, using value-added analysis. Value-added analysis estimates a teacher's effectiveness at improving student performance on standardized tests: a student's score is forecast from past test scores, the forecast is compared with the student's actual score, and the difference is considered the "value added," or subtracted, by the teacher.
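The core arithmetic of the comparison can be sketched in a few lines. This is a deliberately naive illustration, not the city's actual model, which is a far more elaborate statistical regression; the data and the assumption that the prediction comes straight from prior scores are hypothetical.

```python
# Minimal sketch of a value-added comparison (not the DOE's methodology).
# "predicted" stands in for the model's forecast from past test scores;
# "actual" is what the students scored this year.

def value_added(predicted_scores, actual_scores):
    """Average gap between actual and predicted scores for one teacher's
    students: positive = value added, negative = value subtracted."""
    diffs = [a - p for p, a in zip(predicted_scores, actual_scores)]
    return sum(diffs) / len(diffs)

# Hypothetical scores for four students.
predicted = [650, 670, 640, 660]
actual = [660, 665, 655, 670]
print(value_added(predicted, actual))  # 7.5 points above prediction
```

A positive average would be read as the teacher "adding value"; the controversy described below is over how much noise and bias that single number hides.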
To some, the release means a step forward in using student data and improving transparency and accountability by giving parents access to information on teacher effectiveness. To others, it's a misguided over-reliance on incomplete or inaccurate data that publicly shames or praises educators, whether deserving or not.
In response, the union, the United Federation of Teachers, has launched a citywide newspaper advertising campaign. The ad is headlined "This Is No Way To Rate A Teacher!" and features a lengthy, complicated mathematical formula alongside a letter from UFT President Michael Mulgrew listing the reasons he says the data reports are faulty and unreliable.
The ad will likely appear in the very publications being targeted for disseminating the Teacher Data Reports. Cynthia Brown, vice president of education policy at the Center for American Progress, issued a statement Friday drawing on findings from a November CAP report. The study concluded that publicly naming teachers alongside their students' actual and projected performance undermines efforts to improve public schools, making it much harder to implement teacher evaluation systems that actually work.
"While we support next-generation evaluation systems that include student achievement as a component, we believe the public release of value-added data on individual teachers is irresponsible," Brown said Friday. "In this case, less disclosure is more reform."
Amid the report-releasing frenzy, GothamSchools is one news organization that has stepped back from the crowd. It was among the many outlets that sought access to the Teacher Data Reports last year, but after internal deliberations it decided not to publish the raw database because "the data were flawed, that the public might easily be misled by the ratings, and that no amount of context would justify attaching teachers' names to the statistics."
The Times has publicly invited teachers to respond to their ratings, to be published side-by-side for readers to consider together: "If there were special circumstances that compromise the credibility of the numbers in particular cases, we want to know."
The reports were developed as a pilot program several years ago by then-Schools Chancellor Joel Klein as a part of the city's annual review of its teachers, and were later factored into tenure decisions. The ratings were intended for internal use and were not planned to be made public. Media organizations -- among them The Wall Street Journal, The New York Times and the New York Daily News -- sued for access to the data under the Freedom of Information Act. A court ruled in favor of the news organizations in August.
"When balancing the privacy interests at stake against the public interest in disclosure of the information ... we conclude that the requested reports should be disclosed," the court wrote, according to The Wall Street Journal. "Indeed, the reports concern information of a type that is of compelling interest to the public, namely, the proficiency of public employees in the performance of their job duties."
Criticism of the court's order comes predominantly from apprehension over using value-added analysis to assess teachers. Generally, value-added models don't control for demographic factors like poverty, race, English-learner status or special-education status that some argue are crucial to evaluating educators. Skeptics say that, consequently, publicly naming teachers alongside their value-added ratings paints an unfair and incomplete picture of a teacher's effectiveness. Some believe the ratings will hurt teacher morale and recruitment, and reinforce a false notion in education that testing is everything.
Because the New York teacher ratings are based on small amounts of data, they carry large margins of error. Compounding the problem, the test scores the analyses rely on were determined by the state Department of Education to have been inflated because the exams had become predictable and easier to pass -- to the extent that students were incorrectly told they were proficient in certain subjects.
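Why small samples produce large margins of error can be illustrated with basic statistics: the uncertainty around an average shrinks only with the square root of the number of scores. This is a generic statistical sketch, not the DOE's error calculation, and the standard deviation of 30 points is an assumed figure.

```python
import math

# Illustration of why ratings built on few students are noisy (generic
# statistics, not the DOE's methodology): the 95% margin of error for an
# average of n scores shrinks only as 1/sqrt(n).

def margin_of_error(std_dev, n, z=1.96):
    """Approximate 95% margin of error for the mean of n scores."""
    return z * std_dev / math.sqrt(n)

for n in (25, 100, 400):  # one class vs. several years of classes
    print(n, round(margin_of_error(30.0, n), 1))
# 25  -> +/- 11.8 points
# 100 -> +/- 5.9 points
# 400 -> +/- 2.9 points
```

A teacher rated on a single class of 25 thus gets an error band roughly four times as wide as one rated on 400 student scores, which is why critics object to publishing single-year numbers under individual names.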
Teachers whose students took those tests, according to the Daily News, could find themselves penalized in their Teacher Data Report ratings for not teaching to the test. Conversely, those who narrowed their curricular focus to cater to the exam could be rewarded.
Other omissions and errors, like failure to verify class size and assignment for each teacher, will also skew results of the analysis.
New York high school teacher Stephen Lazar wrote on his blog -- and in a comment in The New York Times -- that he is disappointed by many publications' decision to release the data. He points to the shortcomings of value-added systems, noting that he spent six weeks teaching students to do college-level research, which likely cost them 5 to 10 points on the Regents exam in lost test-preparation time, but the teacher ratings "don't tell you that when you ask my students who are now in college why they are succeeding when most of their urban public school peers are dropping out, they name that research project as one of their top three reasons every time."
The city has defended the ratings, saying that they give administrators an objective way to evaluate teacher effectiveness, creating a system that identifies successful model teachers, struggling teachers who need assistance and those who should be removed.
"The reports gave teachers and principals one useful perspective on how well teachers were doing in their most important job: helping students learn," Schools Chancellor Dennis Walcott wrote in a letter to educators Thursday, The New York Times reports. "However, these reports were never intended to be public or to be used in isolation. Although we can't control how reporters use this information, we will work hard to make sure parents and the public understand how to interpret the Teacher Data Reports."
In a piece for the Times this week, Bill Gates, co-chair of the Bill & Melinda Gates Foundation and former Microsoft CEO, writes that value-added ratings are important in overall evaluations, as the multifaceted nature of teaching means that student test scores alone aren't comprehensive enough to measure effective teaching or identify areas for improvement. A reliable evaluation system, he says, must factor in other measures of effectiveness like student feedback and classroom observations.
"Putting sophisticated personnel systems in place is going to take serious commitment," Gates writes. "Those who believe we can do it on the cheap -- by doing things like making individual teachers' performance reports public -- are underestimating the level of resources needed to spur real improvement."