Problems With Pearson's Student Teacher Evaluation System -- It's Like Déjà Vu All Over Again

During the spring of 2013, four of my secondary education social studies students participated in a field test of the Pearson edTPA (Teacher Performance Assessment). They created elaborate online portfolios demonstrating planning, instruction, and assessment, including video segments from three lessons and samples of student work. I cannot provide specific details about the assessment because, in order for my students and Hofstra University to participate in the field test, I was required to sign a confidentiality agreement. However, information about edTPA is available online from the American Association of Colleges for Teacher Education (AACTE).

Three months later the Hofstra University School of Education received numbered ratings from Pearson on each portion of each student's submission. However, the evaluations did not include any comments on strengths or weaknesses, and there was no indication of what constitutes a passing grade. The Pearson website simply says passing grades will be determined in the future for each cohort by individual states.

Hofstra University administrators recently received access to closed online edTPA material with sample teaching videos in different subject areas, but these too were unevaluated, so again I could only guess at how they would be rated.

University faculty who are supposed to prepare new teachers can only guess why the students who participated in the field test received the scores they did and how to help future students improve. Thirty-three states are listed as participating in edTPA, but only six of them have approved the performance assessment: New York, Tennessee, Georgia, Wisconsin, Minnesota, and Washington. I have no idea what the State Education Departments in these states will decide constitutes an acceptable score for certification, or whether they will ever agree, but the new tests are scheduled to go into operation in May 2014 anyway.

The edTPA website has answers to commonly asked questions such as "What are the expected pass rates for edTPA?" Try to make sense out of the answer!

Following additional analysis of the field test, a recommended passing standard that uses a professionally acceptable and credible standard-setting approach will be provided as a guide for states. As is the case with current licensing exams, each state adopting edTPA can elect to set its own passing score to determine who is permitted to practice in that state. This state-level process will determine the ultimate percentages of teacher candidates who pass the assessment.

This response is especially curious because the national edTPA project is based on the Performance Assessment for California Teachers (PACT). In the 2009-10 school year, one-third of California's applicants for teacher certification participated in PACT, and ninety-four percent of them passed all sections on the first try. According to Linda Darling-Hammond, a professor at Stanford University who worked with Pearson to design edTPA, the high pass rate on the PACT was expected.

But three questions occur right away.

1. If everybody passes, what do these tests actually test?

2. Is the same thing going to happen in all states that use the Pearson edTPA?

3. Is it fair that states use different passing scores when this is supposed to establish a national standard for teacher certification?

What a mess!

Right now the edTPA system will only be used to evaluate student teachers applying for state teaching certification. However, once it is up and running, and even if the bugs are never worked out, the system could potentially be used to evaluate working teachers as well. An even bigger mess awaits.

In the meantime, other problems with Pearson's edTPA keep popping up.

I received an email from a faculty member at Hobart and William Smith Colleges in Geneva, New York, reporting that the Director of Teacher Certification & Placement had applied to the New York State Education Department for edTPA scholarships for twenty-four students who receive federal Pell grants. Millie Savidge, Coordinator of the Office of Research and Information Systems, Higher Education, at the State Education Department (her email address is msavidge@mail.nysed.gov), replied to the request and informed the school that Pearson was responsible for distributing the vouchers.

On October 3, 2013, Eileen Cahill, Director of Client Program Management, Evaluation Systems, Assessment & Instruction at Pearson (her email address is eileen.cahill@pearson.com), informed Hobart and William Smith Colleges by email that the vouchers were being emailed to the deans of education at New York institutions and should be distributed by the end of the day on October 4.

On October 7, 2013, Hobart and William Smith Colleges received an email from ERM ER ESTestVoucher (ESTestVoucher@pearson.com) saying that New York State had "authorized Evaluations Systems to issue test fee e-vouchers for New York State Teacher Certification Examinations (NYSTCE) for students in your Teacher Education programs. The e-vouchers were allocated proportionally to institutions based on the number of undergraduate Pell recipients reported by your institution."

Hobart and William Smith Colleges, with its twenty-four eligible students, received just two vouchers for written exams and one voucher for the $300 edTPA.

Faculty at the State University of New York are so outraged by the lack of direction from Pearson on the edTPA assessments, and by its possible bias, that they are organizing an edTPA Alternative Scoring Consortium. According to Julie Gorlewski of SUNY-New Paltz (gorlewsj@newpaltz.edu), they want to create their own alternative to the Pearson scoring setup. In an email, Professor Gorlewski wrote: "The worry is that edTPA will feed into an existing cycle in which the public school teaching population does not share cultural characteristics of the students it serves. A scoring consortium consisting of a partnership of public and private institutions reflecting a range of selectivity has the potential to address this critical issue . . . We would develop a scoring process that reflects our own goals and desires for new teacher graduates (not an easy, but hopefully a joyful task). Then, we would ask 5-10 students at each institution to submit their work for this alternative scoring process (simultaneous to submitting their work to Pearson for "official" scoring of their edTPA portfolio)."

One last new Pearson tidbit borders on the ridiculous. Faced with intense opposition to its testing programs for students and teachers, Pearson has decided to issue merit badges to teachers who participate in its online "My Education Community" for teachers. Teachers "can earn badges by adding to discussions, commenting on articles, completing polls, contributing case studies, sharing resources, etc. Some of our badges have levels (bronze, silver, gold and platinum), so the more you engage the higher your level of achievement. Once you earn a badge it is yours to keep." Just like in the Boy Scouts, teachers can earn badges in instructional technology, course design, and syllabus. Apple badges are awarded for "great comments" that help other members. However, my favorite is the "Future Hero" badge, which I guess is for future heroes.

I have frequently written about problems with Pearson testing programs. As former New York Yankee great Yogi Berra probably would have said, "It's like déjà vu all over again."
