This piece comes to us courtesy of Education Week, where it was originally published.
When Michigan officials suspended six teacher education programs at Lake Superior State University in 2012, citing falling licensing-test scores and other problems, the action prompted a period of deep soul-searching for the university’s top brass. And it made for some painful conversations with some of the school’s current teacher-candidates and incoming hopefuls.
But it was also a needed wake-up call, according to Donna Fiebelkorn, the assistant dean of the education school.
“My sense is that if there had not been some external force, things would not have changed,” said Ms. Fiebelkorn, who arrived at the school, in Sault Ste. Marie, just months before the programs were suspended.
Amid the intense recent policy interest in educator quality, the list of proposed remedies for improving teacher preparation has grown long. It ranges from using performance assessments to measure candidates’ classroom skill, to giving prospective teachers higher doses of hands-on “clinical” training in K-12 schools, to setting up charter-school-like preparation academies outside traditional teacher colleges.
Yet what has been all but ignored in such discussions is the crucial role that states play in auditing existing providers--and the power they have to shut down the weakest.
Lake Superior State’s experience stands out for two reasons: First, the state’s move to suspend the programs is among the factors that seem to have sparked a renaissance at the education school, leading the university’s leaders to scrutinize the suitability of the teacher-prep course offerings and to tackle the failure of the humanities, science, and education faculty to work together. Second, nationwide, closures of teacher education programs are exceedingly rare.
In a five-year period between 2009-10 and 2013-14, states reported closing or preventing enrollments at fewer than 60 subject-area or grade-level teacher-preparation programs, according to an Education Week analysis. There are an estimated 25,000 such programs in the United States, mostly housed at colleges and universities.
The closure or suspension of entire teacher education schools, departments, or providers is rarer still; states reported just 12 such examples between 2009-10 and 2013-14, typically at smaller schools.
Overlapping federal, state, and accreditation mandates all require colleges that prepare teachers to report information on their programs. But the state requirements carry the most import for the simple reason that states alone have the authority to close the education schools, to bar new enrollments in their programs, or to otherwise penalize them.
Education Week reviewed states’ closure decisions vis-à-vis their own internal teacher-preparation standards. Its analysis is based on a survey sent to the 50 states and the District of Columbia, and a review of each state’s policies.
State education departments typically audit their teacher-preparation providers every five to seven years by visiting campuses, examining faculty qualifications and workloads, and determining whether coursework matches state teaching standards. Approved programs can recommend for a teaching license candidates who have fulfilled all state requirements.
In examining the states’ review processes, several observations stand out. The first is the vast array of different standards states use to review education schools and programs.
Second, the rules are generally lax in requiring specific consequences after deficiencies are found. Many states permit a form of conditional or provisional approval for programs with problems.
And finally, it is unclear how many states require the notification of candidates who are enrolled in programs with deficiencies, a policy often not spelled out in statute. (States that confirmed that they do require such notification include California, Michigan, Oklahoma, Tennessee, and Washington state.)
Alabama, Massachusetts, Minnesota, Oregon, and Wisconsin either did not respond to Education Week’s queries, or could not produce a tally of the number of programs closed.
While some state officials dispute the significance, the generally low rates of program closure across the country--even in places with large surpluses of teachers--suggest that states are reluctant to resort to drastic measures against faltering teacher education providers, in effect bypassing one of the tools at their disposal to improve overall educator quality.
“I haven’t visited a state where the political leaders are enthusiastic about the quality of ed. schools,” said Arthur E. Levine, the president of the Woodrow Wilson National Fellowship Foundation and the president emeritus of Teachers College, Columbia University. “They have the capacity to do a reauthorization of their existing programs, and they haven’t done it.
“It’s a matter of discipline,” said Mr. Levine, the author of a critical 2006 report on teacher education.
In fact, on occasion, states have stepped in to prevent consequences even when national accreditors found fault with schools.
When the education school at Wesley College, a private liberal arts college in Dover, Del., had its accreditation revoked by the National Council for Accreditation of Teacher Education in 2010, the state intervened to give it an extension. (Approval by NCATE, now succeeded by the Council for Accreditation of Educator Preparation, or CAEP, is a requirement in Delaware.)
The states’ review processes for alternative teacher-certification programs are even murkier. In at least nine states, those providers are held to standards different from those for programs housed at colleges. For that reason, the data collected by Education Week may not include all alternative providers.
What accounts for states’ overall reluctance to intervene in programs?
One factor is that states’ review processes are deeply insular, well known only to the preparation programs and the state officials who review them. Although summaries are sometimes listed, the program-level reports generated from reviewers’ visits are often not posted on states’ websites, and on occasion are available only through open-records requests. As a result, there is little public scrutiny that might prod state action.
Then there is the reality of local education labor markets, which can create powerful incentives to keep even weak programs open.
Especially in rural locales, a lesser-quality teacher-prep program might nevertheless be the only one around for miles. Such programs often have vocal constituencies advocating on their behalf, said Robert E. Floden, the co-director of the education policy center at Michigan State University, in East Lansing.
Even when clear problems exist, the evidence may fall short of forcing program closure.
“Most institutions can meet the letter of the regulations in terms of what they have to do to stay in operation,” Mr. Floden said. “To a large extent, it’s showing you’re offering the coursework the state requires you to offer.”
Political factors, while harder to quantify, also seem to influence the process.
“Every program has a state representative and a constituency of people who attend meetings,” said Phillip Rogers, who formerly oversaw program review in Kentucky. He recalled two cases in which the state moved to close teacher colleges or their satellite campuses--and both resulted in his being called before the state legislature.
“I had a legislator yelling at me after I spoke about how [the college] had a music teacher teaching how to teach math, and I had forgotten [the lawmaker] had been a music teacher. Those are the things you have to deal with, and what you accomplish depends on how much political capital you have,” said Mr. Rogers, who is now the executive director of the Washington-based National Association of State Directors of Teacher Education and Certification.
“Closing a program is very difficult,” he said, “and it’s often very unpleasant.”
An additional tension surrounding program-review processes may stem from a lack of consensus about whether they are primarily meant to ensure program quality or to give technical assistance.
New York state, for example, initiated the Regents Accreditation of Teacher Education, or RATE, process in 2004 to audit education schools that were not seeking national accreditation.
Over RATE’s six-year life, a subcommittee of the state’s Professional Standards and Practices Board recommended denying approval to seven institutions based on the audits, a move that would effectively have shuttered their teacher-preparation programs. But the recommendations went unheeded again and again by the state board of regents, which oversees both K-12 and higher education.
Interviews with New York officials reveal starkly different interpretations of those outcomes.
“We felt very strongly that the purpose of accreditation was not to put people out of business, but to grow the strength of our programs,” said Joseph P. Frey, then the associate commissioner of education in New York.
Nicholas Michelli, who chaired the professional-standards panel that recommended the closures, saw the situation differently. The discourse among policymakers at that time was that some programs were weak and ought not to be operating. But when it came down to brass tacks, the state regents chose not to act.
“I don’t think there was any interruption in the offering of a program,” said Mr. Michelli, a professor of education at the Graduate Center of the City University of New York. “Why bother to do accreditation if the outcome, after all that effort and all that work, is that everyone gets accredited? There were no consequences, except for the students in schools who ended up getting bad teachers, at least theoretically.”
RATE was scotched in 2010 for financial reasons, and the state now requires all its education schools to seek national accreditation instead.
Across the states, the closure or withdrawal of programs for reasons other than performance, such as declining enrollments or financial strain, is more common. And sometimes those scenarios dovetail with quality problems, as has been the case in Vermont.
“When we go on a visit and we see a department that doesn’t have any faculty or any portfolio examples, it’s difficult for us to extend approval to those programs,” said Shannon Miller, the state’s consultant who oversees the program-review process. As a result, three colleges in Vermont chose to withdraw their small education programs in recent years.
In addition, some state officials say, merely examining the rate of program closure doesn’t capture other elements of states’ regulatory oversight.
Issuing warnings, as many states do, can be a powerful signal to a program, spurring an institution to hire more faculty, direct resources to the college, or jump-start a restructuring, said Jennifer Wallace, the executive director of Washington state’s professional-standards board.
“It gives the opportunity to say to the program: ‘This is really serious, and candidates are going to know about it. You really need to focus here, and we’re going to be back in less than a year,’” she said. “I think it’s important to have that stage.”
Encouraging providers that don’t pass muster to withdraw their programs is also far less likely than outright closure to lead to a lawsuit, as happened in 2008 with one of Washington state’s public universities, Ms. Wallace added.
“It’s a terrible, litigious, extremely expensive way to do it,” she said.
In such cases, plaintiffs generally argue that state officials didn’t review all the evidence or follow procedures to the letter.
Some states also said they have used opportune shifts in policy to encourage weaker programs to withdraw voluntarily.
Louisiana officials noted that, between 2000 and 2010, some education schools chose not to participate in a state-required redesign of their programs and closed down instead. Minnesota reported a similar occurrence when its special education rules changed in 2012.
Still, the bottom-line monetary consequences of not permitting an institution to enroll new candidates can act as a powerful lever.
In Michigan, Lake Superior State was flagged through an annual quality audit that supplements less-frequent program- and college-level reviews. Among other indicators, the audit looks at whether, over three years, programs maintain an 80 percent passing rate on licensing tests.
However blunt an instrument those tests may be, they flagged significant structural weaknesses in Lake Superior State’s programming, state officials said.
“If that metric hadn’t had the power it had in the overall score, the dysfunctional interdepartmental relationship there wouldn’t have been exposed,” said Leah Breen, the interim director of the state’s office of professional-preparation services.
Michigan’s more hands-on approach didn’t mean programs were hung out to dry. A state-designated “committee of scholars” came in to work with faculty members at Lake Superior State. And state officials said that they’ve seen major improvements since, among them a formerly complacent English department that now realizes its future livelihood might well depend on the success of the teachers it helps prepare.
Ms. Fiebelkorn, the university’s assistant dean for education, seconded that opinion, noting that faculty across content areas came together to work with the state panel.
“We have 120 full-time faculty, and there were 25 of them here to meet with the committee,” she said. “They didn’t have to be here, but they were here to support teacher education. That rallying to the program and what it means for Lake State was significant.”
Ms. Fiebelkorn hastened to add that, in her view, Michigan’s review system isn’t perfect. She still finds it unfairly test-heavy, and she wonders whether the state would have the political will to take on the programs of one of its large flagship institutions if similar problems were identified.
Still, she said, the process, and especially the commitment of the university’s provost, president, and faculty, prompted change.
While acknowledging some of the historical difficulties with approval of teacher-preparation programs, state officials say that improving the process is high on their to-do lists.
The Council of Chief State School Officers, using close to $900,000 in grants, is prodding seven states--Connecticut, Georgia, Idaho, Kentucky, Louisiana, Massachusetts, and Washington--to come up with better ways of measuring program performance and intervening when necessary.
Other states are tackling the work on their own. By 2017, Missouri will move to a system of auditing programs annually, rather than every seven years; New Mexico begins an annual review this year.
Louisiana, North Carolina, Ohio, and Tennessee, among others, are at the forefront of another effort. They are issuing annual reports showing teacher programs’ ability to raise student achievement, usually by connecting K-12 students’ test scores to each institution that prepared those students’ teachers. The approach is controversial among teacher-educators and is the subject of much research, whose findings remain largely mixed.
What remains unclear about most of the new state efforts is whether they will result in more fine-grained distinctions and meaningful consequences for programs. Concerns also persist about capacity, given the skeleton staffing of many state agencies, Mr. Rogers of the state teacher education and certification directors’ group said.
If, in the end, the question comes down to one of willpower, states aren’t the only ones facing the challenge of mustering it. CAEP, now the national accreditor for teacher education, approved new standards amid considerable institutional pushback in 2013.
The standards, scheduled to go into effect in 2016, are widely considered to be more difficult than prior sets. But it is unclear how aggressive the accreditation council will be in enforcing them.
“CAEP must implement those standards with fairness, rigor, and consistency,” said James G. Cibulka, the council’s president.
That will require retraining reviewers to understand and apply the new standards, and communicating with institutions about what evidence will be expected—and about old practices that will no longer suffice.
Crucially, Mr. Cibulka said, the CAEP effort will also largely depend on whether the 13 states that plan to permit its accreditation to supplement or stand in for their own review processes will share the same expectations for rigor as the national organization.
If history is a guide, that remains an open question.