A few weeks ago, I was talking with the former principal of a vocational high school. He was thoughtful and enthusiastic and made his former school sound very successful and interesting.
I asked him if the school had followed its students to see how they fared -- did they actually go into the fields they had prepared for in high school? Did they succeed?
Normally when I ask that question, educators say they wish they could follow their students systematically -- they bemoan the lack of money that hinders that kind of serious research into outcomes. Instead, they usually say, they have to rely on the students who stop by once in a while or keep in touch by e-mail.
But the former principal I was talking with had a different answer -- he said that as principal he had done what he knew to be best practice from the best available research and then left it at that, confident that he had done the best job possible. I found this interesting, because it showed how even committed educators can miss an important step in their job. It is of course important to start with what research has demonstrated to be best practice, but it isn't enough. Continual monitoring to see what is working and what needs adjusting has to happen if schools are going to improve.
That sounds a little abstract, so let me give a concrete example.
University Park Campus High School is a high-poverty high school in Worcester, Mass., that was founded with the idea that it would prepare students to be successful in college or other postsecondary education. Its instructional program was developed with careful attention to research and best practices. The school graduated almost all its students, and academic achievement was among the highest in Massachusetts.
They could have considered themselves successful and called it a day, as the former principal I spoke with had. But that wasn't good enough for University Park educators, who secured a grant to study what happened to their graduates. They found that most of their graduates went to college, but only about half persisted to graduation. Compared with other high-poverty high schools, that was pretty good -- but pretty good wasn't good enough for them. Former principal Ricci Hall said that it was a difficult moment when they realized they hadn't served their students as well as they had thought. "We had to look in the mirror," he said.
In response, University Park educators reworked the senior year experience in ways they thought might better prepare their students for college. Among other things, they started three-day-a-week syllabus-based seminar classes and consciously cut back on the reminders about due dates that their students had been used to relying on. They also linked their graduates with alumni who could give advice on which colleges were more welcoming to poor students and which professors were the best.
When University Park educators followed up, they found that college persistence and graduation rates had gone up. They're continuing to think about new ways they can improve their numbers even more.
And that's how schools improve. They might start with existing research, but then they monitor and adjust and then monitor and adjust again. They do so from the deep belief that it is up to them to figure out how to make students successful.
All that ran through my mind when I read Dan Willingham and David Grissmer's op-ed in the New York Times arguing that any new investment in preschool programs should include money to monitor results and conduct further research to learn what differentiates effective preschools from ineffective ones.
They make the point that recent calls to radically expand preschool make it seem as if we know more than we do about how to make preschool programs successful. But even if we knew more, and even if we followed the best available research, it would still be necessary to continually monitor what happens under which conditions and which adjustments produce which results.
This seems so obvious that we shouldn't need to talk about it -- but we do.