One of the most frustrating things I’ve ever been forced to do as a teacher is to ignore my students and concentrate instead on the data.
I teach 8th grade language arts at a high-poverty, mostly minority school in Western Pennsylvania. During my double period classes, I’m with these children for at least 80 minutes a day, five days a week.
During that time, we read together. We write together. We discuss important issues together. They take tests. They compose poems, stories and essays. They put on short skits, give presentations, draw pictures and even create iMovies.
I don’t need a spreadsheet to tell me whether these children can read, write or think. I know.
Anyone who had been in the room and had been paying attention would know.
But a week doesn’t go by without an administrator ambushing me at a staff meeting with a computer printout and a smile.
Look at this data set. See how your students are doing on this module. Look at the projected growth for this student during the first semester.
It’s enough to make you heave.
I always thought the purpose of student data was to help the teacher teach. But it has become an end in itself.
It is the educational equivalent of navel gazing, of turning all your students into projections and trying to teach them from that remove – not as living, breathing beings, but as computer models.
It reminds me of this quote from Michael Lewis’ famous book Moneyball: The Art of Winning an Unfair Game:
“Intelligence about baseball statistics had become equated in the public mind with the ability to recite arcane baseball stats. What [Bill] James’s wider audience had failed to understand was that the statistics were beside the point. The point was understanding; the point was to make life on earth just a bit more intelligible; and that point, somehow, had been lost. ‘I wonder,’ James wrote, ‘if we haven’t become so numbed by all these numbers that we are no longer capable of truly assimilating any knowledge which might result from them.’”
The point is not the data. It is what the data reveals. However, some people have become so seduced by the cult of data that they’re blind to what’s right in front of their eyes.
You don’t need to give a child a standardized test to assess whether he or she can read. You can just have them read. Nor does a child need to fill in multiple choice bubbles to show whether he or she understands what’s been read. They can simply tell you. In fact, these would be better assessments. Doing otherwise is like testing someone’s driving ability not by putting them behind the wheel but by making them play Mario Kart.
The skill is no longer important. It is the assessment of the skill.
THAT’S what we use to measure success. It’s become the be-all, end-all. It’s the ultimate indicator of both student and teacher success. But it perverts authentic teaching. When the assessment is all that’s important, we lose sight of the actual skills we were supposed to be teaching in the first place.
The result is a never-ending emphasis on test prep and poring over infinite pages of useless data and analytics.
As Scottish writer Andrew Lang put it, “He uses statistics as a drunken man uses lamp posts – for support rather than for illumination.”
Teachers like me have been pointing this out for years, but the only response we get from most lawmakers and administrators is to hysterically increase the sheer volume of data and use more sophisticated algorithms with which to interpret it.
Take the Pennsylvania Value Added Assessment System (PVAAS). This is the Commonwealth’s method of statistical analysis of students’ test scores on the Pennsylvania System of School Assessment (PSSA) and Keystone Exams, which students take in grades 3-8 and in high school, respectively.
It allows me to see:
- Student scores on each test
- Student scores broken down by subgroups (how many hit each 20-point marker)
- Which subgroup is above, below or at the target for growth
But perhaps the most interesting piece of information is a prediction of where each student is expected to score next time they take the test.
How does it calculate this prediction? I have no idea.
That’s the kind of metric they don’t give to teachers. Or taxpayers, by the way. Pennsylvania has paid more than $1 billion for its standardized testing system in the last eight years. You’d think lawmakers would have to justify that outlay of cash, especially when they’re cutting funding for just about everything else in our schools. But no. We’re supposed to just take that one on faith.
So much for empirical data.
Then we have the Classroom Diagnostic Tools (CDT). This is an optional computer-based test given three times a year in various core subjects.
If you’re lucky enough to have to give this to your students (and I am), you get a whole pile of data that’s supposed to be even more detailed than the PVAAS.
But it doesn’t really give you much more than the same information based on more data points.
I don’t gain much from looking at colorful graphs depicting where each of my students scored in various modules. Nor do I gain much by seeing this same material displayed for my entire class.
The biggest difference between the PVAAS and the CDT, though, is that the CDT allows me to see examples of the kinds of questions individual students got wrong. So, in theory, I could print out a stack of look-alike questions and have them practice endless skill-and-drill exercises until they get them right.
And THAT’S education!
Imagine if a toddler stumbled walking down the hall, so you had her practice raising and lowering her left foot over and over again! I’m sure that would make her an expert walker in no time!
It’s ridiculous. This overreliance on data pretends that we’re engaged in programming robots and not teaching human beings.
Abstracted repetition is not generally the best tool for learning complex skills. If you’re teaching the times tables, fine. But most concepts require us to engage students’ interests, to make something real, vital and important to them.
Otherwise, they’ll just go through the motions.
“If you torture the data long enough, it will confess,” wrote economist Ronald Coase. That’s what we’re doing in our public schools. We’re prioritizing the data and making it say whatever we want.
The data justifies the use of data. And anyone who points out that circular logic is called a Luddite, a roadblock on the information superhighway.
Never mind that all the time I’m forced to pore over the scores and statistics is less time I have to actually teach the children.
Teachers don’t need more paperwork and schematics. We need those in power to actually listen to us. We need the respect and autonomy to be allowed to actually do our jobs.
Albert Einstein famously said, “Not everything that can be counted counts, and not everything that counts can be counted.”
Can we please put away the superfluous data and get back to teaching?
This article was originally published on Gadfly on the Wall blog.