Homework: New Research Suggests It May Be an Unnecessary Evil

Those open to evidence have been presented this fall with yet another finding that fails to find any meaningful benefit even when the study is set up to give homework every benefit of the doubt.

A brand-new study on the academic effects of homework offers not only some intriguing results but also a lesson on how to read a study -- and a reminder of the importance of doing just that: reading studies (carefully) rather than relying on summaries by journalists or even by the researchers themselves.

Let's start by reviewing what we know from earlier investigations.[1] First, no research has ever found a benefit to assigning homework (of any kind or in any amount) in elementary school. In fact, there isn't even a positive correlation between, on the one hand, having younger children do some homework (vs. none), or more (vs. less), and, on the other hand, any measure of achievement. If we're making 12-year-olds, much less five-year-olds, do homework, it's either because we're misinformed about what the evidence says or because we think kids ought to have to do homework despite what the evidence says.

Second, even at the high school level, the research supporting homework hasn't been particularly persuasive. There does seem to be a correlation between homework and standardized test scores, but (a) it isn't strong, meaning that homework doesn't explain much of the variance in scores, (b) one prominent researcher, Timothy Keith, who did find a solid correlation, returned to the topic a decade later to enter more variables into the equation simultaneously, only to discover that the improved study showed that homework had no effect after all[2], and (c) at best we're only talking about a correlation -- things that go together -- without having proved that doing more homework causes test scores to go up. (Take 10 seconds to see if you can come up with other variables that might be driving both of these things.)

Third, when homework is related to test scores, the connection tends to be strongest -- or, actually, least tenuous -- with math. If homework turns out to be unnecessary for students to succeed in that subject, it's probably unnecessary everywhere.

Along comes a new study, then, that focuses on the neighborhood where you'd be most likely to find a positive effect if one was there to be found: math and science homework in high school. Like most recent studies, this one by Adam Maltese and his colleagues[3] doesn't provide rich descriptive analyses of what students and teachers are doing. Rather, it offers an aerial view, the kind preferred by economists, relying on two large datasets (from the National Education Longitudinal Study [NELS] and the Education Longitudinal Study [ELS]). Thousands of students are asked one question -- How much time do you spend on homework? -- and statistical tests are then performed to discover if there's a relationship between that number and how they fared in their classes and on standardized tests.

It's easy to miss one interesting result in this study that appears in a one-sentence aside. When kids in these two similar datasets were asked how much time they spent on math homework each day, those in the NELS study said 37 minutes, whereas those in the ELS study said 60 minutes. There's no good reason for such a striking discrepancy, nor do the authors offer any explanation. They just move right along -- even though those estimates raise troubling questions about the whole project, and about all homework studies that are based on self-report. Which number is more accurate? Or are both of them way off? There's no way of knowing. And because all the conclusions are tied to that number, all the conclusions may be completely invalid.[4]

But let's pretend that we really do know how much homework students do. Did doing it make any difference? The Maltese et al. study looked at the effect on test scores and on grades. They emphasized the latter, but let's get the former out of the way first.

Was there a correlation between the amount of homework that high school students reported doing and their scores on standardized math and science tests? Yes, and it was statistically significant but "very modest": Even assuming the existence of a causal relationship, which is by no means clear, one or two hours' worth of homework every day buys you two or three points on a test. Is that really worth the frustration, exhaustion, family conflict, loss of time for other activities, and potential diminution of interest in learning? And how meaningful a measure were those tests in the first place, since, as the authors concede, they're timed measures of mostly mechanical skills? (Thus, a headline that reads "Study finds homework boosts achievement" can be translated as "A relentless regimen of after-school drill-and-skill can raise scores a wee bit on tests of rote learning.")

But it was grades, not tests, that Maltese and his colleagues really cared about. They were proud of having looked at transcript data in order to figure out "the exact grade a student received in each class [that he or she] completed" so they could compare that to how much homework the student did. Previous research has looked only at students' overall grade-point averages.

And the result of this fine-tuned investigation? There was no relationship whatsoever between time spent on homework and course grade, and "no substantive difference in grades between students who complete homework and those who do not."

This result clearly caught the researchers off-guard. Frankly, it surprised me, too. When you measure "achievement" in terms of grades, you expect to see a positive result -- not because homework is academically beneficial but because the same teacher who gives the assignments evaluates the students who complete them, and the final grade is often based at least partly on whether, and to what extent, students did the homework. Even if homework were a complete waste of time, how could it not be positively related to course grades?

And yet it wasn't. Again. Even in high school. Even in math. The study zeroed in on specific course grades, which represents a methodological improvement, and the moral may be: The better the research, the less likely one is to find any benefits from homework. (That's not a surprising proposition for a careful reader of reports in this field. We got a hint of that from Timothy Keith's reanalysis and also from the fact that longer homework studies tend to find less of an effect.[5])

Maltese and his colleagues did their best to reframe these results to minimize the stunning implications.[6] Like others in this field, they seem to have approached the topic already convinced that homework is necessary and potentially beneficial, so the only question we should ask is How -- not whether -- to assign it. But if you read the results rather than just the authors' spin on them -- which you really need to do with the work of others in this field as well[7] -- you'll find that there's not much to prop up the belief that students must be made to work a second shift after they get home from school. The assumption that teachers are just assigning homework badly, and that we'd start to see meaningful results if only it were improved, is harder and harder to justify with each study that's published.

If experience is any guide, however, many people will respond to these results by repeating platitudes about the importance of practice[8], or by complaining that anyone who doesn't think kids need homework is coddling them and failing to prepare them for the "real world" (read: the pointless tasks they'll be forced to do after they leave school). Those open to evidence, however, have been presented this fall with yet another finding that fails to find any meaningful benefit even when the study is set up to give homework every benefit of the doubt.

NOTES

1. It's important to remember that some people object to homework for reasons that aren't related to the dispute about whether research might show that homework provides academic benefits. They argue that (a) six hours a day of academics are enough, and kids should have the chance after school to explore other interests and develop in other ways -- or be able simply to relax in the same way that most adults like to relax after work; and (b) the decision about what kids do during family time should be made by families, not schools. Let's put these arguments aside for now, even though they ought to be (but rarely are) included in any discussion of the topic.

2. Valerie A. Cool and Timothy Z. Keith, "Testing a Model of School Learning: Direct and Indirect Effects on Academic Achievement," Contemporary Educational Psychology 16 (1991): 28-44.

3. Adam V. Maltese, Robert H. Tai, and Xitao Fan, "When Is Homework Worth the Time? Evaluating the Association Between Homework and Achievement in High School Science and Math," The High School Journal, October/November 2012: 52-72. Abstract at http://ow.ly/fxhOV.

4. Other research has found little or no correlation between how much homework students report doing and how much homework their parents say they do. When you use the parents' estimates, the correlation between homework and achievement disappears. See Harris Cooper, Jorgianne Civey Robinson, and Erika A. Patall, "Does Homework Improve Academic Achievement?: A Synthesis of Research, 1987-2003," Review of Educational Research 76 (2006): 1-62.

5. To put it the other way around, studies finding the biggest effect are those that capture less of what goes on in the real world by virtue of being so brief. View a small, unrepresentative slice of a child's life and it may appear that homework makes a contribution to achievement; keep watching, and that contribution is eventually revealed to be illusory. See data provided -- but not interpreted this way -- by Cooper, The Battle Over Homework, 2nd ed. (Thousand Oaks, CA: Corwin, 2001).

6. Even the title of their article reflects this: They ask "When Is Homework Worth the Time?" rather than "Is Homework Worth the Time?" This bias might seem a bit surprising in the case of the study's second author, Robert H. Tai. He had contributed earlier to another study whose results similarly ended up raising questions about the value of homework. Students enrolled in college physics courses were surveyed to determine whether any features of their high school physics courses were now of use to them. At first a very small relationship was found between the amount of homework that students had had in high school and how well they were currently faring. But once the researchers controlled for other variables, such as the type of classes they had taken, that relationship disappeared, just as it had for Keith (see note 2). The researchers then studied a much larger population of students in college science classes -- and found the same thing: Homework simply didn't help. See Philip M. Sadler and Robert H. Tai, "Success in Introductory College Physics: The Role of High School Preparation," Science Education 85 (2001): 111-36.

7. See chapter 4 ("'Studies Show...' -- Or Do They?") of my book The Homework Myth (Cambridge, MA: Da Capo, 2006), an adaptation of which appears as "Abusing Research: The Study of Homework and Other Examples," Phi Delta Kappan, September 2006.

8. On the alleged value of practice, see The Homework Myth, pp. 106-18, also available at http://bit.ly/9dXqCj.
