The Department of Education's College Scorecard


I'm giving the Department of Education's recently released College Scorecard a D+. Although it provides college-bound students with some useful information, it misleads them about institutional performance by focusing on the six-year graduation rate to "score" colleges and universities. Use of the six-year rate is misleading for two primary reasons.

First, the Scorecard's focus on graduation statistics for first-time, full-time freshmen omits consideration of the many other college students who do not matriculate as first-time, full-time freshmen. Thus, the six-year rate is significantly under-inclusive when it comes to measuring institutional graduation outcomes. The College Scorecard at least alludes to this fact when it notes that "[g]raduation rate data are based on undergraduate students who enrolled full-time and have never enrolled in college before," and that "[t]his may not represent all undergraduates that attend this institution."

Second, and more significantly, the six-year graduation rate is misleading because it fails to account for the different student populations served by different institutions. Student demographics such as age and income have an enormous impact on graduation rates, as do the academic credentials of entering students. In Crossing the Finish Line: Completing College at America's Public Universities, for example, William G. Bowen and his co-authors confirm what we already know: students from the upper ranges of socio-economic status are more likely to graduate than those from the lower ranges, and more likely to graduate quickly. The six-year graduation rate has more to do with which students an institution excludes or doesn't serve than with the performance of the institution itself.

I don't mean to suggest that colleges and universities can't improve the persistence and graduation rates of their students. Demography is not destiny. We can help our students persist and graduate, even those students with special challenges. But it is not possible to judge the performance of institutions adequately without taking student demographics into account.

By ignoring student demography, the Department of Education's College Scorecard is not a scorecard at all, at least when it comes to graduation rates. It is mainly an admissions profile. It tells inquirers more about their likelihood of being admitted to an institution and being able to afford to attend that institution than about their likelihood of persisting at and graduating from that institution. This is not trivial information, but it is not the sort of information one would expect on a "scorecard."

When it comes to measuring performance, states interested in fairly evaluating their colleges and universities have better models than the College Scorecard. Tennessee, for example, adjusts its persistence and graduation results by giving extra credit to institutions serving adults and low-income students. Using the standard six-year graduation rate without adjusting for such student demographics would unfairly penalize institutions for serving those students and frustrate state and national efforts to increase the number of college graduates. It's a pity that the Department of Education's College Scorecard perpetuates a backwards and misleading understanding of graduation rates.
