Six Reasons Why the U.S. News College Rankings Stink
1. Colleges have a perverse incentive to juke the stats. Even if they're not fabricating data outright, colleges are doing their damnedest to massage the numbers. As College Confidential reports, "some schools have submitted data that excludes scores from 'special admissions' (e.g., athletes, students identified as learning disabled, etc.). One school reportedly left out the verbal scores of international students but kept the math scores." Admissions officers play the numbers game to the hilt because their jobs are on the line. The Atlantic published an article a few years back that offered a disturbing glimpse of this:
"There's pressure for rankings," says Tom Green, the associate vice-president for enrollment services at Seton Hall. "There's no doubt about it. Presidents get pressure from board members, from alumni. They'll say, You're number eighty-seven. How are you going to get to be number eighty-five?" Eugene Trani, the president of Virginia Commonwealth University, a Tier III school in the U.S. News rankings, carries a laminated card in his pocket to remind him of the school's strategic goal of making it to the next tier. For every year the school stays in the higher tier he will receive a $25,000 bonus—a fact first reported by AGB Priorities, an industry newsletter. A vice-president at Hobart and William Smith was fired when she failed to report current data to U.S. News and the school's ranking dropped from Tier II to Tier III.
[Image: Me College Monster! Me gobble up more students for better rankings! OM NOM NOM NOM]
3. Poor students are disproportionately punished. The Atlantic article cited above also reports that "low-income students often suffer in this process. They drag on both revenue and academic profile, and it's hard for outsiders to tell when their numbers are reduced." As an example, consider the weight given to SAT and ACT scores in the rankings. Those tests count for more (7.5%) in the Best Colleges formula than class rank does (6%), even though they are worse than high school GPA at predicting academic performance in college. And SAT and ACT scores correlate far better with family income than with academic performance. So when colleges chase higher test-score averages for the sake of the rankings, poorer students are the first to suffer, and the benefit to the learning environment is questionable at best.
4. The rankings foster cutthroat competition. The win-at-all-costs mentality of Wall Street has spread to high schoolers and their parents. They spend thousands of dollars on SAT and ACT test prep, some of which goes to yours truly. A few (I'm sure you've met them) obsess over the rankings and the prestige of colleges rather than figuring out which schools might actually be a good fit. This is understandable, given the extreme competitiveness of admissions at the top-tier schools on the U.S. News list. I think, however, that we ought to push back against the notion that college admissions are a (deadly serious) game to be won. For starters, a study tracking the long-term success of students who were accepted at and rejected from "elite" colleges found no difference in their earnings 20 years on. The only exceptions to this rule were poor and minority students -- another way in which amped-up college competitiveness disproportionately hurts the disadvantaged. Jay Mathews of the Washington Post wrote a whole book about the silliness of our reverence for brand-name schools, arguing that we make an already stressful process even more so. And Mark Speyer, a college counselor at a prep school in Manhattan, laments the effect of high-stakes admissions on his students' very souls:
I believe that the new obsession with numbers is counter-educational: that it makes for a less educated and less educable kind of student; a less thoughtful, more cynical, more boring, and more exhausted kind of student, not at all the kind of student that colleges on a clear day know they want and need. Pumped-up SATs do not mean better students.
[Image: More rejection letters = $$$$$$$]
6. The rankings are not actually objective. Those who call the U.S. News rankings "objective" -- including Mr. Morse -- are suckered by what I call the "Objective Yardstick Fallacy". A single measure used to assess a complex situation only merits the term "objective" if people can generally agree on the value of the results. A yardstick is good at measuring length, for instance, because we agree on the definition of length and on the units we use to measure it. When we're measuring weight or electrical current, we reach for a scale or an ammeter and set the yardstick aside without qualms. But if we need to measure a length, a weight, and a current all at once, it makes as little sense to combine those results into a single number as it does to express with a single number whether one college has more "academic quality" than another. Not only do we disagree about what constitutes "academic quality"; the aggregate ranking necessarily embodies the prejudices of the aggregator. It is the polar opposite of an "objective yardstick". Most students probably don't agree with Best Colleges that alumni giving rate (5% of the total) matters more than student/teacher ratio (1%), for example. (I plan to write a post at some point about how the Objective Yardstick Fallacy applies to the use of SAT scores in particular.)
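To see how much the aggregator's choices matter, here's a minimal Python sketch with two hypothetical colleges and made-up scores, showing that the "winner" of a composite ranking is decided entirely by the weights the ranker happens to pick:

```python
# Hypothetical sketch: the "winner" of a composite ranking depends
# entirely on the weights the aggregator chooses. All scores are made up.

colleges = {
    # normalized 0-to-1 scores on two Best Colleges factors (invented)
    "College A": {"alumni_giving": 0.90, "student_teacher": 0.40},
    "College B": {"alumni_giving": 0.30, "student_teacher": 0.95},
}

def composite(scores, weights):
    """Weighted sum of a college's factor scores."""
    return sum(scores[factor] * weight for factor, weight in weights.items())

us_news_style = {"alumni_giving": 0.05, "student_teacher": 0.01}  # 5% vs. 1%
student_style = {"alumni_giving": 0.01, "student_teacher": 0.05}  # flipped

for label, weights in [("U.S. News-style", us_news_style),
                       ("Student-centric", student_style)]:
    ranked = sorted(colleges, key=lambda c: composite(colleges[c], weights),
                    reverse=True)
    print(f"{label} weights rank {ranked[0]} first")
```

Same colleges, same data, opposite verdicts; the only thing that changed was the ranker's priorities.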
What Do You Think?
That's what the U.S. News rankings look like from my perspective (which, in brief, is that of an SAT tutor who loathes the SAT). I'd love to hear what your experience of Best Colleges has been as a parent, a student, or a teacher. Please let me know in the comments!