We all ought to be skeptical of this mania for lists. Many of them use questionable data (drawn from anonymous surveys or other sketchy sources) to pin numbers on aspects of college life that are very difficult to quantify. Even the lists that use easily quantifiable criteria, such as the one everybody has heard of, skew the results through the weights their methodology assigns. In the U.S. News ranking formula, for example, "alumni giving rate" counts for 5% of a school's ranking, while "student/teacher ratio" counts for only 1%. It's nice to know that the alumni have deep pockets, but is that really five times as important as a critical aspect of the classroom experience?
That's why, in a post on this subject a few months ago, I argued for a kind of modular ranking system, one in which students and parents choose which criteria matter to them. I also liked the idea of a ranking system based on a school's commitment to the core curriculum.
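To make the idea concrete, here's a rough sketch in Python of what a modular ranking might look like. The school names, criteria, scores, and weights are all invented for illustration; a real system would need vetted data and properly normalized scales.

```python
# A minimal sketch of a "modular" ranking: the reader, not the
# publisher, supplies the weights. Everything below is hypothetical.

def rank_schools(schools, weights):
    """Rank schools by a weighted sum of per-criterion scores.

    schools: dict mapping school name -> dict of criterion -> score
    weights: dict mapping criterion -> weight chosen by the reader
    """
    def composite(scores):
        # Only criteria the reader has weighted contribute to the total.
        return sum(weights.get(c, 0) * v for c, v in scores.items())

    return sorted(schools, key=lambda name: composite(schools[name]),
                  reverse=True)

# Hypothetical scores, each on a 0-100 scale.
schools = {
    "Alpha College":   {"class_experience": 90, "alumni_giving": 40},
    "Beta University": {"class_experience": 55, "alumni_giving": 95},
}

# A parent who cares about the classroom might weight it this way...
print(rank_schools(schools, {"class_experience": 0.9, "alumni_giving": 0.1}))
# ...while a donor-heavy weighting flips the order.
print(rank_schools(schools, {"class_experience": 0.1, "alumni_giving": 0.9}))
```

The point of the toy example is that the very same data produces different orderings depending on whose weights you use, which is exactly why the choice of weights shouldn't be left to the magazine.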
If you're interested in learning more about college rankings, I encourage you to check out my other posts on the subject, as well as this extremely interesting article by Malcolm Gladwell on the misleading objectivity of lists.