Scamming the US News College Rankings Scam


Stephen Budiansky, who worked at US News from 1986 to 1998, discusses the scam of the magazine’s college rankings and the various ways in which colleges scam said scam rankings.

To increase selectivity (one of the statistics that go into U.S. News’s secret mumbo-jumbo formula to produce an overall ranking), many colleges deliberately encourage applications from students who don’t have a prayer of getting in. To increase average SAT scores, colleges offer huge scholarships to un-needy but high-scoring applicants to lure them to attend their institution. (The Times story mentioned that other colleges have been offering payments to admitted students to retake the test to increase the school average.)

One of my favorite bits of absurdity was what a friend on the faculty at Case Law School told me they were doing a few years ago: because one of the U.S. News data points was the percentage of graduates employed in their field, the law school simply hired any recent graduate who could not get a job at a law firm and put him to work in the library.

Their other tactic was pure genius: the law school hired as adjunct professors local alumni who already had lucrative careers (thereby increasing the faculty-student ratio, a key U.S. News statistic used in determining ranking), paid them exorbitant salaries they did not need (thereby increasing average faculty salary, another U.S. News data point), then made it understood that since they did not really need all that money they were expected to donate it all back to the school (thereby increasing the alumni giving rate, another U.S. News data point): three birds with one stone!


As someone who knew a little math, what really drove me bonkers about the college guide was:

(a) the logical absurdity of adding together completely unrelated statistics to produce a single measure of merit—the key point being that you can produce an astonishing range of different results depending on the relative weight each component factor is assigned. And there is simply no logical, a priori basis for establishing such a weighting objectively. Do SAT scores count 30% of the total score? 32.2%? 18.78234%? (How about zero?) It’s the classic apples + oranges – bananas/kumquats = fruit salad approach to statistics, and is completely meaningless.
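To make point (a) concrete, here is a toy sketch with two hypothetical schools and made-up weights (not the actual U.S. News formula, which is not public) showing how the choice of weights alone can reverse a ranking:

```python
# Hypothetical data: two schools with normalized scores (0-100) on two
# unrelated factors. Neither school is "better" in any absolute sense.
schools = {
    "School A": {"sat": 90, "alumni_giving": 60},
    "School B": {"sat": 70, "alumni_giving": 95},
}

def composite(stats, weights):
    """Weighted sum of the component scores."""
    return sum(weights[k] * v for k, v in stats.items())

def ranking(weights):
    """Schools ordered by composite score, best first."""
    return sorted(schools, key=lambda s: composite(schools[s], weights),
                  reverse=True)

# Weight SAT scores heavily: School A comes out on top.
print(ranking({"sat": 0.7, "alumni_giving": 0.3}))  # ['School A', 'School B']

# Weight alumni giving heavily instead: the order flips.
print(ranking({"sat": 0.3, "alumni_giving": 0.7}))  # ['School B', 'School A']
```

Nothing about either school changed between the two printouts; only the arbitrary weighting did.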

(b) the fact that the entire exercise was designed to emphasize noise over signal: tiny, random fluctuations from year to year would result in regular changes in the final rankings. Even within its own absurd methodology, no one ever dared broach the question of the actual statistical significance of the differences between the “No. 1” school and, say, the No. 5 school. In fact, there was pretty clearly none. It is of course ridiculous to think that when Harvard, Stanford, Yale, whoever changed places from one year to the next in the final rankings this reflected any actual sudden change in the underlying quality of the schools. But the only way to keep selling the damned guide each year was to make sure things kept changing from year to year.
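A quick simulation makes point (b) vivid: give several hypothetical schools identical underlying quality, add a little measurement noise each “year,” and watch the No. 1 spot churn (made-up numbers, not real data):

```python
import random

random.seed(0)  # reproducible illustration

# Five hypothetical schools whose "true" quality is exactly identical.
true_quality = {f"School {c}": 80.0 for c in "ABCDE"}

def yearly_ranking():
    """Add tiny random measurement noise and rank on the noisy scores."""
    noisy = {s: q + random.gauss(0, 0.5) for s, q in true_quality.items()}
    return sorted(noisy, key=noisy.get, reverse=True)

# Simulate ten annual editions of the guide.
leaders = [yearly_ranking()[0] for _ in range(10)]
print(leaders)
```

Because the differences between schools are pure noise, which school is “No. 1” in any given year is essentially a coin flip, yet a new winner makes headlines (and sells magazines) every time.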

What’s amazing is not that the magazine used this gimmick to increase sales or that some try to game the system but that almost all the colleges and universities in America willingly went along with it. I can sort of understand why a second- or third-tier institution would tout its rankings in some tertiary category (Best value of any medium-sized liberal arts college in the Southwest!) as a means of claiming prestige. But what did Harvard, Yale, Princeton, Stanford, and the like have to gain by playing along?

FILED UNDER: Education, Media
James Joyner
About James Joyner
James Joyner is Professor and Department Head of Security Studies at Marine Corps University's Command and Staff College and a nonresident senior fellow at the Scowcroft Center for Strategy and Security at the Atlantic Council. He's a former Army officer and Desert Storm vet. Views expressed here are his own. Follow James on Twitter @DrJJoyner.


  1. FWIW, I think the reinvention we need is much deeper than rankings:

    Nonetheless, it is interesting to speculate: Suppose the educational system is drastically altered to reflect the structure of society and what we now understand about how people learn. How will what universities teach be different? Here are some guesses and hopes.

    That by Lawrence Summers, here.

    … so, ranking in a broken model (oh! an “unsustainable” model), is kind of boring.

  2. To increase average SAT scores, colleges offer huge scholarships to un-needy but high-scoring applicants to lure them to attend their institution.

    So all merit-based scholarships are a “scam” now? It seems to me this would exist even absent the ratings as universities would want to encourage the best and brightest to attend for purely academic reasons.

  3. @Stormy Dragon:

    But the scholarships are “huge,” and that makes all the difference 😉

  4. @john personna:

    I should note I attended Penn State on a full-tuition, merit-based scholarship. It should also be noted that even with that “huge” scholarship I was $20k in debt by graduation due to room and board, fees, textbooks, etc.

  5. @Stormy Dragon:

    And I should note that I attended a Cal State uni back in the day, with less than $100 tuition, per semester.

  6. Rob in CT says:

    @Stormy Dragon:

    That one jumped out at me too (one of these things is not like the other…).

    The rankings were pretty obviously silly, but people love silly rankings.

  7. murray says:

    In light of this I would be interested in seeing international ranking measures examined.

    I always found suspicious the overwhelming presence of American colleges in these rankings.

    Are our colleges really that much better at teaching, or are they just better at skewing the measurements?

  8. mantis says:

    But what did Harvard, Yale, Princeton, Stanford, and the like have to gain by playing along?

    They fight over the top spots on the National Universities list. The rankings at the very top change very, very little, but there are occasional flips of, say, number two and number three. Those schools fight over that, and why not? They have plenty of money to pay people to do so.

  9. US News also ranks high schools, BTW. But get this: until recently, the only criterion used to rank high schools was the ratio of students who actually take the year-end AP exam to students enrolled in Advanced Placement courses. That’s it. No other calculation.

    Of course, high schools scammed that. Here’s how my daughter’s high school did it. The week after school opened in August, they told AP students’ parents to pony up the entire fee for every AP exam for every AP course, so they could immediately report to College Board: “One hundred percent!”

    Yet the fees are not due from the school to College Board until March. When my sons were in a different high school, we didn’t even have to make a deposit on the exam fees until January.

    I have politely told my daughter’s high school each August that we’ll send them a check in January.

    Since then, US News has changed its methodology, but it’s still far from scam-proof.