The complaints about US News and World Report's rankings go back further than that, however. Here's a story from 1989 about college presidents meeting with the magazine in 1987 to complain.
33 years after that visit, the magazine is gone but the rankings persist.
I won't rehearse all the complaints about the rankings here. They're well-known, largely correct, and utterly ineffectual. The basic complaint is that they motivate bad behaviors and decisions among students, high schools, and institutions.
What if the rankings could motivate good behaviors instead? That's what Sen. @ChrisCoons pushed US News to do in 2016, when he called on them to make accessibility and affordability part of the rankings. coons.senate.gov/news/press-rel…
US News did it. They added a social mobility component that looks at the graduation rates of students on Pell Grants and factors in how many students at each school receive a Pell.
It's hard to entirely disentangle all the inputs and assign a cause without full access to the data, but almost all of the 25 schools that are the worst at enrolling low-income students dropped in the rankings. Miami (OH) and Dayton both dropped more than 20 spots.
Currently, the social mobility score counts for just 5% of a school's overall score. **Imagine if it counted for 25%!** I think we'd see a radical shift in the rankings, which would force a lot of schools to correct their priorities.
I'm aware that some other magazines place more emphasis on social mobility rankings, but there are two issues here:
1. Their rankings have methodological problems that perversely reward wealthy schools with low Pell shares.
2. USN has cornered the market.
I think that USNWR's Best Colleges lists actually do a better job at ranking social mobility by focusing on what the available data can actually measure: access and completion.
Given the dire state of the economy and of university budgets, it's vital to protect the interests of low-income students. It's in everyone's interest to do so.
This is such good reporting from the @harvardcrimson:
There are about 27,000 high schools in the U.S.
Over the past 15 years, 1 in 11 students at Harvard have come from just 21 high schools.
So 9.1% of Harvard students come from 0.07% of US schools. @nytdavidbrooks
This is no accident. It's a stated priority of Harvard admissions.
The longtime dean of admissions said they're in the business of creating 100-year relationships with schools. He said this in a trial where Harvard was, believe it or not, trying to show it's fair.
Legacy, too, plays its role, as these are the kinds of schools where wealthy alumni send their kids.
The most heavily weighted single factor in the Best Colleges rankings is Undergraduate Academic Reputation, which USN calls "Expert Opinion."
Here's the thing: there is absolutely no way the presidents, provosts, and deans of admissions who receive the survey can be qualified to answer the questions, let alone claim expertise.
Let's talk about some dumb stuff people say about test optional admissions. 🧵
This might take a sec, so here's the tl;dr:
TO policies, in and of themselves, are neither a cure-all for what's wrong with American higher ed nor the end of what's good about it, but the evidence points to their doing some good and no harm.
Let's define TO first.
A test-optional policy is one that allows applicants to decide whether they want their test score to be considered. It does not "get rid of tests" or "ban tests."
Almost every 4-yr college in the US is currently test optional.
For decades, colleges, med schools, and law schools have all made the point that standardized tests exist to show readiness to succeed in college or grad school.
Rankings were one of the incentives to focus on scores well beyond the readiness threshold and overemphasize tests. That emphasis has excluded lots of people who were highly qualified to become lawyers and doctors.