The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
The New Year Brings a New Formula for US News Law School Rankings
U.S. News & World Report is making some significant changes to the way it ranks law schools.
Today the Wall Street Journal reports that U.S. News & World Report is making some dramatic changes to the formula it uses to rank law schools, partly in response to complaints from law schools about aspects of the rankings.
In a letter sent Monday to deans of the 188 law schools it currently ranks, U.S. News said it would give less weight in its next release to reputational surveys completed by deans, faculty, lawyers and judges and won't take into account per-student expenditures that favor the wealthiest schools. The new ranking also will count graduates with school-funded public-interest legal fellowships or who go on to additional graduate programs the same as they would other employed graduates.
U.S. News said its rankings team held meetings with more than 100 deans and other law-school administrators in recent weeks. They embarked on the listening tour after Yale Law School—perennially ranked at No. 1—said it would no longer provide information to help U.S. News compile its list. . . .
The shift in methodology may be due in part to necessity. Though U.S. News pulls much of its data from the American Bar Association and said it would rank schools whether or not they cooperated, it relies on schools to provide the spending figures and to complete peer-review surveys. . . .
Mr. Morse and Ms. Salmon said they also heard concerns in their meetings about how U.S. News considers diversity and loan forgiveness and potentially encourages awarding scholarships based on LSAT scores rather than on financial need. They wrote in the Monday letter that those issues "will require additional time and collaboration to address" so won't be overhauled now.
At his Excess of Democracy blog, Professor Derek Muller offers some preliminary analysis of how these changes could affect the rankings, identifying the schools he expects to gain and lose under the new formula. He concludes:
I feel fairly confident that a handful of the schools identified above as winners in several categories, including Alabama, BYU, Georgia, and Texas A&M, will benefit significantly in the end, but one never knows for sure. It also has the potential to disrupt some of the more "entrenched" schools from their positions, as the more "legacy"-oriented factors, including spending and the echo chamber of reputational surveys, will receive less value. Law schools must increasingly face the value proposition for students (e.g., lower debt, better employment outcomes), with some other potential factors in the mix, in the years ahead.
Editor's Note: We invite comments and request that they be civil and on-topic. We do not moderate or assume any responsibility for comments, which are owned by the readers who post them. Comments do not represent the views of Reason.com or Reason Foundation. We reserve the right to delete any comment for any reason at any time. Comments may only be edited within 5 minutes of posting. Report abuses.
2023: Meet the New Year, same as the Old Year. (C'mon Man! It's 2023 already; where are the flying cars? MLB teams in Japan?? Porn you can download to your Dick Tracy watch?? OK, 1 out of 3 ain't bad...)
Our Future(s) I foresee: a (Very Bad) Reverend Jerry Kirtland-Sandusky post that will mention a certain Law School in South Texas (I like the South, I like Texas; it's like the best of both).
Frank
It doesn't really matter what their methodology is; U.S. News & World Report is so biased that no one takes it seriously.
The old 'system' favored the elite schools regardless of actual merit and the new 'system' favors the elite schools regardless of actual merit.
How about ranking them on the number of graduates who work as public defenders? Add extra points for each graduate with no student loans.
Forty years ago, US News and World Report was a respected weekly news magazine that people (including myself) paid money to have mailed to them each week. As the quality declined, a lot of people (including myself) stopped paying to have the magazine mailed to them each week -- and it stopped printing in 2010.
It's not bias as much as credibility -- they are living off a past legacy.
Add extra points for each graduate with no student loans.
I don't see the value of that. Schools with more rich students get a higher rating?
Instead, what about including tuition costs in the ranking - the lower the better.
The whole rankings business is largely nonsense. Any time you start assigning weights, they are going to be arbitrary. Who says a factor that counts, say, ten points is really twice as important as one that counts five? And of course, it's not clear that the factors are independent of each other, which confounds the rankings.
Interesting ... makes me think that USN&WR should turn this into a JavaScript page where you can fiddle with the weights to match your own circumstances. Rich and don't care about tuition? Set that weight to 0. Got high enough grades and an LSAT score that you think you have a good shot at a scholarship? And so on.
It might even be useful enough to make some money off.
No need to disclose the raw data.
That would be amazing. Although I believe you'd be able to figure out the raw data by comparing enough inputs and outputs. But I don't really care about that much.
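For the curious, here is one way the fiddle-the-weights idea floated above might look. This is only a minimal sketch, written in TypeScript (close to the JavaScript page the commenter imagines); every school name, metric, and number in it is invented for illustration. It normalizes each metric across schools, then sorts by a user-weighted sum, with a weight of 0 disabling a factor.

// Minimal sketch of a user-adjustable ranking. Each school exposes a few
// metrics (higher = better), the user supplies weights (0 disables a
// factor), and schools are sorted by the weighted sum of their normalized
// scores. All names and numbers below are hypothetical.

interface School {
  name: string;
  metrics: Record<string, number>;
}

type Weights = Record<string, number>;

// Rescale one metric to [0, 1] across all schools, so that the weights,
// not the metrics' raw units, determine each factor's influence.
function normalize(schools: School[], metric: string): Map<string, number> {
  const values = schools.map((s) => s.metrics[metric]);
  const min = Math.min(...values);
  const range = Math.max(...values) - min || 1; // guard against all-equal values
  return new Map(
    schools.map((s): [string, number] => [s.name, (s.metrics[metric] - min) / range]),
  );
}

function rank(schools: School[], weights: Weights): { name: string; score: number }[] {
  const tables = Object.keys(weights).map((m) => ({ m, table: normalize(schools, m) }));
  return schools
    .map((s) => ({
      name: s.name,
      score: tables.reduce(
        (sum, { m, table }) => sum + weights[m] * (table.get(s.name) ?? 0),
        0,
      ),
    }))
    .sort((a, b) => b.score - a.score);
}

// Hypothetical data and two hypothetical applicants.
const schools: School[] = [
  { name: "School A", metrics: { employment: 0.92, medianLSAT: 168, affordability: 0.30 } },
  { name: "School B", metrics: { employment: 0.85, medianLSAT: 162, affordability: 0.80 } },
  { name: "School C", metrics: { employment: 0.78, medianLSAT: 158, affordability: 0.95 } },
];

// Debt-averse applicant: affordability dominates.
console.log(rank(schools, { employment: 0.3, medianLSAT: 0.1, affordability: 0.6 }));
// Rich applicant who doesn't care about tuition: that weight is set to 0.
console.log(rank(schools, { employment: 0.6, medianLSAT: 0.4, affordability: 0 }));

As the reply above suggests, though, publishing enough weighted outputs would let a determined user reconstruct the raw inputs, so the hope of not disclosing the raw data probably wouldn't hold up for long.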
I've thought of similar things for public K-12 school districts. Instead of having report cards decide what to measure and how to weight everything, just make massive amounts of data available and let people decide what's important to them. Average GPA, average SAT/ACT/PSAT scores, gains over time in those scores, property taxes, sports clubs, graduation rates, college acceptance rates, discipline rates, swimming pool size. Just put it all out there in one place.
Good idea.
People don't have the same preferences, after all, so why try to cram everyone into the same mold?
Why do you think those would be informative pieces of information in assessing school quality?
At the high end of the ranking scale, there is a near-perfect correlation between the rankings and average LSAT score of the class. That isn't going to change, except perhaps if many schools choose to abandon the LSAT. (A cynic might suggest that Yale's attempt to divorce itself from the US News ranking is merely a precursor to an announcement that it is abandoning the LSAT.)
And while Professor Muller's analysis is interesting (at least statistically), the reality is that no one should read the law school rankings in a strict mathematical sense, as if #32 is better than #36 or #44. The rankings are only useful in a broad classification sense. Florida moving from #31 to #21, or Utah moving from #47 to #37, isn't going to change who chooses to go to those schools, or how those schools are perceived by employers.
What, no mention of the cheating scandal, the fudged figures supplied to USN&WR? I'd think it has at least a little to do with the impetus to change.
"The new ranking also will count graduates with school-funded public-interest legal fellowships or who go on to additional graduate programs the same as they would other employed graduates."
I seem to remember the ABA -- yes, the ABA -- having problems with this.
Back then, law schools were outright hiring their own graduates -- and then firing them once their employment status no longer counted against the law school in the ABA's metrics.
Same thing here -- the kid has a job for a year and then what?!?
Same thing with those who go on for further degrees after the JD -- fine, don't count them at all right now -- but count them when they graduate with the other degree -- and are still unemployed....
Does Leonard Leo personally choose the Federalist Society members who receive shout-outs from the Volokh Conspiracy and other white, male, right-wing legal blogs?
Is it practical to survey students in five years and see how they are doing?
My Med school's been doing that for 30+ years; I haven't answered a one.
Is it practical to survey students in five years and see how they are doing?
Do you mean practical as in "Can it be done?" or practical as in "Would it be informative?"
It can be done. It would not be particularly informative. The homeless alums are unlikely to respond.
It could be done from IRS income data....
I agree that students are better off finding a school that’s right for them without worrying about differences in rankings, especially small ones.
They might want to consider surveying students five or ten years out to determine post-graduation outcomes rather than relying solely on the year after graduation. Lesser-ranked schools may give students skills or grit whose real value becomes more apparent later in their careers.
The current system wouldn't have any way of detecting this. Even setting aside their gameability, first-job prospects may depend too much on the school's ranking, which is essentially self-referential, rather than on anything outside the ranking that the school actually does for its students.