As usual, a ton of blogospheric attention has been devoted to the US News law school rankings. Over at PrawfsBlawg, Geoffrey Rapp has found a way to get the numerical rankings of law schools in the Third and Fourth Tiers. At TaxProf, Paul Caron ranks the law schools by reputation score. At Brian Leiter’s Law School Reports, Brian Leiter offers suggestions for improving the rankings. At Law Librarian Blog, Joe Hodnicki tracks law school rankings from 1996 to the present. I, too, have posted about the US News Rankings.
If we step back from this year’s frenzy, I believe there’s an important fact about law school rankings that accounts for much of the displeasure with them. Ranking systems have contradictory goals. Here’s why. Law schools, like many institutions, don’t change much in the short term; they evolve slowly, not dramatically. The result: we shouldn’t see much year-to-year movement in the rankings. Most schools should stay about where they are. A few schools might move over time, but any one year’s movement is insignificant in the grand scheme of things. So to be accurate, rankings shouldn’t change all that much.
But rankings systems have a contradictory goal: They need to reflect some kind of change, or else looking at the rankings each year would be like watching glaciers move. There must be some drama in the rankings year by year. We eagerly await our rankings each year, and we don’t want rankings at five or ten year intervals. And we don’t want stable rankings — we want changes to cheer and kvetch about.
There is another value in rankings reflecting some degree of change each year beyond our enjoyment of babbling on about them. Law schools work very hard on hiring new and lateral professors, promoting their reputations, improving their schools, increasing their admissions selectivity, and so on. We want our work to be reflected in a tangible manner. We want results for a year’s worth of hard work in improving the school. We don’t want to wait a decade or longer to see results. Unfortunately, the US News rankings often don’t reflect this work very well. But they do show that something is happening. We can then complain about the disconnect between what we’re doing and our ranking: “We did all this, and our ranking hasn’t moved. Damn that US News for their flawed system!” Or, we can justify rises in our rankings: “We’ve moved up several spots in the rankings. This is, of course, due to all the wonderful improvements we’ve been making to our school.” Either way, at least we have something to talk about.
The reality is that very little we do probably has much effect on our ranking relative to other schools over time. We might improve our faculty by hiring some great laterals, but over the same period, our competitor schools will likely have done the same. True, one school might outpace another, but big shifts are the exception, not the norm.
So the rankings need to reflect a state of affairs that is largely static, with a few gradual changes over the course of a long time. They must do so in a way that keeps people interested and excited. The rankings must display glacial change in a dramatic way. To use another metaphor, the rankings must make a turtle race seem exciting.
A few years ago, Dan Filler and I created a chart of the US News rankings for the top 25 law schools from 1997 to 2006. The interesting thing about the chart is how little most schools moved over that period. Consider Cornell Law School. In 1997, Cornell was ranked 12; over the next decade, its ranking went 12, 10, 10, 12, 13, 10, 12, 11, 13, 12. When it drifted from 10 to 13 over the course of a few years, there were probably cries of outrage over dropping out of the top 10. When it suddenly jumped from 13 back to 10, there were probably great cheers. Headline: “Cornell dramatically rises to the top 10!” In reality, Cornell is trapped in an orbit around 11.5 (its average ranking over the past decade), and it rarely strays much higher or lower than that. From year to year, it appears that Cornell is moving. But that’s just a clever illusion, created by US News to achieve the two contradictory goals of rankings.
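The arithmetic behind that “orbit” is easy to check. Here’s a quick sketch in Python, with Cornell’s rankings transcribed from the chart above:

```python
# Cornell's US News ranks, 1997 through 2006, as listed above
ranks = [12, 12, 10, 10, 12, 13, 10, 12, 11, 13, 12]

# Average over the decade following 1997 (the last ten entries)
decade = ranks[1:]
average = sum(decade) / len(decade)

# Total spread of movement across the entire period
spread = max(ranks) - min(ranks)

print(average)  # 11.5: the orbit Cornell is trapped in
print(spread)   # 3: the full range of movement in ten years
```

A spread of three spots in a decade is about as glacial as it gets; only the year-to-year shuffling makes it look like motion.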
Paul Caron provides links to law schools responding to this year’s rankings. David Lat collects emails from law schools responding to rankings fluctuations.
At the end of the day, I believe in the following points:
1. For the average law school, US News ranking doesn’t change that dramatically. Only a few law schools make any major advances or drops in rankings.
2. In reality, schools don’t change that rapidly. Some schools that appear to have moved significantly in their US News ranking may have moved due to changes in methodology more than actual changes in the institution.
3. The legal world goes into a frenzy each year when the rankings come out, but changes in the rankings from one year to the next can’t possibly have any meaning. What matters is changes that occur over the course of a long period of time.
4. US News knows how to sell issues. Its rankings must change each year, or else nobody would care to buy the issue each year. It knows the two contradictory goals of ranking systems. Its solution is a ranking system that shuffles things around a little bit each year, enough to give us the drama we crave. Although most schools go up and down each year, over the course of time, they basically stay in the same place.
It is true that some schools move significantly over a period of time. So there are exceptions, but there aren’t very many.
To be more meaningful, rankings should probably be done at five-year intervals rather than annually, with information from the entire five-year period factored into the rankings rather than just a single year’s data. Would such a ranking system be successful? Probably not. US News wants to sell issues each year, not every five years. Moreover, we don’t want to wait five years for a new ranking. We want something exciting to talk about each spring.
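One simple version of a multi-year ranking would be a trailing average. As an illustrative sketch only (the five-year window is my own choice for the example, not anything US News actually does), applied to Cornell’s numbers from above:

```python
# Cornell's year-by-year ranks, 1997-2006
ranks = [12, 12, 10, 10, 12, 13, 10, 12, 11, 13, 12]

def trailing_average(values, window=5):
    """Average each year's rank with the preceding years,
    once a full window of data exists."""
    return [round(sum(values[i - window + 1 : i + 1]) / window, 1)
            for i in range(window - 1, len(values))]

print(trailing_average(ranks))
# [11.2, 11.4, 11.0, 11.4, 11.6, 11.8, 11.6]
```

Note what the smoothing does: the dramatic one-year jump from 13 back to 10 becomes a shift from 11.8 to 11.6. Accurate, but dull — which is exactly why nobody would buy it.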
And so the game will continue on. . . .
Originally Posted at Concurring Opinions
* * * *
This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy training, data security training, HIPAA training, and many other forms of awareness training on privacy and security topics. Professor Solove also posts at his blog at LinkedIn. His blog has more than 1 million followers.