Predictive Test Scores and Diploma Privilege


The International Baccalaureate program, which credentials high-school students who take college-level classes, canceled its exams this year because of COVID-19. But that did not stop the program from granting exam grades: it generated them with a predictive algorithm. According to Wired, "The system used signals including a student's grades on assignments and grades from past grads at their school to predict what they would have scored had the pandemic not prevented in-person tests."

Unsurprisingly, this is controversial. And without the precise formula or algorithm that the program used to calculate grades, it is difficult to assess. But the core idea makes a lot of sense: Measure students with a system of decentralized grades, then normalize the grades from different schools or teachers based on how predictive those grades have been of some other performance measure. At least, this seems like a reasonable approach if it is impossible to give the test and we still need to distinguish students. Use the best information available, including historical data.

Indeed, it would be an improvement over current practice if U.S. News and World Report normalized students' grades in this way. How would this work? In effect, a predicted LSAT score would be calculated based on one's GPA, given the GPA distribution and LSAT distribution at one's undergraduate school. So, if one received a 75th percentile GPA, that would be translated into a 75th percentile LSAT score for the same school. With enough data, the prediction might be based on GPA across majors, to account for tougher grading in some departments than others. A student would receive an actual LSAT score too, but this approach would make comparisons of GPAs across schools much more meaningful.
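
The percentile-matching step can be sketched in a few lines (a minimal illustration with made-up numbers; the function name and data are my own, not IB's or U.S. News's actual formula):

```python
import numpy as np

def percentile_matched_lsat(student_gpa, school_gpas, school_lsats):
    """Map a student's GPA percentile at their school onto the same
    percentile of that school's LSAT distribution."""
    # Fraction of the school's students at or below this student's GPA
    pct = 100.0 * np.mean(np.asarray(school_gpas) <= student_gpa)
    # Read off the matching percentile of the school's LSAT distribution
    return float(np.percentile(school_lsats, pct))

# Hypothetical distributions for one undergraduate school
gpas = [2.8, 3.0, 3.2, 3.4, 3.6, 3.8, 4.0]
lsats = [148, 152, 155, 158, 161, 165, 170]
print(percentile_matched_lsat(3.6, gpas, lsats))
```

A real implementation would use the full grade and score distributions that the Credential Assembly Service already reports and might, as suggested above, condition on major as well.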

Law schools already have access to this information. Admissions offices receive data from the Credential Assembly Service, including the distribution of grades and of LSAT scores at an applicant's undergraduate school. Whether or not they make explicit computations, law school admission officers have a sense of differences in quality across undergraduate schools and of differences in grade inflation. But because U.S. News asks law schools to report undergraduate GPA, rather than a normalized measure that would be modestly more difficult to calculate, many law schools place considerable weight on the unnormalized result. And because U.S. News cares only about median GPAs and LSAT scores, a law school can improve its standing with a splitter strategy, including sometimes accepting a student with a high GPA from a low-ranked school with grade inflation, even if that student has a very low LSAT score.

Predictive grading--that is, a system of decentralized grading normalized based on overall performance of students on a standardized test--ought to be welcomed by those who are skeptical that standardized test scores are a good measure of individual talent. True, with such a scheme, standardized test scores are still used as the normalization measure. Thus, if a school teaches to the test and artificially increases its standardized test scores at the expense of more important learning, it will benefit. But schools already have an incentive to teach to the test, and this approach reduces the incentive of students to study to the test. One might rationally combine a switch to predictive grading with reduced emphasis on the standardized test as a criterion for assessing each individual. Grades provide a metric of how students perform on a daily and weekly basis. The only reason we need standardized tests is that grades aren't standardized, but we can standardize (normalize) grades based on aggregate standardized test performance.

Where possible, it might be preferable to normalize based on some other measure, such as law school grades. Each law school might build a statistical model to predict, based on a student's undergraduate grades and school, what the student's expected law school GPA would be, given the past performance of students from that undergraduate program at that law school. In principle, law schools could coordinate by sharing their models, so that one could predict a student's law school performance at every participating law school based on the student's undergraduate record. This would have the side benefit of making grades across different law schools easier to compare. My guess is that individual LSAT or GRE scores would still have some predictive value in such a world, but much less relative value than they have today, when the undergraduate GPA measure is not normalized.
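
As a sketch of this kind of model, here is a per-school least-squares fit on hypothetical historical (undergraduate GPA, law school GPA) pairs; every name and number below is invented for illustration:

```python
import numpy as np

def fit_school_models(records):
    """records: (undergrad_school, undergrad_gpa, law_gpa) tuples for
    past students. Returns a least-squares line for each school."""
    by_school = {}
    for school, ugpa, lgpa in records:
        by_school.setdefault(school, []).append((ugpa, lgpa))
    models = {}
    for school, pairs in by_school.items():
        x = np.array([p[0] for p in pairs])
        y = np.array([p[1] for p in pairs])
        slope, intercept = np.polyfit(x, y, 1)  # degree-1 fit
        models[school] = (slope, intercept)
    return models

def predict_law_gpa(models, school, ugpa):
    slope, intercept = models[school]
    return slope * ugpa + intercept

# Hypothetical history: the same undergraduate GPA predicts different
# law school outcomes depending on where it was earned
history = [
    ("Inflated U", 3.9, 3.0), ("Inflated U", 3.7, 2.8), ("Inflated U", 3.5, 2.6),
    ("Strict Tech", 3.4, 3.5), ("Strict Tech", 3.0, 3.1), ("Strict Tech", 2.8, 2.9),
]
models = fit_school_models(history)
print(predict_law_gpa(models, "Inflated U", 3.9),
      predict_law_gpa(models, "Strict Tech", 3.4))
```

In this toy data a 3.9 from the grade-inflated school predicts a lower law school GPA than a 3.4 from the stricter one, which is exactly the comparison an unnormalized GPA hides.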

If such a system can be used to normalize high school and college grades, it could also be applied to law school grades to determine whether students may be admitted to the bar. There has recently been a push for diploma privilege. The justification is the COVID-19 pandemic, and there is a powerful argument that this is not a good time for standardized testing. But if one accepts the claim that the bar exam helps to protect clients from poorly qualified lawyers, diploma privilege is difficult to countenance. Why should lawyers' welfare be placed above clients'?

There are longstanding arguments that the bar exam is just a tool of the lawyer cartel for reducing entry into the profession. Diploma privilege can be seen as a step toward ending or reducing the significance of the bar exam. But those who take the cartel argument seriously should also be skeptical of requirements that students attend three years of law school. Will a client be safer hiring a lawyer with two years of law school from the top of the class or a lawyer with three years of law school from the bottom of the class? Unsurprisingly, law school deans advocating diploma privilege have not also been suggesting that we relax requirements for legal education. No one seems to be making the argument that because the third year of law school will be an inferior online product this year, we might as well let students skip it.

A compromise on diploma privilege would be to use a predictive grading approach. Bar examiners could create a simple, school-specific model forecasting bar exam performance as a function of law school grades. They might then decide that any student who is predicted more likely than not to pass the exam receives grade-based diploma privilege. Or they might choose other thresholds: if they are particularly concerned about false positives (students admitted to the bar who would not have passed the exam), they might demand a higher predicted probability of passing; if they are particularly concerned about false negatives (students whose grades predict failure but who would have passed the exam), they might demand a lower one. The core point is that states should use the information available to them, at least if it is infeasible to generate information in the form of a bar exam.
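
A toy version of the examiners' decision rule, with one invented school's history and a simple hand-rolled logistic fit standing in for whatever model bar examiners would actually use:

```python
import numpy as np

def fit_pass_model(gpas, passed, lr=0.5, steps=20000):
    """Fit P(pass) = sigmoid(a*(gpa - mean) + b) by gradient ascent
    on the log-likelihood (a stand-in for a real statistical fit)."""
    gpas = np.asarray(gpas, dtype=float)
    passed = np.asarray(passed, dtype=float)
    mu = gpas.mean()  # center GPAs for a stable fit
    a, b = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(a * (gpas - mu) + b)))
        a += lr * np.mean((passed - p) * (gpas - mu))
        b += lr * np.mean(passed - p)
    return a, b, mu

def pass_probability(gpa, model):
    a, b, mu = model
    return 1.0 / (1.0 + np.exp(-(a * (gpa - mu) + b)))

def grants_privilege(gpa, model, threshold=0.5):
    # Raise the threshold to guard against false positives,
    # lower it to guard against false negatives.
    return pass_probability(gpa, model) >= threshold

# Hypothetical history for one law school: GPA and whether each
# graduate passed the bar
gpas = [2.4, 2.6, 2.8, 3.0, 3.1, 3.3, 3.5, 3.7, 3.9]
passed = [0, 0, 0, 1, 0, 1, 1, 1, 1]
model = fit_pass_model(gpas, passed)
print(grants_privilege(3.8, model), grants_privilege(2.5, model))
```

The `threshold` parameter is where the false-positive/false-negative tradeoff described above lives: it is a policy dial, not a statistical one.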

One could also imagine such a system having some role in the future. Even if all students take the bar exam, one could imagine a system in which qualification for the bar depends on both the bar exam score and one's law school grades, normalized based on the bar exam scores of the entire population of students from the law school. This approach is responsive to those who believe that law school grades earned over three years are a better measure of ability than a short exam, while providing a means for comparing grades across schools and still giving students an incentive to take the exam seriously.


Editor's Note: We invite comments and request that they be civil and on-topic. We do not moderate or assume any responsibility for comments, which are owned by the readers who post them. Comments do not represent the views of Reason.com or Reason Foundation. We reserve the right to delete any comment for any reason at any time. Report abuses.

  1. If the LSAT is going to be set from grades, why not just skip the LSAT altogether and only use grades?

    I mean .... I mean, I'm just sitting on the Group W bench here ...

  2. While normalizing grades seems like a good idea, it seems a mistake to use a predictive algorithm to infer likely IB grades...unless they have some specific legal meaning.

    Wouldn't it be strictly better to simply report normalized grades in IB courses than to generate this test number? After all, the whole point of the test statistic is to measure a different, not fully correlated, ability: namely, what the student actually learned, not how much grunt work the student put into the class. (As you might guess, I always did great on things like AP courses but could never figure out why I should waste my time practicing, over and over, stuff I knew how to do perfectly well.)

    The fact that the test can't be given this year can't be helped, but at least one could avoid encouraging schools to treat the predicted IB score as if it were the same kind of measurement it is in other years.

    Then again maybe I'm assuming too much competence on the part of colleges and they would just ignore the normalization entirely if IB didn't do it.

  3. I don't like this idea one bit. If it weren't for my LSAT scores, I would never have gotten into my law school. (I spent my undergraduate years, uh, pursuing other interests than a high GPA. In law school, I did quite well, graduating in the top 10% of my class. Not high enough for law review, but high enough to score a decent job.)

    1. Engineer here. My grades were decent by the standards of my engineering school - I parked myself somewhere in the middle of the class, despite quite a few medical catastrophes - but dismal compared to the liberal arts students.

      My LSAT score was the best predictor of my law school grades.

      The real issue is that college grades are just such a mess - you compare, say, astrophysics students at schools without grade inflation with a criminal justice person from Slippery Rock - that you need the LSAT to provide some meaningful comparison. That leads to complaints about the LSAT, but the LSAT is the best predictor of law school grades (because everyone is taking basically the same classes and tests) and bar passage rates.

      The thing is, people complain that the SAT and the LSAT are biased against minorities or people who are bad at taking tests or whatnot, but is there any evidence that admissions committees do not take this into consideration? Some law schools asked for SAT scores, because they wanted to know if a student over-performs or under-performs in the classroom relative to their testing. Students have historically been allowed to submit statements explaining low grades, low grades for a particular semester, or other pertinent factors.

      1. "but dismal compared to the liberal arts students."

        Well, of course. The liberal arts students were taking liberal arts classes. Those are the classes your typical engineer takes for relaxation.

        1. And you can tell they are relaxing because they aren't very good at them.

    2. My proposal is not necessarily to stop considering individual LSAT scores. Indeed, they will have to carry some weight; otherwise, no one will take the test seriously, and it will be hard to normalize grades. But if we normalize grades, they become a better measure, so they should receive somewhat greater weight than they otherwise would. Of course, we may already be giving grades too much or too little weight.

      1. Also, if normalizing grades by major, that would take care of the problem of comparing astrophysics and criminal justice.

        1. Can you even do that? Over a five-year time period, how many astrophysics students take the LSAT?

          As every astrophysics student has a GPA, class rank within the department might be a far better measure.

        2. How much predictive power would undergrad grades have for those applying to law school years after completing undergrad? For me it was more than 5 years - I did very poorly in undergrad, did well on the LSAT, then did very well in law school.

  4. It is funny how this guy goes on and on about using data, but then just ASSUMES that online classes will be "inferior."

    Why that assumption? Because in-class has been the way "it always has been done" and therefore must be superior?

    Here is some anecdotal evidence. I got a BS at a well-respected school. And I got an MS at a well-respected school. The BS was in-person and the MS was online. But the classes for the MS were actually superior.

    It turns out that when lectures are recorded to be used again, the professors put more effort into them.

    Whether online education is superior or inferior is a matter of how it is implemented and a matter of DATA.

    You would think it might occur to Abramowicz to actually think about DATA rather than making ASSUMPTIONS in a post where he is suggesting that data might substitute for examinations...

    Just saying.

    1. In my experience with the one recorded class I took, it was sharper, clearer, and more focused. But law schools rely on the Socratic method and class dialogue, things that clearly do not translate well to an online environment (whether live or recorded). So even though recording may make for a better lecture, it is clear that it does not make a better law school class.
      Now you could ask whether the data show that traditional law school classes make better lawyers, but that is a separate issue.

    2. David, A fair point. I don't have data on that, so I probably should have said "possibly inferior."

  5. Sounds like there are too many dials and sliders available to adjust. I always distrust metrics that can be arbitrarily tweaked behind the scenes.

  6. "International Baccalaureate program, which credentials"

    An accurate summary of IB: it is a mere "credential" program for kids.

  7. Clients would be safer if the disciplinary system was able to intervene earlier and actually bothered to seriously discipline attorneys, including by removing incompetent/unethical ones from the profession.

  8. "Bar examiners could create a simple, school-specific model forecasting bar exam performance as a function of law-school grades."

    Isn't this what happens in Wisconsin? I think the students seeking diploma privilege have to take certain classes beyond the usual first-year mandatory classes and get at least a certain grade in them.

    But that works for Wisconsin, which has two law schools: the highly-ranked University of Wisconsin-Madison, and the respectable (ranked ~100) Marquette. I'm trying to imagine implementing this in California or Massachusetts, making a rule that applies to Stanford or Harvard as well as fourth-tier institutions. Your span in bar passage rates is just too big.

  9. The hypocrisy of the law school deans cannot be overstated. They want the bar canceled, but not the third year of law school, even though they could actually make the bar easier for all of their students by teaching bar review in that third year. The reason they don't is that their professors make boatloads of money on the side teaching bar review courses, and the bar review companies sponsor all sorts of law school events and seed the campuses with money.

    The entire thing is corrupt.

  10. We homeschool our children (sorry, Kirkland). We suspect that college admissions officers will discount or disregard entirely our kids’ GPAs. (To be clear, I fully understand this. If we were dishonest, we could put down almost any GPA we want. Although I think there is a problem of grade inflation at some public and private schools too.) So it’s just a given for our children that they need to do very well on the SAT or ACT in order to show the admissions officers that their high GPAs aren’t fake. Fortunately, so far they have been very good test takers, and the ones who have graduated each scored in the 1500s. (They also start taking community college classes in 11th grade, and this can show objective grades compared to their older classmates.)

    But if colleges do away with, or discount, standardized tests scores in the admissions process, that will certainly disadvantage homeschooled students. The UC system announced that it is not “requiring” SAT/ACT scores for the next few years. I assume, but do not know, that they will still accept and consider SAT scores, so if our son is able to take the rescheduled SAT in August (a big IF at this point), he should be okay. But how long will it be before SAT scores are no longer even considered? I’m not confident GPAs are better indicators of college performance than standardized test scores.

  11. One of the major reasons for the rise of standardized testing (LSATs, IBs, SATs, etc.) was that grades in school:

    1. Varied widely between schools, and
    2. Didn't necessarily indicate who was best suited to the advanced course of study.

    To turn around and issue a "prospective" test score based on grades turns the entire idea on its head.

    1. This is exactly right.

      It's not just that grades are variable; it's that there's no reason to think that grades correlate very well with LSAT.

      Incidentally, schools do apply "splitter" strategies, but that ALWAYS involves going after high LSAT, low GPA students. There is a massive pool of applicants with high GPAs, but high LSAT scores are relatively rare (at least where "high" is relative to median 50s).

  12. Very good to excellent college grades — excellent LSAT — average UCLA law school grades — excellent review practice test scores — cruised through the CA Bar — finished early and drove to the Sequoias that night.
