The law school world was scandalized in February when Villanova University School of Law announced that its former dean and admissions officials had for years inflated the Law School Admission Test scores and grade-point averages of the school’s incoming classes.
On Sept. 11, officials at the University of Illinois announced that they were investigating the veracity of the same statistics reported by its College of Law after getting a tip that the numbers released for its new class were wrong.
It remains to be seen whether Illinois did, in fact, report bogus numbers this year or in the past, and whether any misreporting was intentional. But the fact that a second law school had fallen under suspicion within a year raised questions. How widespread is the inflation of academic credentials? What is being done to ensure law schools are honest?
“It really makes you wonder,” said Sarah Zearfoss, senior assistant dean for admissions, financial aid and career planning at the University of Michigan Law School. “There have been schools that my colleagues and I thought were cheating, because we knew enough about their applicant pools that their numbers didn’t seem credible. Maybe they really weren’t credible.”
Plenty of attention has been paid during the past two years to what critics see as the manipulation of graduate employment and salary data by law schools. The American Bar Association (ABA) has adopted reforms intended to clamp down on the misrepresentation of jobs data and to increase accountability.
The accuracy of the grades and LSAT scores that law schools report each year — which U.S. News & World Report weights heavily in its annual law school rankings, and which are taken seriously by prospective students and employers — had not been a major focus, however.
“We’re concerned with all consumer information, whether it’s on the front end or the back end,” said Kyle McEntee, executive director of Law School Transparency, a nonprofit organization formed in 2010 by two then-Vanderbilt law students that advocates for more accurate jobs data.
“We have been more focused on increasing the amount of employment data. That’s probably the biggest priority for law students. And it’s hard to know how widespread it [the misreporting of LSATs and grade-point averages] is. It could be a huge problem, but that seems unlikely.”
Just how many law schools are goosing their GPA and LSAT numbers is an open question. Law schools self-report those statistics to the ABA, which does not perform any regular auditing, said Hulett “Bucky” Askew, the ABA’s consultant for legal education. An ABA representative does examine each law school’s records as part of its accreditation site visit every seven years.
“We don’t audit the data that a school produces, but we do continuously look at the data to see if there are anomalies,” Askew said. “We’re further developing our process for identifying anomalies.”
The ABA’s internal review process wasn’t triggered for either Villanova or Illinois, both of which self-reported problems. (Illinois has yet to report its 2011 figures to the ABA. However, the university said it has “credible information” that the statistics for its incoming class published on the law school’s Web site and in other publications were inaccurate. The university was examining admissions data from 2011 and previous years for possible problems.)
McEntee suggested that an obvious way to eliminate any suspicion about incoming student data or temptation by admissions officers to lie would be to have the Law School Admission Council calculate the statistics. The council administers the LSAT and operates the credential assembly service, a centralized computer application system used by nearly all U.S. law schools. It holds records for every enrolled law student, including his or her LSAT score and undergraduate GPA.
Taking responsibility for calculating and reporting LSAT and GPA data out of the hands of individual law schools would ensure that all institutions are playing by the same rules, McEntee said.
“There’s no reason not to,” he said. “It’s all in their database, and there are no student privacy concerns because it’s all calculated in the aggregate. If I had access to their database, I could probably generate these reports in a couple of hours. The [ABA's Section of Legal Education and Admissions to the Bar] has to realize that the outsourcing of some functions is not a bad thing.”
McEntee was not the first to propose that the council assume this task. Several law school administrators asked the council to intervene in 2006, when several schools were suspected of reporting only the highest LSAT scores of their incoming students despite an ABA rule that they report the average score when applicants took the test more than once. The council declined those requests, and in 2006 the ABA changed the rule to allow schools to report only applicants’ highest LSAT scores.
“That’s just not something we have done historically, and I don’t see why we would,” said council President Dan Bernstine. “We’re not in the reporting business. We don’t distinguish between our [law school] members.”
Askew said there has been no discussion between the ABA and the council about vetting LSAT scores and GPAs.
The single biggest reason for law schools artificially inflating the data is U.S. News’ influential annual law school rankings. A law school’s selectivity accounts for 25% of its score, and median LSATs and GPAs are the two biggest factors in determining selectivity.
Even small differences in test scores and GPAs can have a large effect on a school’s ranking. The investigation at Villanova revealed that in 2009 it reported a median LSAT score of 162 — three points higher than the true median. When the law school accurately reported its 2010 median score of 160, it dropped from the No. 67 spot to No. 84.
“Villanova’s drop was primarily due to reporting its LSAT and GPA scores accurately for its fall 2010 class versus the inflated version it had done earlier,” said Robert Morse, director of data research at U.S. News.
A 2009 academic study by two sociologists found that law school administrators feel extreme pressure to keep their ranking up, and that some schools have employed ethically questionable tactics to that end. Schools have categorized students as part-time or probationary so their LSAT scores would not count, although U.S. News recently started including part-time students in its analysis, according to the paper, “Fear of Falling: The Effect of U.S. News & World Report Rankings on U.S. Law Schools,” by Northwestern University associate professor Wendy Espeland and University of Iowa associate professor Michael Sauder.
Espeland and Sauder found that some schools cut first-year class sizes and then aggressively recruited transfer students, or hired graduates on a temporary basis so they would be counted as employed in the U.S. News survey. “I’ve had people tell me that they felt pressured by their deans to do things they didn’t think were ethical,” Espeland said. Regarding the suggestion that at least one law school had falsified data, she said, “I guess I’m not surprised, but it’s disappointing.”
Admissions officers faced extra pressure during the 2011 admissions cycle, because applicants to ABA-approved law schools declined by 10% — the largest drop in a decade. That left admissions officers at most schools with a smaller pool of applicants, and maintaining median LSAT and GPA figures was more difficult. “The pressure [of the U.S. News rankings] has been enormous, and it’s so bad for decision-making,” said Michigan’s Zearfoss. “There could be more funny stuff going on this year than in previous years.”
Conversely, the decline in applicants created an incentive for schools not to artificially inflate LSAT and GPA scores, since they did not want to discourage potential applicants who might falsely believe their grades and scores were too low to gain admission, Zearfoss said.
Villanova’s deceptive practices began long before the law school applicant pool began shrinking, according to documents released by the ABA. An investigation concluded that “inaccurate admissions data had been reported to the ABA since at least 2002.” The school discovered the inaccuracies after a faculty group launched a study of the correlation between LSAT scores and bar-passage rates. In addition to reporting inflated LSAT scores and GPAs, the investigation revealed, Villanova had underreported the number of admission offers it extended between 2007 and 2009, presumably to lower its reported acceptance rate. (Acceptance rates account for a small percentage of U.S. News’ rankings.)
Villanova responded to the scandal by firing or accepting the resignations of three admissions officials. Dean Mark Sargent had already stepped down.
The ABA could have revoked Villanova’s accreditation or placed it on probation, but instead opted in August for a public censure because the school self-reported the inaccuracies. The school must post the letter of censure on its Web site for two years and hire an outside auditor to monitor its reporting of data for two years.
The decision garnered some criticism from those who felt it was not harsh enough, with the legal blog Above the Law calling it a “light slap on the wrist.”
The Illinois investigation “will probably reignite the discussion about whether the sanctions against Villanova should have been stronger,” McEntee said. “I don’t really believe that a censure is going to have much effect, as far as sending a message.”