Man vs. Machine: Maybe Computers Aren't the Best at Predicting Recidivism
A Dartmouth College study found that individuals without expertise in criminal justice may be as accurate as trusted court software in predicting recidivism.
January 18, 2018 at 09:43 AM
In technology, conventional wisdom holds that machine learning can typically make better predictions than humans, stripping out bias and improving accuracy dramatically. A new study out of Dartmouth College, however, challenges that assumption, particularly when it comes to the fate of those in the criminal justice system.
According to research from Dartmouth College, the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) risk assessment tool is no more accurate at predicting recidivism than individuals with “little or no criminal justice expertise.” COMPAS, widely used among U.S. courts to determine recidivism risk, has, according to the research, been used in assessing more than one million offenders since 1998.
Carried out by a student-faculty research team, the Dartmouth study gave a group of nonexperts—workers contracted through Amazon's Mechanical Turk online marketplace—short descriptions of pretrial defendants drawn from a Broward County, Florida, database covering 2013-2014. Each description listed seven features of a defendant, including age, sex, the crime charged, whether that crime was a misdemeanor or felony, and prior criminal history. Based on this information, participants were asked in a survey to predict whether each defendant would recidivate.
The research was conducted among a total of 800 participants, divided into two groups of 400. One group was allowed to see the pretrial defendant's race, while the other wasn't.
While COMPAS takes into account 137 features in determining recidivism, the tool's results (65.2 percent accuracy) in this instance were “statistically the same” as those of the group (67 percent), a statement from Dartmouth said.
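To see why 65.2 percent and 67 percent can be “statistically the same,” consider a standard two-proportion z-test. This is a minimal sketch, not the study's actual analysis; the sample size of 1,000 defendants per condition is an assumption chosen for illustration, not a figure from the article.

```python
# Hedged illustration: a two-proportion z-test showing that accuracies of
# 65.2% and 67% need not differ significantly at these sample sizes.
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled success rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Assumed n = 1,000 predictions per condition (illustrative only).
z = two_proportion_z(0.67, 1000, 0.652, 1000)
print(f"z = {z:.2f}")  # well inside +/-1.96, so not significant at the 5% level
```

With a |z| well below the 1.96 threshold for 5 percent significance, the human and COMPAS accuracies are indistinguishable under this test.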
“As machine learning and artificial intelligence tools emerged in criminal justice, they kind of bypassed this middle step in ensuring they're as accurate as we think they are,” Julia Dressel, who conducted the research for her undergraduate thesis in computer science at Dartmouth, told LTN. “People are quick to assume they're accurate and objective and think of course these things should be used. … We have to step back and realize that might not always be the case.”
“Right out of the gate, you know something is concerning when the accuracy is 65 percent,” Hany Farid, professor of computer science at Dartmouth College and co-leader of the study, told LTN. “People answering an online survey as accurately as the software: That should give us more pause.”
He added that many judges might look positively on using analytics tools because of their perceived accuracy, but “I think you would weigh that prediction very differently if I told you, 'Hey, I polled 12 people online, and this is what they said.'”
COMPAS's proprietary algorithm is unknown outside of its developer, Northpointe Inc. In 2017, The New York Times reported that a Northpointe executive said, “We've created [the algorithms], and we don't release them, because it's certainly a core piece of our business.”
The Dartmouth researchers took the seven pieces of information given to the study's human participants and fed them into “the simplest possible machine algorithm, the kind of thing you would teach in an undergraduate course,” logistic regression, and “it got 65 percent [accuracy], right out of the gate.”
Taking it a step further, the researchers gave the algorithm two pieces of information—age and prior convictions—and it achieved 65 percent accuracy, the same as COMPAS.
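A two-feature logistic regression of the kind the researchers describe can be sketched in a few dozen lines. This is an illustrative toy, not the study's code: the data generator below is entirely synthetic (its assumed relationship between age, priors, and recidivism exists only for demonstration), and the model is trained by plain stochastic gradient descent.

```python
# Minimal sketch of a two-feature logistic regression (age, prior convictions)
# trained on synthetic data. Assumptions: the data-generating rule below is
# invented for illustration and implies nothing about real defendants.
import math
import random

random.seed(0)

def make_defendant():
    """Hypothetical generator: more priors -> higher synthetic recidivism odds."""
    age = random.randint(18, 65)
    priors = random.randint(0, 10)
    p = 1 / (1 + math.exp(-(0.4 * priors - 0.08 * (age - 40))))
    return (age, priors), 1 if random.random() < p else 0

data = [make_defendant() for _ in range(2000)]

def features(age, priors):
    return [(age - 40) / 10, priors / 5]  # crude centering/scaling

# Model: p(recidivate) = sigmoid(w . x + b), fit by minimizing log-loss via SGD.
w, b, lr = [0.0, 0.0], 0.0, 0.01
for _ in range(300):
    for (age, priors), y in data:
        x = features(age, priors)
        p = 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
        g = p - y                      # gradient of log-loss w.r.t. the logit
        w[0] -= lr * g * x[0]
        w[1] -= lr * g * x[1]
        b -= lr * g

correct = 0
for (age, priors), y in data:
    x = features(age, priors)
    p = 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
    correct += (p >= 0.5) == (y == 1)
print(f"training accuracy: {correct / len(data):.2f}")
```

The point of the exercise mirrors the study's: a model this simple, with only two inputs, already recovers most of the predictive signal available in the data it is given.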
COMPAS has previously been challenged in the courts. In one case, a Wisconsin man was sentenced to six years in prison by a judge who cited a COMPAS assessment score. The man appealed, and the case made it to the Wisconsin Supreme Court, which ruled against him. In 2017, the U.S. Supreme Court declined to hear the case.
COMPAS is also no stranger to criticism. A 2016 analysis by ProPublica found “that black defendants were far more likely than white defendants to be incorrectly judged to be at a higher risk of recidivism.” Whites, conversely, were more likely to be “incorrectly flagged as low risk.”
The racial skew in recidivism scores stems partly from limitations of the algorithms themselves. Examining the ProPublica dataset—the same one used by Dartmouth—The Washington Post found that, while COMPAS doesn't account for race directly in its algorithm, many of the attributes it considers in predicting repeat offenders vary by race, such as prior arrests, which black defendants are more likely to have.
Citing a different review, the Dartmouth study also noted that accuracy wasn't just an issue for COMPAS and that “eight out of nine [algorithmic] approaches failed to make accurate predictions.”