Recently, the Los Angeles Police Department announced it would stop using algorithm-based programs to identify who is most likely to commit violent crimes, according to The Los Angeles Times. An audit by the department’s inspector general found, among other things, that the department used inconsistent criteria to label people “chronic offenders.”

The LAPD is one of many U.S. police departments and courts leveraging artificial intelligence-backed software to assist in policing, bail and sentencing decisions. While such tools are grabbing headlines for disproportionately targeting Black and Latino people, observers say the root of these tools’ problems is biased data.
