When you fill out a job application, it may not be anyone from human resources who reads your career and education history. Instead, software could be scanning applications for the best candidates. But alongside artificial intelligence's reported efficiency are growing reports and criticism that such algorithms can be trained on biased data, fostering discrimination in housing, loan approval and employment.

Notably, federal courts have split on whether disparate impact — a discriminatory policy or procedure that appears neutral but adversely affects members of a protected group — applies to job applicants. Without disparate impact coverage, however, AI-backed hiring decisions could run amok, management-side lawyers warn.
