Thinking it was a standard search engine, an attorney used ChatGPT, an advanced AI natural language generation model, to conduct legal research. Counsel must have reasoned: if ChatGPT demonstrated an understanding of the law sufficient to pass a bar exam, why not use it to conduct legal research? After using ChatGPT, however, counsel did not follow up with independent research to confirm that the ChatGPT-identified cases stood for the propositions for which they were cited, let alone that they actually existed. Who would have thought! Unfortunately, the cases ChatGPT identified were, in fact, bogus judicial decisions with bogus quotes and bogus internal citations.

In response to a court filing that relied on those bogus decisions, the court in Mata v. Avianca, Inc. (2023 U.S. Dist. LEXIS 94323, 2023 WL 3696209 (S.D.N.Y. May 4, 2023), and Opinion and Order on Sanctions, 22-cv-1461 (PKC) (S.D.N.Y. June 22, 2023)) stated:

[t]he Court is presented with an unprecedented circumstance. A submission filed by plaintiff’s counsel in opposition to a motion to dismiss is replete with citations to non-existent cases.
