US lawyer sorry after ChatGPT creates 'bogus' cases

Since the chatbot ChatGPT burst onto the public stage late last year, generative AI content has mushroomed. Photo: Richard A. Brooks / AFP/File

What happened when a US lawyer used ChatGPT to prepare a court filing? The artificial intelligence program invented fake cases and rulings, leaving the attorney rather red-faced.

New York-based lawyer Steven Schwartz apologized to a judge this week for submitting a brief full of falsehoods generated by the OpenAI chatbot.

"I simply had no idea that ChatGPT was capable of fabricating entire case citations or judicial opinions, especially in a manner that appeared authentic," Schwartz wrote in a court filing.

The blunder occurred in a civil case being heard in Manhattan federal court involving a man who is suing the Colombian airline Avianca.

Roberto Mata claims he was injured when a metal serving plate hit his leg during a flight in August 2019 from El Salvador to New York.

After the airline's lawyers asked the court to dismiss the case, Schwartz filed a response that purported to cite more than half a dozen decisions supporting his argument that the litigation should proceed.


They included Petersen v. Iran Air, Varghese v. China Southern Airlines and Shaboon v. Egyptair. The Varghese case even included dated internal citations and quotes.

There was one major problem, however: neither Avianca's attorneys nor the presiding judge, P. Kevin Castel, could find the cases.

Schwartz was forced to admit that ChatGPT had made up everything.

"The court is presented with an unprecedented circumstance," judge Castel wrote last month.

"Six of the submitted cases appear to be bogus judicial decisions with bogus quotes and bogus internal citations," he added.

The judge ordered Schwartz and his law partner to appear before him to face possible sanctions.

'Ridiculed'

In a filing on Tuesday, ahead of the hearing, Schwartz said that he wanted to "deeply apologize" to the court for his "deeply regrettable mistake."


He said his college-educated children had introduced him to ChatGPT and it was the first time he had ever used it in his professional work.

"At the time that I performed the legal research in this case, I believed that ChatGPT was a reliable search engine. I now know that was incorrect," he wrote.

Schwartz added that it "was never my intention to mislead the court."

ChatGPT has become a global sensation since its launch late last year, thanks to its ability to produce human-like content, including essays, poems and conversations, from simple prompts.

It has sparked a mushrooming of generative AI content, leaving lawmakers scrambling to figure out how to regulate such bots.

A spokesperson for OpenAI did not immediately respond to a request for comment on Schwartz's snafu.

US developer OpenAI created the artificial intelligence program ChatGPT. Photo: JACK GUEZ / AFP/File

The story was first reported by The New York Times.

Schwartz said he and his firm, Levidow, Levidow & Oberman, had been "publicly ridiculed" in the media coverage.


"This has been deeply embarrassing on both a personal and professional level as these articles will be available for years to come," he wrote.

Schwartz added: "This matter has been an eye-opening experience for me and I can assure the court that I will never commit an error like this again."

Source: AFP
