‘Humiliated’ lawyer apologizes over ChatGPT court flub

A New York attorney who used ChatGPT to prepare a legal brief that cited bogus cases issued a contrite apology in court on Thursday, emotionally explaining that he had been "duped" by the artificial intelligence chatbot.

Steven Schwartz of Tribeca law firm Levidow, Levidow & Oberman told a federal judge in Manhattan that he was "humiliated" by the blunder, saying he believed he was using a "search engine" and had no idea the AI app could produce fake case law.

"I would like to offer my sincerest apologies to Your Honor, the court, the defendants, and my firm," Schwartz said at a packed court hearing.

"I deeply regret my actions that led to today's hearing," Mr. Schwartz said, his voice trembling. "I have suffered both professionally and personally [because of] the widespread publicity this issue has generated. I am ashamed, humiliated and deeply regretful."

Schwartz had used ChatGPT to search for case law to bolster his client's suit, but the bot fabricated cases outright without the lawyer's knowledge.

He ultimately filed a legal brief citing the fictitious cases, and Judge Kevin Castel hauled him into court for a hearing on whether he and his firm should be sanctioned over the debacle.

Steven Schwartz came under intense scrutiny from the judge for using ChatGPT to prepare a legal brief that cited bogus case law.
Stephen Hirsch

As Mr. Schwartz apologized, he was forced to stop and compose himself.

"I've never been involved in anything like this in my 30 years of practice," Mr. Schwartz said.

"I can assure the court that something like this will never happen again," he told the judge.

Steven Schwartz outside the courthouse.
Schwartz apologized profusely to the judge for the incident.
Stephen Hirsch

Schwartz had submitted the brief in a 2022 lawsuit filed by his firm on behalf of Roberto Mata, who sued Colombian airline Avianca, claiming a metal catering cart injured his knee on a flight to New York.

The fictitious cases cited by Schwartz included Miller v. United Airlines, Petersen v. Iran Air, and Varghese v. China Southern Airlines.

Castel spent about two hours on Thursday grilling Schwartz about how the blunder happened.

Steven Schwartz
Mr. Schwartz said he did not believe the chatbot would produce false legal precedents.
Stephen Hirsch

Mr. Schwartz reiterated that he never imagined ChatGPT would fabricate case law out of whole cloth.

“I could never have imagined that ChatGPT would fabricate a case,” Schwartz said. “I thought I was using a search engine that used sources I didn’t have access to.”

Schwartz said ChatGPT never indicated that the cases were not real, adding: "I continued to be duped by ChatGPT."

ChatGPT
Schwartz said he thought of the AI bot as something like a search engine.

"I never thought it could be made up," he explained. "I just assumed I didn't have access to the full cases."

Schwartz said he never would have filed the brief had he thought the case law was fake, but admitted he should have done his due diligence first.

“In retrospect, I should have done that, but I didn’t,” he said.

Steven Schwartz
The lawyer said he was "humiliated" by the legal debacle.
Steven Schwartz / LinkedIn

Schwartz also admitted he was unable to find the full cases on the internet, but said he thought they might be pending appeals or unpublished decisions.

At one point, Castel read an excerpt from one of the fake cases and pressed Schwartz: "Can we agree that it's legal gibberish?"

“Looking at it now, yes,” Schwartz admitted.

ChatGPT: Optimizing Language Models for Dialogue
Schwartz vowed never to use ChatGPT again.

Schwartz vowed never to use the app again and said he has taken AI training courses to deepen his understanding of the technology.

Another lawyer for the firm, Thomas Corbino, told the judge that the firm had never before been sanctioned and that Mr. Schwartz had always been a stand-up lawyer.

Another attorney, Ronald Minkoff, told the judge that Schwartz was notoriously bad with technology.

“There was no willful misconduct here,” Minkoff said. “This was the result of ignorance and carelessness. It was not intentional and certainly not malicious.”

The judge said he would decide at a later date whether to impose sanctions.
