ChatGPT has become a very popular artificial intelligence tool that the public at large can use for many different tasks, from generating written content and images to helping draft responses to social media posts.
Many people love the tool for the convenience and flexibility it provides. But one lawyer from New York is facing possible sanctions because he used the AI tool to draft a brief that was riddled with errors.
Reuters reported recently that Steven Schwartz, of the law firm Levidow, Levidow & Oberman, will appear at a sanctions hearing on June 8 before a U.S. district judge after admitting he used ChatGPT to help draft a brief. The brief was filed in a personal injury lawsuit in which he was representing a client against Avianca Airlines.
While the brief contained many errors, the most egregious was that it cited six court decisions that do not exist.
ChatGPT has been lauded for its ability to produce human-like responses drawn from a vast trove of data, but clearly there are serious limitations to how it can – and should – be used. Filing legal briefs on behalf of clients in cases that matter deeply to them is probably one instance where it shouldn’t be.
In a court filing in advance of his sanctions hearing, Schwartz wrote that he “greatly regrets” relying on ChatGPT to file his brief. He also said he was “unaware of the possibility that its contents could be false.”
Why he didn’t double-check the brief – and the cases it cited – will likely come up during the sanctions hearing.
Lawyers on Avianca’s legal team discovered Schwartz’s use of ChatGPT and alerted the court that his brief cited cases that didn’t exist.
The Model Rules of Professional Conduct produced by the American Bar Association don’t mention artificial intelligence at all. However, experts say that several ethics rules already on the books would apply to this case.
One of those experts, Daniel Martin Katz, a Chicago-Kent College of Law professor who studies artificial intelligence as it applies to law, told Reuters:
“You are ultimately responsible for the representations you make. It’s your bar card.”
Lawyers are required to provide competent representation to all of their clients, which includes staying knowledgeable about relevant technology. Whether the technology they rely on provides accurate information is sure to be a focal point of the sanctions hearing against Schwartz.
This is going to be a major concern not only in this case but in potential future ones as well. Some AI tools, including ChatGPT, have been found to fabricate information out of thin air, a phenomenon often called hallucination.
That’s why many legal experts warn that lawyers shouldn’t rely too heavily on these tools: doing so could introduce mistakes into their cases and irreparably harm their clients.