You might think that lawyers (an ostensibly smart bunch of people) would have figured out by now that relying on ChatGPT for everything related to their work might be a bad idea. Yet once again, a judge has had to scold attorneys for doing just that.
Lawyers at the New York-based Cuddy Law Firm tried to use OpenAI's chatbot to justify the hefty fees for a recently won case, fees the losing side was expected to pay.
But US District Judge Paul Engelmayer in New York City rejected the amount submitted, awarding less than half of what Cuddy had requested, and harshly reprimanded the lawyers for using ChatGPT to back up the numbers. The firm's filing repeatedly cited ChatGPT output in support of its claimed hourly rates, which vary depending on the level and amount of research, preparation, and other work performed.
Cuddy told the court that "the requested hourly rate is supported by feedback received from the artificial intelligence tool ChatGPT-4," Engelmayer wrote in his order [PDF], referring to the GPT-4 version of OpenAI's bot.
“Suffice it to say that Cuddy Law Firm's use of ChatGPT to support its aggressive fee bids is completely and extraordinarily unconvincing.”
Judge Engelmayer had already objected to the bill on other grounds, including the use of "questionable resources" to justify the final amount of $113,484.62. The use of ChatGPT was the last straw.
The firm should have known from the start, Judge Engelmayer said, that it was wrong to treat ChatGPT's conclusions as a useful gauge of reasonable charges for the work of lawyers with specific backgrounds performing bespoke assignments for clients in niche practice areas.
You don't need to look far to find examples of the legal profession being led astray by modern generative AI, and Judge Engelmayer didn't: he cited two cases from the US Court of Appeals for the Second Circuit, which includes New York City, to make his point.
"In two recent cases, Second Circuit courts have faulted lawyers for relying on ChatGPT, which proved unable to distinguish between real and fictitious case citations," the judge wrote, referring to Mata v. Avianca and Park v. Kim. Both cases involved ChatGPT being used to generate fake judicial opinions and fake authorities.
And the problem is not limited to these examples. Even former Trump fixer Michael Cohen was caught relying on fake precedents generated by AI.
In short, there's a long way between ChatGPT being good for a laugh and being a good replacement for a paralegal.
Benjamin Kopp, an attorney with the Cuddy Law Firm, told The Register that his firm's use of AI was nothing like the cases involving fabricated court filings, and that this particular situation had nothing to do with influencing legal proceedings.
"The 'cross-checking' mentioned means that, in addition to supporting the rate based on the evidence provided, that rate is something that parents … would find usable," Kopp said.
The judge's problem with Cuddy's fee schedule was not limited to its use of ChatGPT. He said in his order that the firm did not disclose the inputs it used to obtain ChatGPT's rate feedback, or whether any of the data involved was synthetic.
"The court therefore rejects out of hand ChatGPT's conclusions regarding the appropriate billing rates here," Judge Engelmayer said. "Unless there is a paradigm shift in the reliability of this tool, the Cuddy Law Firm is well advised to remove references to ChatGPT from future fee applications."
In the end, Cuddy was awarded just $53,050.13. As for the future use of ChatGPT as a legal tool, Kopp noted that the court did not prohibit its use. ®