ChatGPT's False Legal Citations That Trapped a Lawyer

 


In the US, a New York lawyer found himself in court after his firm used the artificial intelligence program ChatGPT for legal research.


In court, the judge said he was faced with an unusual situation because the filing submitted by the lawyer's firm cited legal precedents that did not actually exist.


The lawyer told the court that he had not been aware that the material in the filing could be false. ChatGPT is a program that generates content within moments of receiving a request; however, it carries a warning that it may also provide false information.


The case in which ChatGPT was used was a damages suit brought by an individual against an airline. The lawyer's team cited past decisions as precedents to argue why the case should be allowed to proceed.


However, when the airline's lawyers reviewed the filing, they informed the judge that they could not find the cases it cited.


In his order seeking an explanation, Judge Castel wrote that six of the cited cases appeared to be bogus decisions with bogus references.


The mystery was later resolved: the research had not been done by the lawyer named on the filing, Peter LoDuca, but by a colleague at his firm, Steven Schwartz, a lawyer with 30 years of experience, who had used ChatGPT to carry it out.


In his written statement, Schwartz explained that LoDuca had not been involved in the research and had been completely unaware that ChatGPT was used.


Schwartz expressed regret for his mistake, writing that he had never used the program for legal research before and had not been aware that it could also produce false content.


He has promised not to use artificial intelligence for legal research in the future without verifying the validity of its content.


The court was also shown screenshots of a conversation between Schwartz and ChatGPT in which he asked whether Varghese was a genuine case.


ChatGPT answered yes, and the lawyer then asked what its source was.


ChatGPT, after "double-checking," replied that it was a real case cited in legal databases such as LexisNexis and Westlaw.


The two lawyers, who work for the law firm Levidow, Levidow & Oberman, have been asked to explain at a hearing on June 8 why they should not face sanctions.


It should be noted that ChatGPT, which was launched in November 2022, is used by millions of people.


The program answers questions in language that appears to have been written by a human. However, concerns have been raised about it, including the possibility that it could spread misinformation and bias.








