Painful lessons on the limitations of AI software in law | Fieldfisher

Painful lessons on the limitations of AI software in law

Conor Folan



There has been great publicity and speculation around the use of AI software such as ChatGPT in recent months. There is significant anecdotal evidence of its widespread use, with everyone from college students completing their assignments to businesspeople managing their stock investments availing of its benefits. But is there scope for effective use of ChatGPT by lawyers?
By way of background, ChatGPT operates by generating realistic responses to queries asked of it by making guesses about which fragments of text should follow other sequences, based on a statistical model that has ingested billions of examples of text pulled from all over the internet. Some enterprising individuals believed that it could be used in the legal profession to assist with a wide range of tasks, such as legal research, document drafting, and case analysis. The hope was that this could help to improve the efficiency and accuracy of legal work, and could potentially allow lawyers to handle more cases and provide better service to their clients. So if a legal question was put to it, could it be relied upon?

The recent case of Roberto Mata v Avianca Inc highlighted the pitfalls starkly. In that New York case, the plaintiff's solicitors submitted a brief citing over half a dozen prior court decisions in support of the arguments made. The problem was that no one, not even the judge, could find any of those prior decisions when he sought them out. This was because they did not in fact exist: all of the caselaw had been created by ChatGPT. When this came to light, the plaintiff's solicitors admitted to the judge that they had used the artificial intelligence programme to do their legal research, but stated that they had also asked ChatGPT about the legitimacy of the cases it cited and it had confirmed they were all legitimate. Therefore they "had no reason to doubt its sincerity".

As a result of the case, Judge Castel held, in an order, that he had been presented with "an unprecedented circumstance": a legal submission replete with "bogus judicial decisions, with bogus quotes and bogus internal citations". He duly ordered a subsequent hearing to discuss potential sanctions against the plaintiff's lawyers.

While it is important to embrace innovative technologies, legal precedents cannot simply be invented; they must be grounded in actual caselaw and substantive decisions. This requirement clashes with how ChatGPT functions. So while AI may bring extra speed and efficiency to many industries, care must be taken to ensure that there are no gaps or deficiencies that may bring about unintended consequences, such as those highlighted above. Lawyers in particular should know to double-check advice before despatching it to a client, if painful lessons are to be avoided.

Written by: Alanna O'Byrne and Conor Folan