Prosecutors' office in California faces backlash over use of AI in court filing with "hallucinations"
A recent case has highlighted the risks of relying on artificial intelligence (AI) for legal work, particularly in motions filed in criminal cases. According to the Nevada County District Attorney's Office, a prosecutor used AI to prepare a motion that contained "hallucinations," a term for plausible-sounding but false material generated by AI.
The incident occurred in a case involving Kyle Kjoller, who was represented by a public defender along with lawyers from the non-profit Civil Rights Corps. The prosecutors' office allegedly used AI to generate legal citations for the case, and those citations later turned out to be inaccurate.
Kjoller's lawyers claimed that similar errors appeared in another filing from the prosecutors' office. Even so, the appeals court denied their request for sanctions, which could have carried serious consequences for the prosecutors.
Kjoller's lawyers then took further action, filing a petition with the California Supreme Court that cited three cases they believed contained errors typical of generative AI. The court has yet to decide whether it will take up the case.
Critics argue that the use of AI in legal work can undermine due process rights and erode trust in the courts. "Prosecutors' reliance on inaccurate legal authority can violate ethical rules, and represents an existential threat to the due process rights of criminal defendants," said Kjoller's attorneys in their filing.
In response, the Nevada County District Attorney's Office admitted to using AI in one of its filings but claimed that other mistakes were caused by human error. The office has since implemented new training programs for staff and established an AI policy to ensure reliable citations.
The use of AI in court filings is not a new phenomenon; similar incidents have been reported in Canada, Australia, the UK, and Israel. The California case, however, appears to be the first reported instance in the United States of a prosecutors' office using generative AI in a court filing.