Colorado lawyer disciplined for using ChatGPT in brief

A Colorado lawyer has received a suspension for using artificial intelligence to generate fake case citations in a legal brief and then lying about it.

The presiding disciplinary judge’s sanction against Zachariah C. Crabill appears to be the first in the state to involve the improper use of AI. The head of Colorado’s attorney regulation office, Jessica E. Yates, told Colorado Politics she was unaware of any similar cases in the state.

The state Supreme Court has not yet weighed in on the legal and professional dilemmas of AI, but earlier this year, Justice Melissa Hart acknowledged during a panel discussion that the court has an obligation to educate itself about emerging technology.

“We need to be thinking about how we change our rules to accommodate that and think about what lawyers need to be doing,” she said. “But also we will have cases that will present these questions and we are going to have briefs written to us that are written by artificial intelligence. … It will be effectively operating as an associate to the lawyer.”

According to the stipulated narrative that Crabill and attorney regulators filed with the office of the presiding disciplinary judge, Crabill’s firm took on a client in April who had previously represented themselves. An El Paso County judge denied the client’s motion for summary judgment in a civil case, and Crabill’s job was to draft a motion to set aside the judgment.

Crabill “had never drafted a MSA. He went through past motions from the firm and sought to find templates for making an argument to set aside the judgment,” the filing stated.

After filling in the template with case-specific details, Crabill wanted to bolster his legal citations. He used the AI program ChatGPT to search for cases that appeared to support his client’s position. Believing he was using his client’s money efficiently and easing his own stress as the deadline approached, he added the AI-generated citations to his brief without verifying their accuracy.

The morning of a hearing before El Paso County District Court Judge Eric Bentley, Crabill realized his brief contained false citations.

“I think all of my case cites from ChatGPT are garbage,” he texted his paralegal. “I have no idea what to do. I am trying to find actual case law in our favor now to present to the judge.”

The text message conversation between attorney Zachariah C. Crabill and his paralegal.

At the May 5 hearing, it was Bentley who first raised the non-existent cases. Crabill apologized for the errors and blamed them on “a legal intern in this case, who, I believe, got some mistake.”

Crabill later attributed his untruthful response to becoming “panicked.” He said it “never dawned” on him that AI technology could be deceptive. Crabill later filed a corrected motion, which Bentley denied on separate grounds from the “fictitious case citations.”

Crabill and the Office of Attorney Regulation Counsel agreed he violated his professional duties to act competently, diligently and honestly. Because he had no prior discipline, took responsibility and was experiencing personal struggles at the time of his mistake, the parties agreed on a two-year suspension, only 90 days of which Crabill would serve as long as he otherwise completed a probationary period.

Presiding Disciplinary Judge Bryon M. Large signed off on the punishment in a Nov. 22 order.

Sean D. Williams, a researcher in professional and technical communication at Arizona State University, told KRDO in June that users of artificial intelligence cannot trust it to always produce accurate responses to their prompts.

“After we’re confident and comfortable with the information that we have, then we assume that the information it’s giving us is correct,” Williams said. “This case shows it wasn’t true and that’s the downside.”

Earlier this year, a federal judge in New York sanctioned a pair of lawyers who used an AI chatbot that similarly generated false case citations. The judge noted that such misconduct forces the opposing party to spend time and money exposing the deceit, weakens clients’ arguments and promotes skepticism of the legal system.

“And a future litigant may be tempted to defy a judicial ruling by disingenuously claiming doubt about its authenticity,” wrote U.S. District Court Senior Judge P. Kevin Castel.