
Photo Credit: Anthropic
The high-profile copyright lawsuit between major music publishers and Anthropic, maker of the chatbot Claude, just took a dramatic turn. Anthropic’s legal counsel is accused of submitting a court filing containing an AI-generated hallucination: a citation to an academic article that does not exist.
Today a federal judge in San Jose ordered Anthropic to address allegations that one of its expert witnesses referenced a non-existent academic paper in the company’s court filing. The citation is purportedly from the journal American Statistician and was included in the filing to bolster Anthropic’s argument that the reproduction of copyrighted song lyrics is a “rare event.”
Attorneys representing Universal Music Group, Concord, and ABKCO discovered that the article cited in the filing does not exist. After checking with both the purported author and the journal, the plaintiffs confirmed the citation was a complete fabrication. Attorney Matt Oppenheim, who represents the music publishers, suggested that expert witness Olivia Chen relied on Anthropic’s own AI tool, Claude, to generate both the argument and its supporting authority.
Oppenheim stopped short of accusing Chen of deliberate misconduct, but he emphasized the seriousness of citing AI-generated falsehoods in a court filing. Meanwhile, Anthropic’s legal team has characterized the incident as an honest citation error, noting that the citation appeared to reference the correct article but linked to an entirely different one.
The music publishers allege that Anthropic unlawfully used lyrics from hundreds of songs by artists ranging from Beyoncé to The Rolling Stones to train Claude—and that Claude often returns those lyrics verbatim in response to certain user prompts. This isn’t the first time AI-generated hallucinations have ended up in court, either.
One of the first incidents was the Mata v. Avianca case in New York in 2023. Two New York attorneys representing a plaintiff in a personal injury suit against Avianca Airlines used ChatGPT for their legal research. The chatbot fabricated several non-existent cases, which the attorneys cited in their filings. After the judge discovered the fabrications, he imposed a $5,000 sanction on both attorneys.
At least seven cases across the United States have seen courts question or discipline lawyers for submitting AI-generated hallucinations in their legal filings.
This post was originally authored and published by Ashley King at Digital Music News.