News
The lawyers blamed AI tools, including ChatGPT, for errors such as including non-existent quotes from other cases.
Anthropic has responded to allegations that it used an AI-fabricated source in its legal battle against music publishers, ...
Hallucinations from AI in court documents are infuriating judges. Experts predict that the problem’s only going to get worse.
By Ronil Thakkar / KnowTechie: Anthropic's lawyer admitted to using a faulty citation generated by Claude in a legal case against Universal ...
The flawed citations, or "hallucinations," appeared in an April 30, 2025 declaration [PDF] from Anthropic data scientist ...
Claude hallucinated the citation with “an inaccurate title and inaccurate authors,” Anthropic says in the filing, first ...
The AI chatbot was used to help draft a citation in an expert report for Anthropic's copyright lawsuit.
Anthropic on Thursday admitted that a faulty reference in a court paper was the result of its own AI assistant Claude and ...
Attorneys for the AI giant say the erroneous reference was an “honest citation mistake,” but plaintiffs argue the declaration ...
The chatbot added wording errors to a citation that made ...