A federal judge’s denial of Anthropic’s bid to delay a copyright trial over alleged book piracy could provide a precedent for the music industry’s looming legal battle against the artificial intelligence company.
On Monday (August 11), US District Judge William Alsup rejected Anthropic’s motion to stay the case while the AI company pursues appeals of earlier rulings, including class certification and a partial loss at summary judgment.
Judge Alsup’s decision keeps the trial on track for December 1, 2025.
The most memorable line in Judge Alsup’s write-up? It comes in his rejection of Anthropic’s “death knell” argument – effectively a claim that being forced to trial before its appeals are resolved would be so costly that it would threaten the company’s ability to operate.
Alsup’s conclusion on that point: “If Anthropic loses big it will be because what it did was also big.”
The lawsuit, filed by authors including Andrea Bartz and Kirk Wallace Johnson, centers on the $61 billion-valued startup’s alleged downloading of millions of copyrighted books from pirate websites to build what the company described as a “research library.”
The judge wrote: “The machinations of Anthropic’s downloading of pirate libraries and its deployment of bit-torrenting to do so looms large in this case and should be fully before the jury, district judge, and court of appeals before we say the definitive extent to which this copying of pirated materials was fair or not.”
Judge Alsup added: “Similarly, the cause-and-effect relationship involved in its abandoning of bit-torrenting in favor of purchasing and scanning books for its ‘research library’ deserves to be explained in the record.”
“Anthropic has refused to come clean on this, even now, and for all we know, most were never used (or not solely used) to train LLMs.”
The judge also criticized what he called Anthropic’s “sweeping rule” that would allow AI companies “to pirate copyrighted materials without ever accounting for the extent to which the pirated materials were ever actually and solely used for a fair use.”
Judge Alsup stressed that fair use determinations require examining how much copyrighted material was taken and its purpose.
He wrote: “Here, we know that many books were downloaded from pirate sites with zero of their content ever being used to train an LLM. It’s the copyist’s burden on fair use to show how much it copied was used and for what.”
The latest ruling in the authors’ case could set a precedent on whether Anthropic can claim fair use protection for copyrighted material it copied but never actually used for training.
The decision comes as the music industry pursues its own legal challenges against Anthropic and other AI companies. In 2023, music publishers, including Universal Music Publishing Group, Concord, and ABKCO, sued Anthropic, accusing the AI firm of using copyrighted lyrics to train its Claude chatbot and alleging that Claude regurgitates those lyrics when prompted by users.
Back in May, US Magistrate Judge Susan van Keulen ordered Anthropic to respond to claims that a data scientist it employed relied on a fictitious academic article, likely generated by AI. The judge described the situation as “a very serious and grave issue.”
In response, lawyers for the AI company apologized to the court for using an erroneous citation generated by Anthropic’s AI in a court filing.
The publishers filed an amended lawsuit against Anthropic in May, which they say “bolsters the case against Anthropic for its unauthorized use of song lyrics in both the training and the output of its Claude AI models.”
Music Business Worldwide