
Federal Judge Rules in Anthropic’s Favor—Mostly: Fair Use for Purchased Books, But Piracy Claims Head to Trial

June 25, 2025 12:10 · 3 min read

A judge just ruled that Anthropic can train AI on legally purchased books—calling it “spectacularly transformative.” But the 7M pirated books in its library? That’s going to trial in December, with up to $150K in damages per book on the line.




In a landmark ruling, a federal judge has determined that Anthropic’s use of legally purchased books to train its AI qualifies as fair use—a major win for AI developers. But the court also rejected any fair use defense for the 7 million pirated books allegedly stored in Anthropic’s digital library, allowing a class-action lawsuit filed by authors to proceed.

What the court said:

  • “Spectacularly transformative”: The judge likened Anthropic’s Claude AI to an aspiring author learning from the greats, not a plagiarist. That transformative use, the court said, leans in favor of fair use.

  • Authors’ case fell short: The plaintiffs couldn’t show Claude generating outputs substantially similar to their original works—undermining claims of direct competition or market harm.

  • Millions spent on books: Legal filings revealed Anthropic spent “many millions” purchasing physical books, which were then digitally scanned for AI training—a move the court saw as lawful.

  • But piracy crossed a line: Anthropic also downloaded and permanently stored millions of pirated books, a practice the judge ruled clearly violates authors’ rights.

  • Trial set for December: Anthropic now faces a willful infringement trial over the pirated materials, with damages of up to $150,000 per book on the table.

Why it matters:

This ruling offers a crucial early precedent: courts may consider AI training on legally obtained content as fair use—a green light for some, but far from a full legal shield. With AI labs under fire over opaque training datasets, this case is just the opening salvo in a long legal war over the soul of AI training data.



