Landmark Anthropic Ruling: What it means for copyright owners in the age of AI

What matters

The rise of AI has sparked legal disputes over using copyrighted data for training. In the US and UK, copyright holders have sued developers. A US court recently ruled that Anthropic’s use of digitised books to train AI qualifies as fair use.

The rapid evolution of AI has led to concerns within the creative industries around the legality of using data, in the form of images, text or other copyright works, to train AI systems.

In the absence of firm legal guidance, a number of claims have been brought by copyright owners against AI systems developers, both in the US and UK courts.

One of these claims was recently the subject of a summary judgment application, with the U.S. District Court for the Northern District of California ruling in favour of AI company Anthropic PBC. The court found that Anthropic's digitising of books (purchased by it in print form) and their subsequent use to train Anthropic's large language models (LLMs), including its Claude chatbot, constituted fair use under U.S. copyright law.

The case, Bartz et al. v. Anthropic PBC, was brought by a group of authors who alleged that Anthropic had used their copyright works, some of which were purchased and others allegedly pirated, to train its AI systems without permission or compensation. The claimants argued that this amounted to large-scale copyright infringement.

A win for AI developers - with limits

The court sided with Anthropic on two fronts. Firstly, it held that the purpose and character of using books to train LLMs was spectacularly transformative, likening the process to human learning. The judge emphasised that the AI model did not reproduce or distribute the original works, but instead analysed patterns and relationships in the text to generate new, original content. Because the outputs did not substantially replicate the claimants' works, the court found no direct infringement.

Secondly, the court ruled that Anthropic’s digitisation of lawfully purchased print books, by scanning them for internal use, also amounted to fair use. This was seen as a form of format shifting, akin to converting a CD to MP3 for personal use, and did not involve redistribution or commercial exploitation of the scanned copies.

However, the court did not grant Anthropic a complete victory. It declined to give summary judgment on the authors' claims regarding the use of pirated digital books, allowing those claims to proceed to trial. The judge criticised Anthropic's alleged effort to build a "central library of all the books in the world," including millions of unauthorised digital copies downloaded from pirate sites. This aspect of the case remains unresolved and could still result in liability for the company.

Implications for copyright owners

This ruling is the first major judicial endorsement of fair use in the context of AI training and sets a powerful precedent in the U.S. for how courts may treat similar claims. For copyright holders, the decision is a double-edged sword:

  • on one hand, it narrows the path to successful infringement claims if the AI model’s outputs do not closely mimic the original works.
  • on the other, it leaves the door open to litigation where pirated or unauthorised copies are used, or where outputs are substantially similar to protected content.

Although the ruling has no binding effect in the UK, it highlights the urgent need there for legislative clarity for both AI developers and copyright owners. As AI systems become more sophisticated and widely deployed, the legal framework governing the use of copyrighted material in training remains fragmented and uncertain, especially outside the U.S.

For authors, publishers, and other rights holders, the Anthropic decision is a wake-up call: monitoring how their works are used in AI training and advocating for clearer licensing mechanisms may be the most effective path forward.

Disclaimer

This information is for general information purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. Please contact us for specific advice on your circumstances. © Shoosmiths LLP 2025.
