Authors sue Anthropic for copyright infringement over AI training
The Anthropic logo is seen in this illustration taken May 20, 2024.
Aug 20 (Reuters) - Artificial intelligence company Anthropic has been hit with a class-action lawsuit in California federal court by three authors who say it misused their books and hundreds of thousands of others to train its AI-powered chatbot Claude.
The complaint, filed on Monday by writers and journalists Andrea Bartz, Charles Graeber and Kirk Wallace Johnson, said that Anthropic used pirated versions of their works and others to teach Claude to respond to human prompts.
A spokesperson for Anthropic said on Tuesday that the company was aware of the lawsuit and assessing the complaint but declined to comment further, citing pending litigation. An attorney for the authors declined to comment.
The lawsuit joins several other high-stakes complaints filed by copyright holders including visual artists, news outlets and record labels over the material used by tech companies to train their generative artificial intelligence systems.
Separate groups of authors have sued OpenAI and Meta Platforms (META.O) over the companies' alleged misuse of their work to train the large language models underlying their chatbots.
The case filed Monday is the second against Anthropic, following a lawsuit brought by music publishers last year over its alleged misuse of copyrighted song lyrics to train Claude.
The authors said in their complaint that Anthropic has "built a multibillion-dollar business by stealing hundreds of thousands of copyrighted books." Anthropic has drawn financial backing from sources including Amazon (AMZN.O), Google (GOOGL.O) and former cryptocurrency billionaire Sam Bankman-Fried.
According to the complaint, the authors' works were included in a dataset of pirated books that Anthropic used to train Claude.
The lawsuit requested an unspecified amount of monetary damages and an order permanently blocking Anthropic from misusing the authors' work.