AI Lawsuits Are Here, and They Could Change Everything

In 2021, OpenAI released the first version of DALL-E, forever altering how we think about images, art, and the ways in which we collaborate with machines. Using deep learning models, the system generated images from text prompts; users could create anything from a romantic shark wedding to a puffer fish that swallowed an atomic bomb.

DALL-E 2 followed in mid-2022, using a diffusion model that allowed it to render far more realistic images than its predecessor. The tool soon went viral, but this was just the beginning for AI art generators. Midjourney, an independent research lab in the AI space, and Stable Diffusion, the open-source image-generating AI from Stability AI, soon entered the scene.
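For readers curious about what this looks like in practice, here is a minimal sketch of text-to-image generation using Hugging Face's open-source diffusers library, which can run Stable Diffusion-style models locally. The checkpoint name, prompt, and settings below are illustrative assumptions, not code from any of the companies discussed in this article.

    # Minimal text-to-image sketch using the open-source "diffusers" library.
    # The checkpoint name and prompt are illustrative examples only.
    import torch
    from diffusers import StableDiffusionPipeline

    # Load a pretrained Stable Diffusion pipeline (weights download on first run).
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # example checkpoint identifier
        torch_dtype=torch.float16,
    )
    pipe = pipe.to("cuda")  # move to GPU; CPU works too, but is much slower

    # The diffusion model starts from random noise and iteratively denoises it,
    # guided by the text prompt, to produce an image.
    prompt = "a romantic shark wedding, digital painting"
    image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
    image.save("shark_wedding.png")

Hosted tools like DALL-E and Midjourney expose the same basic interaction, a text prompt in and an image out, through a service rather than local code.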

While many, including those in Web3, embraced these new creative tools, others staged anti-AI protests, raised ethical and copyright concerns, and questioned whether the “artists” collaborating with AI even deserved that title.

At the heart of the debate was the question of consent. If there is one thing that can be said about all of these systems with certainty, it is that they were trained on massive amounts of data: billions and billions of existing images. Where did those images come from? In part, they were scraped from hundreds of domains across the internet, meaning many artists had their entire portfolios fed into the systems without their permission.

Now, those artists are fighting back, with a series of legal disputes arising in the past few months. This could be a long and bitter battle, the outcome of which could fundamentally alter artists’ rights to their creations and their ability to earn a livelihood.

Bring on the Lawsuits

In late 2022, experts began raising alarms that many of the complex legal issues, particularly those surrounding the data used to train these AI models, would need to be answered by the court system. Those alarm bells became a battle cry in January 2023, when a class-action lawsuit was filed against three companies behind AI art generators: Midjourney, Stability AI (the company behind Stable Diffusion), and DeviantArt (for its DreamUp product).

The lead plaintiffs in the case are artists Sarah Andersen, Kelly McKernan, and Karla Ortiz. They allege that, through their AI products, these companies are infringing on their rights — and the rights of millions of other artists — by using the billions of images available online to train their AI “without the consent of the artists and without compensation.” Programmer and lawyer Matthew Butterick filed the suit in partnership with the Joseph Saveri Law Firm.

1/ As I learned more about how the deeply exploitative AI media models practices I realized there was no legal precedent to set this right. Let’s change that.

Read more about our class action lawsuit, including how to contact the firm here: https://t.co/yvX4YZMfrG

— Karla Ortiz (@kortizart) January 15, 2023

The 46-page filing against Midjourney, Stability AI, and DeviantArt details how the plaintiffs, along with a potentially unknowable number of others allegedly harmed by generative AI's copyright infringement, have had their intellectual property fed into the tools' training data sets without their permission.

A large part of the issue is that these programs don't just generate images based on a text prompt. They can also imitate the style of the specific artists whose work is included in the training data. This poses a serious problem for living artists: many creators have spent decades honing their craft, and now an AI generator can spit out close imitations of their work in seconds.

In an op-ed for The New York Times, Andersen details how she felt upon realizing that the AI systems were trained on her work.

“The notion that someone could type my name into a generator and produce an image in my style immediately disturbed me. This was not a human creating fan art or even a malicious troll copying my style; this was a generator that could spit out several images in seconds,” Andersen said. “The way I draw is the complex culmination of my education, the comics I devoured as a child, and the many small choices that make up the sum of my life.”

But is this copyright infringement?

The crux of the class-action lawsuit is that the online images used to train the AI are copyrighted. According to the plaintiffs and their lawyers, this means that any reproduction of the images without permission would constitute copyright infringement. 

“All AI image products operate in substantially the same way and store and incorporate countless copyrighted images as Training Images. Defendants, by and through the use of their AI image products, benefit commercially and profit richly from the use of copyrighted images,” the filing reads.

“The harm to artists is not hypothetical — works generated by AI image products ‘in the style’ of a particular artist are already sold on the internet, siphoning commissions from the artists themselves. Plaintiffs and the Class seek to end this blatant and enormous infringement of their rights before their professions are eliminated by a computer program powered entirely by their hard work.”

However, proponents and developers of AI tools claim that training on this material falls under the fair use doctrine, which permits limited use of copyrighted works without the rights holder's permission in certain circumstances.

When the class-action suit was filed in January of this year, a spokesperson from Stability AI told Reuters that “anyone that believes that this isn’t fair use does not understand the technology and misunderstands the law.”

What experts have to say

David Holz, Midjourney's CEO, made similar statements when speaking with the Associated Press in December 2022, comparing the way AI generators learn to the real-life process of one artist taking inspiration from another.

“Can a person look at somebody else’s picture and learn from it and make a similar picture?” Holz said. “Obviously, it’s allowed for people and if it wasn’t, then it would destroy the whole professional art industry, probably the nonprofessional industry too. To the extent that AIs are learning like people, it’s sort of the same thing and if the images come out differently then it seems like it’s fine.”

A complicating factor in any fair use claim is that the laws vary from country to country. The European Union, for example, applies different rules depending on the size of the company seeking to use a creative work, with more flexibility granted to smaller companies. The U.S. and Europe also differ in their rules on training data sets and data scraping. To this end, the location of the company that created the AI product is also a factor.

So far, legal scholars seem divided on whether the AI systems constitute infringement. Dr. Andres Guadamuz, a Reader in Intellectual Property Law at the University of Sussex and Editor in Chief of the Journal of World Intellectual Property, is unconvinced. In an interview with nft now, he said that the fundamental argument made in the filing is flawed.

He explained that the filing seems to argue that every one of the 5.6 billion images fed into the data set used by Stable Diffusion is used to create any given image, a claim he calls “ridiculous.” If that were true, he reasons, then any image created using diffusion would infringe on every one of the 5.6 billion images in the data set.

Daniel Gervais, a professor at Vanderbilt Law School specializing in intellectual property law, told nft now that he doesn’t think that the case is “ridiculous.” Instead, he explains that it puts two significant questions to a legal test. 

The first test is whether data scraping constitutes copyright infringement. Gervais said that, as the law stands now, it does not. He emphasizes the “now” because of the precedent left standing in 2016, when the U.S. Supreme Court declined to review an appeals court ruling that permits Google to “scan millions of books in order to make snippets available.”

The second test is whether producing something with AI is itself infringement. Gervais said that whether this is infringement (at least in some countries) depends on the size of the data set. With a data set of millions of images, he explains, it's unlikely that a resulting image will take enough from any specific image to constitute infringement, though the probability is not zero. Smaller data sets increase the likelihood that a given prompt will produce an image that closely resembles the training images.

Gervais also describes the spectrum along which copyright operates. On one end is an exact replica of a piece of art; on the other is a work merely inspired by a particular artist (for example, done in a style similar to Claude Monet's). The former, without permission, would be infringement, and the latter is clearly legal. But he admits that the line between the two is somewhat gray. “A copy doesn't have to be exact. If I take a copy and change a few things, it's still a copy,” he said.

In short, at present, it’s exceptionally difficult to determine what is and isn’t infringement, and it’s hard to say which way the case will go. 

What do NFT creators and the Web3 community think?

Much like the legal scholars who seem divided on the outcome of the class-action lawsuit, NFT creators and others in Web3 are also divided on the case.

Ishveen Jolly, CEO of OpenSponsorship, a sports marketing and sports influencer agency, told nft now that this lawsuit raises important questions about ownership and copyright in the context of AI-generated art. 

As someone who is often at the forefront of conversations with brands looking to enter the Web3 space, Jolly says there could be wide-reaching implications for the NFT ecosystem. “One potential outcome could be increased scrutiny and regulation of NFTs, particularly with regards to copyright and ownership issues. It is also possible that creators may need to be more cautious about using AI-generated elements in their work or that platforms may need to implement more stringent copyright enforcement measures,” she said.

These enforcement measures, however, could have an outsized effect on smaller creators who may not have the means to navigate the ins and outs of copyright law. Jolly explains, “Smaller brands and collections may have a more difficult time pivoting if there is increased regulation or scrutiny of NFTs, as they may have less resources to navigate complex legal and technical issues.”

AI Art by Stephan Vasement, Data Velvet, Jenni Pasanen, and MemoryMod.

That said, Jolly says she does see a potential upside. “Smaller brands and collections could benefit from a more level playing field if NFTs become subject to more standardized rules and regulations.”

Paula Sello, co-founder of the tech fashion house Auroboros, doesn't share those hopes. She expressed her sadness to nft now, explaining that current machine learning and data scraping practices affect less well-known talent. Artists are not typically wealthy and often struggle for their art, she noted, so it can seem unfair that AI is being used in an industry that relies so heavily on its human elements.

Sello’s co-founder, Alissa Aulbekova, shared similar concerns and also reflected on the impact these AI systems will have on specific communities and individuals. “It’s easy to just drag and drop the library of a whole museum [to train an AI], but what about the cultural aspects? What about crediting and authorizing for it to be used again, and again, and again? Plus, a lot of education is lost in that process, and a future user of AI creative software has no idea about the importance of a fine artist.”

For now, these legal questions remain unanswered, and individuals across industries remain divided. But the first shots in the AI copyright wars have already been fired. Once the dust settles and the decisions finally come down, they could reshape the future of numerous fields — and the lives of countless individuals.
