
UK Artists Potentially Joining Lawsuit Against Midjourney Over Use Of Their Work To Train AI Software 

Midjourney is one of many image generators now available to the public online that use artificial intelligence to produce an image, or series of images, based on a text prompt entered by the user.

AI technology has been on the rise over the past few years and has now firmly entered the mainstream. The ways in which these systems gather their training data, however, have been called out as unethical, or outright theft, by writers, artists, and other creatives whose works are being used to train the systems without their knowledge.

In Midjourney's case, a list recently surfaced of around 16,000 artists whose work was allegedly used to train the company's AI. According to the Guardian, the artists named include Bridget Riley, Damien Hirst, Rachel Whiteread, Tracey Emin, David Hockney, and Anish Kapoor.


UK artists have now contacted lawyers in the US to discuss joining a class action lawsuit against Midjourney and other AI companies engaged in similar practices.

Tim Flach, president of the Association of Photographers and himself a photographer included on the list of 16,000, stressed the importance of artists acting collectively when challenging AI programs and companies.

“What we need to do is come together. This public showing of this list of names is a great catalyst for artists to come together and challenge it. I personally would be up for doing that.”

The list of names appeared in a 24-page document submitted as part of the class action lawsuit filed by 10 American artists in California. The suit targets Midjourney, Stability AI, Runway AI, and DeviantArt.

Matthew Butterick, one of the lawyers representing the artists, said that since filing they have received interest from artists all over the world in joining the suit. The tech firms involved have until February 8th to respond to the claim, which states the following:

“Though [the] defendants like to describe their AI image products in lofty terms, the reality is grubbier and nastier: AI image products are primarily valued as copyright-laundering devices, promising customers the benefits of art without the costs of artists.”


The lawsuit also states that Midjourney in particular allows, and encourages, users to specify an individual artist's personal style in the prompt describing the image they want generated.

“The impersonation of artists and their style is probably the thing that will stick, because if you take an artist’s style you’re effectively robbing them of their livelihood,” Flach said.

The Design and Artists Copyright Society (DACS) conducted a survey last week of 1,000 artists and agents on the lack of legal regulation of generative AI technologies. The survey found that 89% of respondents want the UK government to regulate generative AI, while 22% had discovered that their own works were used to train AI.

“If we’d done our survey now [after the list had come out] we probably would have had a stronger response. A lot of people didn’t know whether their works had been used. There’s a transparency we didn’t have a couple of months ago,” said Reema Selhi, head of policy at DACS.

Selhi went on to note that ministers had initially wanted to loosen copyright law, making it easier for companies to train AI models on artists' works without seeking their permission.

“We’ve had such a great strength of feeling from people that this is completely copyright infringement. Permission hasn’t been sought. They haven’t given consent. They haven’t been remunerated. They haven’t been credited.”

DACS is actively pushing for an official licensing or royalty system so that artists have more control over where and how their works are used, or at the very least receive some form of compensation.