For years, painters, photographers and other artists have flocked to Instagram to publish their works and make themselves known. Now, many are parting ways to prevent Meta, which owns Instagram, from using their art to train its artificial intelligence (AI) models.

They denounce Meta on their accounts, where many announce that they are migrating to Cara, an online portfolio for artists that prohibits AI-generated creations and AI training. In May, a Meta executive said the company considers any Instagram post fair game for training its AI. Shortly after, Meta notified European users that their data would be used for this purpose starting June 26. There is no way to opt out, although European law allows users to object to the use of their personal data.

According to AI companies, nearly the entire public internet can be used to train AI… which could replace the authors, musicians and visual artists who created that “training data.”

Tension is rising, and artists are caught in a bind: they need Meta's apps for visibility, but they cannot prevent AI from feeding on their works. Some say they are already close to losing their livelihood.

According to Cara founder Jingna Zhang, her platform grew from 40,000 to 650,000 users in the past week. At one point, it was the fifth most downloaded app, according to Apple. It is unclear whether this exodus will have any effect on Meta.

“I’m losing sleep over it,” says Ms. Zhang, a photographer and artists’ rights advocate. “We didn’t expect that.”

Many artists, including Ms. Zhang, are suing Google, Stability AI and other AI companies, accusing them of training their systems with online content, some of which is copyrighted. Authors and publishers, including George R. R. Martin (Game of Thrones) and The New York Times, are doing the same. According to the defendants, this use is permitted by the “fair use” doctrine, which authorizes the remixing and interpretation of existing content.

In the meantime, artists are scrambling to protect their future works, relying on unproven alternatives.

Cara, which launched for free in January 2023, is still under development and has crashed several times this week, overwhelmed by registrations, Ms. Zhang says. Available on iOS, Android and the web, its home page resembles that of Instagram, with “like”, “comment” and “repost” buttons.

Artist Eva Redamonti has looked at “four or five” alternatives to Instagram, but finds it difficult to assess which one best protects her interests. According to Ben Zhao, a computer science professor at the University of Chicago, several apps have lured artists with false promises, quickly turning out to be “AI farms” where their works are “scraped” (the technical term). Mr. Zhao and his colleague Heather Zheng created Glaze, a tool integrated into Cara that is designed to protect artists’ work from AI imitation.

Cara uses an AI-detection tool from the company Hive to catch violators, and tags each uploaded image with a “NoAI” label to discourage scraping. In practice, though, there is no way to stop AI companies from helping themselves.

According to some artists, AI has already taken away their income.

Ten years ago, Kelly McKernan, an illustrator from Nashville, joined Facebook and Instagram, which quickly became her best source of customers. But from 2022 to 2023, her revenue from that showcase fell by 30% as AI-generated images proliferated across the internet. Last year, she typed her name into Google and the first result was an AI-generated image mimicking her style.

Ms. McKernan and two other artists are suing AI companies, including Midjourney and Stability AI.

Independent illustrator Allie Sullberg moved to Cara this week, following the example of many artist friends who denounced Meta and deserted Instagram. She says she is outraged that Meta presents its AI products as tools for creators, even though creators gain nothing from the use of their works to train the AI.

Meta’s terms of use specify that all users accept the company’s AI policy. However, Ms. Sullberg notes that she joined Instagram in 2011, ten years before the 2021 launch of DALL-E, OpenAI’s first consumer image-generation model.

According to Meta spokesperson Thomas Richards, the company does not offer an opt-out option. “Depending on where they live […] and local privacy laws […], people may object to the use of their personal information to build and train AI,” he says.

Jon Lam, a video game artist and creators’ rights activist, spent hours searching Instagram for a way to protect his work from being scraped by AI. He found a form, but it applies only to European users, who are protected by privacy laws. Mr. Lam says he feels “anger and fury” toward Meta and other AI companies.

“Ten years later, it has become just a platform used to collect data” to feed their AI, he says.

Ms. McKernan says she hopes the big lawsuits filed by creators will prompt AI companies to change their policies.

“It’s complacency that allows companies like Meta to continue treating content creators, the very people who make them money, the way they do,” she says.