Stable Diffusion won't steal your art (if you tell them not to)
StabilityAI has announced that it will honour artists who choose to “opt out” of having their artwork included in the training data for the next iteration of its picture-making machine, Stable Diffusion 3.
Frankly, I think this is a bogus move. Of the millions of artists out there, only a tiny fraction are going to opt out, not because the rest want their data in the training set but because the process is fiddly and onerous. There are 5.8 billion images in the https://haveibeentrained.com/ database. Does StabilityAI seriously expect every affected artist to log on, search for their art, and opt out?
It’s a ridiculous process.
When you build a house, you don’t get to steal bricks from anyone who didn’t “opt-out” of having their bricks stolen.
You can read the thread on Twitter here: https://twitter.com/EMostaque/status/1603147709229170695?s=20&t=4cJ1ZW8q71HwJ4kk1RmTJw
In typical “tech bro” style, Emad (https://twitter.com/EMostaque) is keen to let you know that StabilityAI is “moving and maturing fast”. I’m not sure what level of maturity they need to reach before they realise that stealing is wrong. Maybe that will come in Stable Diffusion 4.