An artist did not want his art used to train AI art models, and the company behind one such model, Stable Diffusion, partially did the right thing. It removed the ability to ask the system to produce work in his style. It was probably too late to remove his work from the training set, but that was at least something.
The “community” of users really likes this artist’s style, however, and so decided that what he wanted didn’t matter. They put the ability to request his style back into an open-source version of the system, over his known and explicit objections.
Who benefits from new technology and systems is often a matter of power. So-called free trade agreements are large and complex documents full of rules defining what is and is not allowed under them. Unsurprisingly, things like intellectual property and the medical professions receive many, many, many more protections in those agreements than, say, factory workers do. Earlier this year, one of the premier AI conferences forbade the submission of papers written by ChatGPT, though not papers merely edited with it.
You can tell a lot about the intentions of a movement by what it does. To the credit of the company behind Stable Diffusion, it honored the request of an artist who felt he was being unjustly harmed by the system. But the community around AI art rejected those wishes. The leading intellectual lights of AI took steps to protect their own ability to make a living on their intellects at the same time that tools like ChatGPT are being used to put that same ability at risk in other fields.
As a group, the people pushing imitative AI do not seem to like artists or writers very much. Whatever their protestations, their actions seem to indicate that what they really want is free art, regardless of who it harms. And no matter how often they claim that generative AI will help writers, the fact that the AI conference banned papers written by it certainly suggests that they are afraid of their own labor being replaced.
To be trite, when people show you who they are, believe them. The culture around AI is showing us that it does not have the best interests of artists and creatives at heart and that it does, in fact, believe its products can be used to replace human writers. We, as a society, need to act accordingly. We need to ensure that no system trained on material whose creators were not explicitly compensated is allowed to be put into use, we need to tax the revenue of these companies in order to support creatives, and we need to ensure that every system is held responsible for the harm its output produces.
Because it seems clear that the AI proponents as a group aren’t interested in the wellbeing of creatives and artists, much less society at large.