
The Problem with AI Art Isn’t the AI, It’s What the Art Does to Society


I do not hate the idea of AI.

I realize that may be a bit hard to believe given the general content of these missives, but it is true. I do believe that what we somewhat lazily call artificial intelligence can be beneficial to society, if we use it correctly. There are areas of science where machine learning indisputably helps scientists make real advances, for example. And I am even intrigued by the idea of using generative AIs to help relieve loneliness for people who are isolated due to circumstances beyond their control; there will likely never be enough money in the health care system to provide personalized, one-on-one care for everyone in those situations.

Unfortunately, we don’t focus on those areas when we build AI systems. The recently concluded WGA strike and the ongoing SAG-AFTRA strike are good examples of these concerns. Apparently, one of the last holdups to a WGA contract was the use of imitative AIs to create content and the use of AI to replace actors. There is no benefit to society in those uses. All they do is extract money from the majority of people in the creative industries and put that money into the hands of the people who run the studios. It creates poverty and inequality and diminishes art for no good reason.

AI art is a dead end. These systems are not truly generative, and they do not learn, not in any meaningful sense. They make copies of existing material and, based on their math and their prompts, decide which word or pixel should come next. They can never create something that is not based entirely on what has come before. They can never advance. This is not how human creativity works.

A member of my writing group told me the other day that my writing reminds them of Terry Pratchett. This is both incredibly flattering and odd: I do not write satire. What I do write are sarcastic fantasy stories that leverage actual, if slightly warped, history to make their thematic points. Pratchett is an obvious touchstone for work like that to anyone who has read him closely. But an AI would not make the transformation I have (whether my writing is good or not is another question; we are discussing the process of creation). Such systems are not capable of that kind of growth.

Human beings do not, despite what some AI proponents would have you believe, merely remix everything they have ever read or seen. They transform it in ways that go beyond remixing. They filter what they learn through their own lived experiences, biases, and preferences and, in the best cases, turn that material into something different from what came before. If you study the history of the best Impressionists, for example, you can see that they started as masters of realistic painting. Only gradually did they invent their impressionistic style, taking what they had learned and turning it into something new.

A world where AI art replaces human art would be forever frozen in the remix stage, replaying over and over the same variations on the same themes it was trained upon. It would probably get worse and less interesting over time as it trained more and more on itself, remixing the remixes in a spiral of derivative recursiveness, a robot eating its own electronic tail. We would be making a few people rich on the backs of working people and at the expense of the long-term health of our own culture. The price is too high.

We live in a society, not an economy. An economy is an aspect of a society, and so it must be nurtured. But no one is entitled to rule that economy at the price of the health of the society. Society is what makes the economy possible, not the other way around. A world where the devil takes the hindmost is not one where MBAs thrive, after all. We should encourage useful implementations of AI and discourage socially harmful ones. If we were to, say, make the output of all imitative AI systems trained on copyrighted works public domain, then perhaps there would be more incentive to spend time creating systems that help cure cancer and less time creating systems that help make actors poor.

I am sure that there are other means of disincentivizing bad behavior (making these systems liable for their output, for example, might tamp down on misinformation), but the point remains: society comes before economy. This does not mean that there is no balance to strike. But it also does not mean that money can do what it wants to whom it wants, all while acting under the protection of the society it undermines. If some implementations of AI are harmful, they should be brought under control and/or the damage they cause should be paid for by their creators. Imitative AI is one such implementation, and we should not be afraid to take the appropriate steps.

