What follows is going to be more speculative than usual. I did reach out a few days ago to both the WGA East and the WGA West with this question, but neither has gotten back to me (unsurprisingly: I am a nobody, and they have real reporters to deal with in what I am certain is a very busy time). I also know no entertainment lawyers, so what I am about to say may be incorrect, and if it is shown to be so, I will apologize and correct it.
But.
But I think that the WGA contract may have done serious damage to the business model of imitative AI companies. The point of imitative AI, from a business perspective, is to replace people with machines. I wrote yesterday about why this is bad for people in the industry and bad for the larger culture and society, but this contract seems to drive a stake into the heart of a significant portion of the model. The contract, as reported, states that “AI-generated written material cannot be considered literary material written by a human”. This appears to do two things.
First, it means that studios cannot come to writers with half-assed “scripts” and then require them to “edit” them, meaning rewrite them into something filmable. It also appears that studios cannot option AI-written novels, stories, and the like as the basis for scripts either. This last is the point I have tried and failed to get clarification on. If true, then two major sources of revenue for imitative AI have been squelched.
Imitative AI is not cheap. To get something even vaguely passable requires a ton of training data. Even if you steal (I am sorry, even if you repurpose copyrighted material in a totally fair use manner) all that training material, you still have to store it and process it. None of that is cheap at the scale we are talking about. And then you have to pay some clever burgers to refine your models and code if you hope to improve the outputs and make the processes more efficient. If you cannot fire writers in the largest content creation industry (and if the VFX and actors’ unions get similar protections, as they likely will, you cannot fire artists and actors either), where are your profits going to come from?
Hollywood and associated industries are not the only content creation markets, of course. The right wing has proven willing to subsidize various forms of propaganda (its associated magazines and think tanks, for example), and imitative AI is very good at misinformation. And the self-publishing world already has some writers who make their living publishing small, very similar pieces on a rapid schedule. But how much money is there in misinformation? And a flood of AI writing in the self-publishing world seems bound to hit diminishing returns: self-publishing is already a precarious model with low returns for most practitioners. How much more can be wrung out of it by substandard AI knock-offs flooding the market?
Even if I am wrong about the option effect, just having most work require a human scriptwriter from the start ensures that the highest-paying written work will be off limits to imitative AI systems. And that is a good thing. AI art, as I have argued, is bad for the economy and bad for society. By limiting the rewards for it, maybe we can push AI into areas where it can do some actual good.