I also agree that copyright law, as currently interpreted, does not support the copyrighting of AI art. I enjoy playing around with Stable Diffusion, and I would say that I have gotten pretty good at it, but I don't take ownership of the images that I generate, and I don't believe that I should be allowed to legally claim ownership over something that wasn't actually created by me.
But I still find the technology impressive and useful.
That said, I don't buy the "All AI art is art theft" argument, because it just doesn't hold up.
If I type "a moonlit beach, painting by [living artist]", it could be argued that I'm infringing on that artist, but if I just type "a moonlit beach, oil painting", who does it infringe on? Every artist the model was trained on?
If a work infringes on every artist at once, it's safe to say that it doesn't infringe on any artist at all.
@greycat I don't think that comparing intellectual property rights to financial securities is a great argument coming from someone who identifies as a communist. Marx was pretty clear about the risk of exchange value superseding use value (commodification).
In your comparison, "stealing from every bank" = "borrowing from every artist". While the former could represent systemic theft or fraud, the latter, IMHO, clearly represents fair use.
@Alex this could be true but in many cases the ai isn't borrowing equally from everywhere in its corpus (which to me would seem to be an unavoidable feature of the nonlinear way the models work). for example @Colophonscrawl has experimented a lot with a particular genre with which they are familiar, and can often pinpoint the exact artist whose style the ai is iterating on for a particular piece
@Alex i do agree that the idea that just being in the corpus for a model is a violation of artists' moral rights somehow doesn't really hold water
@Alex i think the stronger argument, tho probably not the one most ai advocates would want to make, would be that the ai isn't actually capable of copying either the style or the subject matter of artists in the way that is alleged. the result invariably ends up looking like 'ai trained on x' instead of x
@esvrld this is interesting input. It definitely could be an issue that, for certain niche art styles in the corpus, the model ends up zeroing in non-trivially on a specific artist's style. I hadn't fully considered this; I'd been assuming that, absent an artist's name in the prompt, the output would be more generic. This immediately has me feeling more skeptical of third-party models that don't share their datasets and could be weighted in such a way as to be infringing by default.
@esvrld I think your proposal that "ai isn't actually capable of copying either the style or the subject matter of artists in the way that is alleged" is philosophically interesting, but as a cautious AI advocate I'd counter that, in practical terms, AI is very capable of copying an artist's style to the level of infringement, and that responsible, ethically informed use of generative models should be a priority for anyone exploring this tech.
@esvrld There are also some holes in my philosophy as I consider this. There are a few models out there that are unabashedly designed to copy a specific style. I have a Pixar-themed model that I have played with that is undoubtedly trained on Pixar's IP.
But I feel substantially less conflicted about infringing on a $7.4B company, and I should be more nuanced about how that same kind of targeted training, aimed at smaller artists' work, could be damaging to those artists.