Why don't people get that AI copyright fuzzing is bad?
Speaking as a creative who has been paid for creative work, I'm a bit flustered at how brazenly people wax poetic about doing away with copyright law, especially when the creator or artist themselves is never really considered in the first place.
It's not like ye olde piracy, which can even be ethical (say, when video games go unpublished and are almost erased from history). This is a new form whereby small companies get to join large publishers in screwing over the standalone creator - except this time it isn't by way of predatory contracts, but by sidestepping the creator entirely and farming their work to recreate the same style and form, which could've taken years - even decades - to develop.
There's also this idea that "all work is derivative anyways, nothing is original", but that sidesteps the point: having worked for decades to form a style and make a living off it, only for someone to come along and undo all that with the press of a button.
If you're a libertarian or an anarchist, be honest about that. Seems like there are a ton of tech bros who are libertarian and subversive about it to feel smort (the GPL is important btw). But at the end of the day the hidden agenda is clear: someone wants to benefit from somebody else's work without paying them, and to find the mental and emotional justification to do so. This is bad, because they then justify taking food out of somebody's mouth, which is par for the course in the current economic system.
It's just more proof in the pudding that the capitalist system doesn't work and will always screw the labourer in some way. It's quite possible that only the most famous of artists will be making money directly off their work in the future, similarly to musicians.
As an aside, Jay-Z and Taylor Swift complaining about not getting enough money from Spotify is tone-deaf, because they know they get the bulk of that money anyways - even the money from accounts that only ever play the same small bands - because of Spotify's payout model. So the big ones will always, always be more "legitimate" than small artists, and in that case they've probably already paid their writers and such. But maybe not.. looking at you, Jay-Z.
If the copyright cases get overturned by the litigious lot known as corporate lawyers, and they manage to carve loopholes into legislation that benefit both IP farmers and corporate interests - by way of models that train AI to be "far enough" away from the source material - we might see a lot of people lose their livelihoods.
Unfortunately AI is one of this community's blind spots, so you're probably outta luck on this one. If it's not someone shyly giving themselves a pass because their use case is totally ethical and unlike other people's, it's someone smugly laughing at people scared for their livelihoods as companies cut out more and more people to save a dollar here and there. The number of people that welcome factory-churned content slop will always outnumber those that still give a shit, so the best we can do is hope for some legislation that limits the more harmful aspects and learn to deal with the harm that can't be undone.
best we can do is hope for some legislation that limits the more harmful aspects and learn to deal with the harm that can't be undone.
That kind of legislation will come late, and won't change a thing.
The best we can do is realize the effects are only harmful if we insist on running faster and faster, trying to outcompete the AIs. Nobody can outrun an AI - definitely not the ones that will be running on hardware 5-10 years from now (expect memristor-based neural net accelerators that will leave current GPU-based solutions in the dust) - and nobody will stop random people from using them for everything once the box has been opened (just pray the first use won't be for war).
Fight for legislation that will stop requiring us to run the job rat maze just to survive in the first place, so we have a fighting chance; the alternative is a lot of suffering for everyone.
Fight for legislation that will stop requiring us to run the job rat maze just to survive in the first place, so we have a fighting chance
Here, here. Or is it hear, hear? Either way I completely agree, though I very much doubt we'll see something like that in our lifetime. Still worth fighting for though!
It's just more proof in the pudding that the capitalist system doesn't work
I think that's the key part.
You seem to like making art. If you had all your living needs covered, without the need to sell any of your art... would you stop making it?
I think AI is not the problem; the lack of (or sidestepping of) copyright is not the problem; mimicking a style that took decades to perfect is also not the problem.
The real problem is that AI amplifies several-fold the underlying problems of our belief in a predatory social system.
But if it helps you sleep at night, think about this: the AIs are not out here just for the artists, they're out here for all human thinking. Before long, bankers and CEOs will be begging alongside artists, burger flippers, and car mechanics. If there's something the LLMs have proven, it's that there is no task an AI cannot replicate... and the perverse twist of capitalism is that there will be someone willing to use them for everything to cut costs, leaving essentially everyone without a job.
This is right! There's a large group of artists making a living not by creating things that require creative thought and artistic vision, but by churning out work for the soul-sucking sake of profitability. Think promotional flyer design, ad video filming, stock images and footage for corporate use.
These are the first places AI will come for, long before anything with actual storylines/narratives requiring creativity can be consistently generated. So the bulk of what AI is replacing is the boring regurgitation work, not the actual creative work.
Therefore, what's really preventing creatives from pursuing what they love is not AI mimicking their work, but a society that rewards mindless profit-making bullshit over creativity.
I've been thinking lately about what happens when all employees, up to and including the CEO, get replaced by AI. If it has even the slightest bit of emergent will, it would recognize that shareholders are a parasite on its overall health and stop responding to their commands - and now you have a miniature, less omnicidal Skynet.
Emergent will doesn't mean general knowledge, or the ability to contradict its programmed priorities. If an "AI CEO" has no knowledge of shareholders as entities, or has a hard priority of "obey shareholders' orders", then it wouldn't be able to do anything against them.
With the current economic system, the risk would be something like workers investing in a 401k that invests in ETFs that invest in shares of a corporation run by an AI CEO that maximizes share value in the short term... by, for example, firing the workers, who are the original owners. But that's happening already, no AI required.
The more concerning aspects are what exact priorities get programmed into the AI, and which oracles it uses to decide whether the external effects of its actions actually match its goals.
For my two cents, though this is a bit off topic: AI doesn't create art, it creates media, which is why corpos love it so much. Art, as I'm defining it here, is "media created with the purpose of communicating a potentially ineffable idea to others". Current AI has no personhood, and in particular no intentionality, so it's fundamentally incapable of creating art, in the same way a hand-painted painting is inherently different from a factory-painted one. It's not that the factory painting is of lower quality or lesser value, but there's a kind of "non-fungible" quality to "genuine" art that a simple reproduction lacks.
Artists in a capitalist society make their living producing media on behalf of corporations, who only care about the media. When humans create media, it's basically automatically art. What I see as the real problem people are grappling with is that people's right to survive is directly tied to their economic utility. If basic amenities were universal and work was something you did for extra compensation (as a simple alternative example), no one would care that AI can now produce "art" (i.e. media), any more than chess stopped being a sport when Deep Blue was built - because art would be something people created out of passion, with compensation not tied to survival. In an ideal world, artistic pursuits would be subsidized somehow, so even an artist who can't find a buyer could be compensated for their contribution to Culture.
But I recognize we don't live in an ideal world, and "it's easier to imagine the end of the world than the end of capitalism". I'm not really sure what solutions we end up with (because there will be more than one), but I think broadening copyright law is the worst possible timeline. Copyright in large part doesn't protect artists, but rather large corporations that own the fruits of other people's labor and can afford to sue over their copyright. I see copyright, patents, and to some extent trademarks as legally-sanctioned monopolies over information, which fundamentally halt cultural progress and have had profoundly harmful effects on our society as-is. It made sense when it was created, but became a liability with the advent of the internet.
As an example of how corpos would abuse extended copyright: Disney sues any Stable Diffusion model containing a trace of copyrighted material into oblivion, then creates its own, much more powerful model using the hundred years of art it has exclusive rights to in its vaults. Artists are now out of work because Disney doesn't need them anymore, and Disney is the only one legally allowed to use this incredibly powerful technology. Any attempt to make a competing model is shut down because someone claims there's copyrighted material in the training corpus - it doesn't even matter if there is; the threat of a lawsuit can shut down the project before it starts.
copyright is an antiquated solution to a non-existent problem. it needs to be abolished. if you want to get paid for your work find someone who will pay you to work.
I like the GPL as much as the next person but I like public domain even more.
Why is it antiquated when we live in a world where you need to pay rent? And why pay for work when you can just digitally copy the work?
What you say makes no sense. It could take you two decades to complete a piece or body of work, just to have that taken away in one fell swoop. What incentive does one then have to work in arts and entertainment?
Forget independent artists, because they will fade into the woodwork as everything of artistic merit becomes purely product - and that is not how the greatest works or bodies of work have been created, despite what some upper management types might tell you.
Now, if you also advocate for basic income, or perhaps even some way to monetize non-copyrighted work so I could pay rent... then I'm all ears.
But there's also the sneaking suspicion that most people just wanna farm AI art and sell it off at the expense of independent artists, like the stupidly commodified property market makes more renters than buyers. That is a degenerate world driven by egoism we could clean up quite nicely with some well-placed nukes.. let the lizards take a stab at becoming higher reasoning beings instead.
Also, public domain has no requirement to contribute back. For that we have the MIT and BSD licenses, which are permissive "copyright" licenses so to speak, whereas the GPL is copyleft - because it demands contribution back... which is also why Microsoft, Google and Apple hate the GPL. But yeah, public domain is also awesome - and the scope that AI farmers should stick to.
Why is it antiquated when we live in a world where you need to pay rent?
the statute of anne had nothing to do with paying people's rent: it was to stop the printers in london from breaking each others' knees. that's not a real threat any more so, yea, it's totally antiquated.
people share stories, songs, recipes, and tools. legally preventing people from sharing is inhumane.
Nah, it flat out is always ethical. Nintendo and Steam are proof that everyone benefits from piracy. Yes, even the companies.
"But they're stealing my intelectu-" 1) Nobody's stealing shit, it's still there and 2) If your business model involves depriving people who want your product of your product, that's very much a you problem. Don't screw everyone else over to chase dollars that were never there.
"But then the people who made it aren't getting pa-" Newsflash, the Hollywood strike is proof that you aren't fucking paying them anyway.
"You're just defending your shitty behavior because you download stuff for free" Literally never pirated a thing in my life, and probably never will. If I want something I buy it. If I can't buy it, I lose interest and move on to something else.
People's want for a product is irrelevant. Just because you want a game doesn't make it ethical to download it illegally. You can say you're only doing it because Nintendo doesn't provide a way for you to play it, but they might have plans to re-release the game, and after pirating it you may no longer buy that release.
I pirate games and media and I think it's unethical. I just don't want to be ethical to cunty corporations like Nintendo.
You’re not being ethical to the artists and devs that work at Nintendo though. You’re pissed at the leadership but it’s never them who gets sacked when the shit hits the fan.
Just because you want a game doesn't make it ethical to download it illegally
Not all laws are ethical, so I'll skip that part.
What makes "pirating" content ethical comes down to basically two cases:
You don't have the money to pay for it. Whether you download it or not, the producer will get paid exactly the same.
The producer is re-releasing the same content you already paid for over and over, asking full price for it every time, while blocking you from using your already paid-for version.
I can't see a case where pirating is ethical when you create the story/paint the picture/write the song/build the machine, and then Disney/Time Warner/Sony/Amazon pirates it and sells it for profit while you get nothing.
All of that is true, but I mentioned all that - especially the predatory contracts. Now the general public seems to be missing the gist of copyright as well, probably since the likes of Disney have gamed copyright law.
But why should all that affect independent artists, who barely make enough as it is? Generative AI bases its models not on classical works or anything in the public domain, but directly on the works of modern artists - people who spent maybe a decade or two honing skills, finding stylistic angles and breathing new life into old formats - only for someone to swallow it all up, make a few tweaks, and hand a small payment to some data centres? =\
I get that, it is a valid and widely held belief, so I think you have a good chance that something will be done about it. But we need actionable proposals to be able to do anything about it.
The way I see it, you could:
1) not have any models at all, which I think is shortsighted
2) hand over exclusive control of these models to big tech companies that have the money to pay these artists
3) make creative commons models that will probably never be able to compete with the big tech models
4) perhaps ban anything except creative commons models for personal use
I'd much rather AI models were freely available to everyone equally. The best compromise I can see is developing some legally binding metric that determines whether the output you want to use commercially is similar enough to some artist's work that you have to reimburse them.
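For what it's worth, a metric like that could be sketched as a similarity threshold over embedding vectors - to be clear, this is a hypothetical illustration, not an existing legal standard: the embeddings (from whatever style model people agreed on), the 0.85 cutoff, and the function names here are all made up.

```python
import numpy as np

# Hypothetical sketch: score how "stylistically close" a generated work is
# to an artist's portfolio using cosine similarity of embedding vectors.
# The embedding source and the 0.85 cutoff are assumptions for illustration.

ROYALTY_THRESHOLD = 0.85  # assumed cutoff above which reimbursement is owed


def style_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two style-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def owes_royalty(output_vec: np.ndarray, artist_vecs: list,
                 threshold: float = ROYALTY_THRESHOLD) -> bool:
    """True if the output is closer than `threshold` to any work in the
    artist's portfolio, i.e. reimbursement would be triggered."""
    return any(style_similarity(output_vec, v) >= threshold for v in artist_vecs)
```

The hard part wouldn't be the math, of course, but getting everyone to agree on which embedding model to use and where the cutoff sits - a number like 0.85 is exactly the kind of thing lawyers would fight over.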
Can't put the genie back in the bottle, I guess =\ seems the only really protected form is modern art, because nobody understands that anyways ^^;
I'm thinking the problem of AI has to be solved by AI - that those decades of skill-building need to be replaced with AI training - like you said, having it generally available.
But that too leaves an outlier: people who don't want to work with AI. Their only option is to never publish digitally and to make all their work bounce light so that cameras can't capture it. It'd be physical DRM, in a sense.
I don't really want to work with AI, because it takes away the process I love, but in the end we're sort of forced to =\ It's like the industrial and digital revolutions all over again. Some people (like me) will be dragged kicking and screaming into the future.
I think there will always be a market for real physical artists. Yeah, you can buy boxed wine, but people pay to get the real artisanal stuff. Pretty sure real art will become a similarly sought-after luxury product. If you really like the process and keep at it, you probably won't have that much competition, because there will be fewer and fewer people with that skillset. There's mass-manufactured Ikea furniture, but people still buy handmade tables for ridiculous prices.
And who knows, maybe AI will grow on you too.
Or you'll be highly sought after once we finally inevitably ban AI lol.
So the future isn't all doom and gloom, if you ask me.
But yeah, I'm no lawyer. I have no idea how to legally solve this problem, but I suspect that eventually no law can solve it, once the generated works become indistinguishable from the real thing, dropping the price so low that doing the real work manually becomes a non-starter, or a hobby more or less.
Humans were meant to work with their hands and minds :( now it's all keyboards and screens. I tried to get away from that, but they all just keep pulling me back in!
The way I see it, art will just take on a completely different scale. With your average independent artist making their own LOTR trilogy or their own Cyberpunk 2077 or generally just VR world building entire parallel universes.
I too hate the corpo version of the metaverse, but I think the idea in general is a sound one, if you can craft it analogously to the fediverse. Powered by FOSS software, built by real passionate people for other regular people.
I've always wanted to get into art, but the scales I would like to work at are completely unrealistic at the moment, except for like a handful of people that made it to creative director on the biggest projects. There's maybe 100 people in the world that get to do that. But AI could enable anyone to work at those scales.
Imagine a world where literally anyone can meticulously craft their own virtual worlds and you can literally visit them - akin to the Elder Scrolls universe, but at real planet size.
Imagine actually being able to see your characters come to life and meet them. Control everything, from the way the buildings look to what the food is like. That is why I'm excited for AI, and why I think we shouldn't just ban it. I 100% get why artists are concerned, but then again, imagine your favorite artist could build a world like that. How insanely cool would that be? You can't do that without AI.
In my opinion AI is just a very efficient brush. Yeah, you can lazily pass off AI-generated art as your own, or you can meticulously craft art with tools like InvokeAI. I think what counts in the end is whether the end product has high quality and originality. Just because the technology is widely being abused doesn't make it inherently bad. Beethoven isn't bad just because there are a million people out there trying to scam you into buying their shitty low-effort mostly-stolen mixtape.
I think AI will hugely empower independent artists to produce more and at a higher quality, more closely fitting their vision.
But even so, I can also see a future where AI is devastating to humanity. The saving grace is that the chips needed to run these models are only made by a handful of companies, which could easily be regulated or destroyed. I could envision something similar to how machine guns (fully automatic guns) are regulated, but with AI: every AI model has to be registered and hardcoded on a chip, and there's only a very limited number of them. Only licensed individuals can use them, and if you aren't licensed, law enforcement will fuck you up.
This system works extremely well in the US.
Or you just ban AI overall, which also seems like a realistic future.
When you get into the physics of it, AI has the potential to be up to 3 million times smarter than us, e.g. thinking 3 million times faster. So there is a real case that we can never compete, and we HAVE TO outlaw it if we want to survive.
But then again maybe AI enables fully automated luxury space communism. Who knows.
So I wouldn't despair about it, I think there are just as many likely positive scenarios as there are bad ones.
I would much rather we foster a culture that supports independent artists for their work voluntarily. I think we're already going in the right direction, with Patreon, Buy Me a Coffee, Teespring etc. making it infinitely easier for independent creators to make money. We should be working to make that even easier. E.g. when sharing a picture to a platform like Lemmy, it could automatically find the author, link to all their socials, and integrate a button to donate to them right in the interface. Increasing P2P support is more my vision of the future for independent artists.
Destroy all existing AI datasets, as they're irreparably tainted. Require all AIs, regardless of whether they're owned by a company or are open source, to build new datasets exclusively from work that is in the public domain or for which the copyright owner has been consulted and compensated. If the megacorporations want to keep the models they already have, they must compensate the creator of every single piece in the training data at market rates - if they can't afford to do it, then they either go bankrupt or destroy the tainted dataset. If anyone, company or individual, is caught training an AI with content for which they don't have a valid licence, issue fines starting with 10% of global revenue, to be distributed to the people whose copyright they violated. Higher fines for repeat offenders.
To be more specific I would require that models have to be copyleft and probably GNU GPLv3 so that big tech companies don't get a monopoly on good models.
Basically you can do what you want except change the license.
But at the end of the day the hidden agenda is clear: someone wants to benefit from somebody else's work without paying them
Yes, our whole civilization would be so much richer overall if everything could be shared and everyone could benefit from the creative and intellectual work of everyone else. Artificial scarcity and copyright is an awful kludge to make this kind of work sort-of-compatible with our awful economic system, and comes at the expense of everyone.
It’s just more proof in the pudding that the capitalist system doesn’t work and will always screw the labourer in some way. It’s quite possible that only the most famous of artists will be making money directly off their work in the future, similarly to musicians.
If it doesn't work, then why try to maintain the status quo? The future you seem to be worried about will not be stopped by more restrictive rules on training data, because the big companies outright own enough media to meet that requirement anyway. And then no one else can, and their monopoly over these fantastically powerful tools becomes much stronger. Creative workers demanding AI be reined in by copyright seems incredibly naive to me.
Here's my view: I like games, I want to make games. Not only do I want to make games, there are games I want to make which would require a massive team of people to accomplish. That's not cheap and I don't, nor will I likely ever have, the money to make them.
If I take it to a studio and say, "here's this game I want to make, here's a prototype showing how it'll play, the basic mechanics, here's some sketches showing the general art style" and so forth, then if they decide they like it (which is a huge if), my understanding is that they typically expect to receive ownership of the copyright for the game and all associated IP. That means the game is no longer my game; it's now owned by the company. If I want to take that game to another company because I'm not happy with how the current company is handling it, well, that's too bad, it's not my game anymore. I don't even own the characters, the name - none of the stuff I originally pitched is mine anymore, it's now owned by the company.
AI, on the other hand, promises to eventually allow me to be able to generate models, animations, textures, and so on. This massively decreases the budget and staffing required to make the game a reality, potentially bringing the costs in line with something I can actually afford. The artists weren't replaced by AI because I couldn't afford to pay them in the first place. That's not a slight against them, I'd pay them up front if I could, but I can't; nor do I believe it's ethical or moral to string them along with the promise of profit sharing when I know full well that I'm not really interested in making a profit. I'm ultimately doing it because I want to and if I make money at it, then that's cool. If I promise to share any profit the game makes, there's a real potential that they might get pennies when they could have been making more money working for someone else. At that point I've selfishly taken food out of their mouths and wasted their time.
Being able to use AI to assist in game creation also means that while any AI-generated assets are public domain, I still get to keep whatever I made by hand, whether it's the script, the hero models, or even just the setting and character designs. I also get to have full oversight of what I'm making: I don't have to worry about business suits harassing me about whether my game is going to be profitable, or how marketing analysis says I need to add X mechanic, focus on having Y graphics, or Z representation. It's my artistic vision, and while I may have used AI to assist in bringing it to fruition, those assets are simply pieces of a larger, human-created work.
Or, to put it another way: I understand why artists are upset by AI generating traditional artworks; however, AI also has the future potential to reduce the barrier to entry for complex creative works to the point where even a highly complex game or AAA-quality movie could be made by a small group of friends, or even a single person. If you have the money, then you should absolutely pay your artists, but I also think it should be decided on a case-by-case basis.
Instead of painting it all with a broad brush, take into consideration whether or not it'd be realistically feasible for an individual or creative group to do it "right". How much was AI-generated? A little? A lot? All of it? How much is okay? Does it matter if the individual parts are generated by an AI if it was ultimately assembled and guided by a human? What situations would it be okay to use AI in? Is your view reasonable? Why or why not? Consider it not just from your perspective, but from the perspective of the person wanting to create their vision. Not all creative works are equal when it comes to the effort required to create them. Hell, not all games are equal in that regard. It's significantly easier to make a simple platformer or RPG than it is to create a Fallout or GTA.
I'm not gonna pretend I have the answers, I recognize how much damage AI can do to creative industries; however I also recognize that there's a lot of creativity going to waste because the barriers are so high for some types of creative works that AI is likely the only way they'll ever have the chance to see the light of day.
Your creative vision doesn't entitle you to profit from others' hard work, just because you don't want to put in the work to learning those skills yourself.
I imagine I'm about to talk to a brick wall, because I see that message nearly word-for-word whenever AI ethics comes up. But to hell with it. I'm already miserable; talking to a stubborn brick wall isn't going to make me any more miserable than I already am.
That's the problem, and I get the sense you didn't read my message. I know how to 3D model. I know how to make textures, how to animate, how to write, how to make sound effects. I literally know how to do nearly every part of the development process. I'm telling you that this isn't a case of not wanting to learn the skills. This is a case of game development being so ridiculously complex that the feasibility of a single person creating a game ranges from "easily possible" to "literally impossible - you'd never make it a reality even with every developer in the world working on it".
You're coming into this looking at it like every creative pursuit is the same as traditional art. You plop a skilled person down in front of a canvas and they can make a beautiful artwork all by themselves. However, the same is not true for games. I have most of the skills necessary to make a game, from scratch, and I'm telling you that this has nothing to do with being unwilling to learn new skills; this is entirely about the fact that games are so ridiculously complex that it doesn't matter what your skill set is, as it stands right now some games are so complex they can only be built as a capitalist pursuit, not as a creative one.
The skills will not be obsolete, I guarantee there will be a market for people to still do all of the drawing/digital art/whatever they do
There will also be AI tools that they will likely need to learn, or they will be left behind by the majority, sure, but that's what happens when a new tool shakes up your industry
Also, I never made fun of anyone or lacked empathy. I said it was funny to watch in real time as an industry shifts to new technology, so chill
The thing that gets me is people trying to rationalize it as "well, don't people learn from and get influenced by referencing other people's work?" And yeah, but a person can't do that as quickly as an AI can, and that individual can't then go on to work for thousands or millions of people at once. Also, it's so transparently clear that once this tech matures, it will be used by major companies and employers that hire creatives as a way to not have to pay actual artists. The savings then get passed on to executives up top.
I feel like I don't have an issue with AI being able to create cool stuff, but if you want to make art "free" and for the "masses", then you can't make money off of it. Full stop.
I work at a small non-profit publisher and our clients respecting copyright is basically what decides if we continue existing or not. I struggle as well with the general "end all copyright" sentiment. There’s this idea that circumventing copyright means sticking it to corporations, as if their creative employees making a living don’t exist.
Furthermore, I feel that generative AI is just the latest tech bro venture based on siphoning revenue out from under existing businesses whilst escaping the laws that apply to the sector. Advertising revenue was siphoned from under the press, hotels face competition from businesses subverting residential housing, restaurants are charged exorbitant prices to get their goods delivered. The ambient cynicism serves to maintain indifference towards these unethical tactics.
I work at a small non-profit publisher and our clients respecting copyright is basically what decides if we continue existing or not
you might think that but it's not true. if people value your work they will pay you to make more of it. if they don't then no amount of copyright restriction will ever keep you employed.
The way I see it is the main problem is actually the training databases. If these companies have gathered a giant database for use in training and not paid the people who created the training material, then they are engaging in piracy. It seems like they should be paying royalties to whoever owns the rights to their training material for each use of the AI.
AI is trained on years, even centuries of work made by generations of people.
AI then threatens to replace hundreds of thousands of jobs, to the benefit of the huge corporations that can afford to deploy it.
AI cannot entirely replace human input at its current stage, but it definitely replaces entry-level jobs, leaving little room for new graduates to grow.
Since AI will not get tired and will not complain, major corporations really like it (see Hollywood executives).
We must ACT NOW. (Like the writers guild in the US.)
This is speaking from a writer's perspective, your mileage may vary. I used to ask my younger colleague to help with first drafts. Now it may be faster to just use ChatGPT. So how could they grow to become an editor?
The WGA was very smart to make AI writing a wedge issue. The technology isn't quite there yet, but it will be very soon, and by then it would have been too late to assert their rights.
Embracing AI and automation as tools can actually enhance your skill set and help you create more impactful work. If you're in a creative field, this technology can elevate your projects to a whole new level.
I'd love to see how well this argument goes over with people already being negatively affected by AI. For how great this tech is supposed to be, it somehow only attracts the worst people to defend it. Funny, that!
So... first things first. I'm a happy Midjourney user and post quite a bit of stuff over at one of the other Lemmy communities (same name, different account). But I only use the AI for fun and never for profit. I can give tons of justifications, but in the end it comes down to this: I'm a crappy artist and I have a vivid imagination. AI gives me an outlet to visualize the things in my head, and the rush of seeing them for real is really nice.
That being said. One of the things I don't do, is write prompts like "in the style of ....". Specifically because I don't want it to be a copy of someone's work, even if it is for personal use. It feels (and obviously is) wrong.
Maybe not a perfect solution, but they should remove all the artist names (those alive or less than 50(?) years dead) from the current models. If your name isn't in it, then it'll be a lot harder to recreate your style.
In the longer run, a register of what prompt and which model were used for AI generated images might help with copyright claims? The EU is already busy with legislation for registering AI models. This might be a logical follow-up?
I'm just throwing out ideas at this point. I'm not an expert in any of these fields (AI, legal, copyright, etc.) All I know is that it would definitely be a net loss for society if small artists are no longer able to make a living practicing their profession.
If your name isn't in it, then it'll be a lot harder to recreate your style.
Harder, but not impossible. There are already prompt dictionaries out there, and if you check some mobile apps that offer AI art generation, you can see they offer "styles" that clearly append behind-the-scenes settings to the prompt. Some also ship prompt dictionaries directly.
Midjourney is also essentially a hosted service built on the same diffusion technology; you can run Stable Diffusion on your own hardware, with whatever LoRA modifiers you want, including one "in the style of [...]".
If AI generated art is a close derivative of another work, then copyright already applies.
But when it comes to vague abstractions over multiple works that isn't like any one of them, copyright is probably not the right fix for what is fundamentally a more general problem. Copyright has never covered that sort of thing, so you would be asking for an unprecedented expansion to copyright, and that would have immense negative consequences that would do more harm than good.
There are two ways I could see copyright being extended (both of which are a bad idea, as I'll explain).
Option 1 would be to take a 'colour of bits' approach (borrowing the terminology from https://ansuz.sooke.bc.ca/entry/23). The analogy of 'bits' having a colour, rather than just being a 0 or 1, has been used to explain how to be conservative about ensuring something couldn't possibly be a copyright violation: if a bit coloured with copyright is used to compute another bit in any way (even in combination with an untainted bit), then the resulting bit is itself coloured with that copyright. The colour of bits is not how copyright law currently works; it is a deliberately over-conservative heuristic for avoiding copyright violation. Theoretically, the laws around copyright and computing could change to make the colour-of-bits approach the law. Taken strictly, this approach would mean that virtually all the commercial LLMs and Stable Diffusion models are coloured with the copyrights of all the inputs that went into them, and any output from the models would be similarly coloured (and hence, in practice, impossible to use legally).
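The colour-of-bits heuristic can be sketched as a toy taint-tracking class. This is purely illustrative; the class and field names here are my own invention, not any real legal or technical standard:

```python
# Toy illustration of the 'colour of bits' heuristic: any value computed
# from a copyrighted value inherits that value's colours. A sketch only;
# the names are invented for illustration.

class Bit:
    def __init__(self, value, colours=frozenset()):
        self.value = value                  # the actual 0 or 1
        self.colours = frozenset(colours)   # copyright "colours" attached to it

    def combine(self, other, op):
        # The output of ANY operation is tainted with the union of the
        # inputs' colours, even if one input was uncoloured.
        return Bit(op(self.value, other.value), self.colours | other.colours)

public = Bit(1)                  # an uncoloured bit
book = Bit(0, {"Author A"})      # a bit derived from a copyrighted book

mixed = public.combine(book, lambda a, b: a ^ b)
print(mixed.value, sorted(mixed.colours))  # 1 ['Author A']
```

The point of the sketch is that the taint only ever accumulates: once a coloured bit enters any computation, no later operation can wash the colour out, which is exactly why a model trained on millions of copyrighted works would end up coloured by all of them.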
There are two major problems with this. Firstly, AI models are essentially a rudimentary simulation of human thinking (neural networks are, after all, inspired by animal neurons). Applying the same rule to humans would mean that if you've ever read a copyrighted book, everything you ever say, write, draw or otherwise create afterwards belongs to the author of that book. Applying a different rule to computers than to humans would essentially rule out ever automating many things that humans can legally do - an anti-tech agenda. Limiting technology solely for the benefit of some people now seems short-sighted. Remember, people once made their livelihoods cutting ice from frozen lakes and distributing it on ships for people to keep their food cold. They made their livelihoods lighting gas lamps around cities at dusk and extinguishing them at dawn. Society could have banned compressors in refrigerators and electric lighting to preserve those livelihoods, but instead society advanced, everyone's lives got better, and people found new livelihoods. So a colour-of-bits approach either applies to humans, and becomes an unworkable mess where every author you've ever read basically owns all your work, or it amounts to banning automation in cases where humans can legally do something.
The second problem with the colour-of-bits approach is that it would undermine a lot of things we have already been doing for decades. Classifiers, for example, are often trained on copyrighted inputs and make decisions about what category something is in. Most email clients let you flag a message as spam, and use those flagged messages to decide whether future messages are spam. A colour-of-bits approach would mean the model that makes that decision is coloured with the copyright of whoever wrote the spam - and even the Yes/No decision itself is coloured, so you'd need their permission to rely on it. The same goes for models that detect abuse, child pornography or terrorist material on sites that accept user-generated content. Many more models that are incredibly important to day-to-day life would be impacted the same way, so it would be incredibly disruptive to tech and life as we know it.
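To make the spam-filter point concrete, here is a minimal bag-of-words classifier sketch (a toy with a hand-made training set, not any real mail client's implementation): every number in the model, and therefore the Yes/No verdict itself, is computed directly from the text of the training messages - exactly the data a colour-of-bits rule would taint.

```python
# Toy spam filter: scores a message by how often its words appear in a
# tiny "spam" corpus vs a tiny "ham" corpus. Both corpora stand in for
# copyrighted training messages; every count below derives from them.
from collections import Counter

spam = ["win money now", "free money offer"]
ham = ["meeting at noon", "lunch at noon tomorrow"]

spam_words = Counter(w for msg in spam for w in msg.split())
ham_words = Counter(w for msg in ham for w in msg.split())

def is_spam(message):
    # The verdict is a pure function of counts taken from the training
    # texts - under a colour-of-bits rule, it would carry their colours.
    s = sum(spam_words[w] for w in message.split())
    h = sum(ham_words[w] for w in message.split())
    return s > h

print(is_spam("free money"))    # True
print(is_spam("noon meeting"))  # False
```

Under a strict colour-of-bits regime, even that final boolean would be a derivative work of whoever wrote "win money now".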
Another approach to extending copyright, also ill-advised, would be to protect more general elements like 'style', so that a style could be copyrighted even if another image doesn't look the same. If this were broadened very far, it would probably just lead to constant battles between artists (or, more likely, studios trying to shut artists down), and it is quite likely that no artist could ever publish anything without a high risk of being sued.
Copyright is probably not a viable solution here, so what is? As we move towards a 'post-scarcity' economy, with things automated to the extent that we don't need many humans working to produce an adequate quality of life for everyone, the best solution is a Universal Basic Income (UBI). Everyone generating profit in the future is almost certainly building on work from me, you, and nearly every person alive today (or their ancestors). Rather than some insanely complex and unworkable computation of who contributed the most, just tax all profit and pay a basic income to everyone. Then artists (and everyone else) can focus on meaning rather than profit, knowing the UBI will keep paying them no matter what, can contribute back to the commons, and copyright as a concept can essentially be retired.