I don't think just giving up and allowing deepfake porn of people is really an acceptable answer here. The philosophical discussion of whether it's "actually them" or not doesn't really matter; it's still intrusive, violating, and gross. It's the same reason stealing someone's identity is illegal: it doesn't matter that the identity created by the con man isn't the real you, real damage can still be done.
Maybe there's nothing you can do about it on the dark web, but sites absolutely can moderate deepfakes, in the same way that PornHub will take down non-consensual ex-girlfriend type content.
The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them. We don't have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.
Exactly. In another thread on here recently someone said something that basically boiled down to "your protest against AI isn't going to stop it. There's too much corporate power behind it. So you might as well embrace it" and I just cannot get my head around that mentality.
Also, you can absolutely see the models who were used as references in some of the images generated by apps these days. Like that popular one right now that everyone is using to make idealized images of themselves. A few of my family and friends used it recently, and you could clearly see in some of the pics the A-list celebs who were used as pose references, like Gal Gadot, Scarlett Johansson, etc. It's creepy as hell.
Does it matter if someone jerks off to J.Lo in the Fappening or to some random AI-generated BS?
The issue is that this technology can be used to create pornographic material of anyone, with some level of realism, without their consent. For creators and the average person, this is incredibly harmful. I don't want porn of myself to be made and neither do a lot of creators online.
Not only are these images an affront to people's dignity, but it can also be incredibly harmful for someone to see porn of themselves that they never made, with their face on someone else's body.
This is a matter of human decency and consent. It is not negotiable.
As mentioned by @ram@lemmy.ca, this can also be used for other harmful things like CSAM which is genuinely terrifying.
But it's also a natural evolution of what we've been doing as a species since before we were a species.
Does imagining a different partner while having sex or masturbating count? I suspect most people would say no.
How about if somebody draws a crude stick figure of somebody they met on the street? Unless you're Randall Munroe, this is probably harmless too.
Now a highly skilled portrait artist paints a near replica of somebody he knows, but has never seen in the nude. They never mention their friend by name, but the output is lifelike and unmistakably them.
Maybe a digital artist finds a few social media pictures of a person and decides to test-drive Krita, manipulating them into appearing nude.
Or, and this happened to me quite recently, you find your porn doppelganger. My spouse found mine and it ruined her alone time. And they really did look just like me! Taking that a step further, is it illegal to find somebody's doppelganger and to dress them up so that they look more like their double?
Like you, I don't want people like this in my life. But it feels like this is one of those slippery slopes that turns out to be an actual slippery slope.
You can't make it illegal without some serious downstream effects.
If you did, the servers would just get hosted in an Eastern European country that is happy to lulwat at American warrants.
I don't have any answers, just more Devil's advocate-esque questions. If there were a way to make it illegal without any collateral damage, I'd be proudly behind you leading the charge. I just can't imagine a situation where it wouldn't get abused, à la the DMCA.
You say that NOW, but if people start using your images to generate revenge porn or, you know, really anything you didn't consent to, that's a huge problem.
Both for the people whose images were used to train the model and for the people whose images are generated using the models.
Let's not forget that these AI aren't limited by age. Like fuck am I gonna be out here defending tech that would turn my kid into CSAM. Fucking disgusting.
As someone who personally wouldn't care at all if someone made AI porn of me and masturbated to it, I am incredibly uncomfortable with the idea that someone who doesn't like me may have the option to generate AI porn of me having sex with a child. Now there's fake "proof" I'm a pedophile, and I get my life ruined for sex I never had, for a violation of consent I never actually committed. Even if I'm vindicated in court, I might still be convicted in the court of public opinion.

And people could post faked porn of me and send it to companies to try to say "Evergreen5970 is promiscuous, don't hire them." Not all of us have the luxury of being able to pick and choose between companies depending on whether they match our values; some of us have to take what we can get, and sometimes that would include companies that would judge you for taking nude photos of yourself. It would feel especially bad given I'm a virgin by choice who has never taken nudes, let alone sent them. Punished for something I didn't do.
Not everyone is going to restrict their use to their private wank sessions, to making a real image of the stuff they probably already envision in their imagination. Some will do their best to make its results public with the full intention of using it to do harm.
And once faking abuse with AI porn becomes well-known, it might discredit actual photographic/video proof of CSAM happening. Humans already get fooled about whether an image was taken by a camera or generated by AI, and AI doesn't detect AI-generated images with a perfect accuracy rate either. So the question becomes "how can we trust any image anymore?" Not to mention the ability to generate new CSAM with AI. Some more mainstream AI models might try to tweak algorithms to prevent people from generating any porn involving minors, but there'll probably always be some floating around with those guardrails turned off.
I'm also very wary of dismissing other peoples' discomfort just because I don't share it. I'm still worried for people who would care about someone making AI porn of them even if it was just to masturbate with and kept private.
I worry that the cat is out of the bag on this. The tech for this stuff is out there, and you can run it on your home computer, so barring some sort of massive governmental overreach I don't see a way to stop it.
They can't even stop piracy and there's the full weight of the US copyright industry behind it. How are they going to stop this tech?
The point isn't that it's too late, it's that the hype is overblown. Go to the website this article mentions and follow the instructions (explore, turn on nsfw, type name and nude) and you will VERY QUICKLY realize the technology is just shit.
You can see some resemblances, and sometimes one close one, but like another poster said, they just look like the same shitty fan fiction we've had since Photoshop came about.
Also, this could end porn blackmail, since no one can tell if it's real or not. People will start judging the person who is supplying the material rather than the person in it.
If it ends porn blackmail, it also ends photographic evidence. I think that's significantly worse.
And sure, the tech is bad if you just type a name directly into a model, but if you take the time to refine it, it gets pretty good, and it's only going to get better over time. It's time to start thinking about a future where this tech exists.
This doesn't even feel like an article - more like one long advertisement. The second paragraph of the article launches into a review of the "Erect Horse Penis - Concept LoRA".
Yeah, I picked up quite a few tips for generating good AI porn from this article. I'm still not sure if this was satire or not; it basically gives you step-by-step instructions on how to do this.
The tech isn't there yet. There are so often distracting flaws around the hands/feet. The AI doesn't really know what a human is; it's just endlessly re-combining existing material.
As much as I loathe having to reveal this to you, the shapeliness of the hands should be semi-negligible to most people who would love to have an image created from the statement "I want to see Billie Eilish's boobs".
Just because the AI produces an image with Billie Eilish's face and a naked body does not mean you've seen her nude for real.
It's the exact same as drawing a naked lady and then drawing Billie Eilish's face on it.
If that really gets you off, and that really violates her autonomy in some way, I'd be interested to hear how. It's not currently illegal to draw real people in fictional scenarios, is it?
Yeah, some body parts are a little weird today, but what about tomorrow, next week, next month, next year?
I really haven't given this much attention, but the last time I did, maybe 6-8 months ago, most of the photos had hands that were the stuff of nightmares. Looking at them again today, at least from a quick 10 minutes of looking, they have improved significantly. Yeah, they are still far from perfect, but a handful are very good, most are passable, and a few are still nightmare fuel.