First off, I am sex positive, pro porn, and pro sex work; I don't believe sex work should be shameful, and there is nothing **inherently** wrong with buying intimacy from a willing seller.
-
I mean, there’s another side to this.
Assume you have exacting control of training data. You give it consensual sexual play, including rough play, bdsm play, and cnc play. We are 100% certain the content is consensual in this hypothetical.
Is the output a grey area, even if it seems like real rape?
Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?
Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?
We can build on that further. What if they take the time to animate this scene? Is that a grey area?
When does the above cross into a problem? Is it the AI making something that seems like rape but is built on consensual content? The thought of a person imagining a real rape? The putting of that thought onto a still image? The animating?
Or is it none of them?
We already allow simulated rape in tv and movies. AI simply allows a more graphical portrayal.
-
I think another major point to consider going forward is whether it is problematic that people can generate all sorts of illegal stuff. If it is AI-generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several of these things being legal, but I can't logically argue for making them illegal without a victim.
I've been thinking about this recently too, and I have similar feelings.
I'm just gonna come out and say it without beating around the bush: what is the law's position on AI-generated child porn?
More importantly, what should it be?
It probably goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it didn't?
If we're basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour, or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour.
And to know that, we'd need extensive and extremely controversial studies. Beyond that, even in the event that allowing this stuff to be generated is an overall positive (and I don't know whether it would or wouldn't be), will many politicians actually call for it to be allowed? Seems like the kind of thing that could ruin a political career. Nobody's touching that with a ten-foot pole.
what is the law’s position on AI-generated child porn?
The simplest possible explanation here is that any porn created based on images of children is de facto illegal. If it's trained exclusively on adults and you prompt it for child porn, that's a grey area, probably going to follow the precedent for drawn art rather than for real content.
-
i have no problem with ai porn assuming it's not based on any real identities; if it is, i think that should be considered identity theft or impersonation or something.
Outside of that, it's more complicated, but i don't think it's a net negative. People will still thrive in the porn industry; it's been around since it's been possible, and i don't see why it wouldn't continue.
Identity theft only makes sense for businesses. I can sketch a naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?
-
Identity theft only makes sense for businesses. I can sketch a naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?
revenge porn, simple as. Creating fake revenge porn of real people is still to some degree revenge porn, and i would argue stealing someone's identity/impersonation.
To be clear, your example is a sketch of Johnny Depp; i'm talking about a video of a person that resembles the likeness of another person, where the entire video is manufactured. Those are, fundamentally, two different things.
-
revenge porn, simple as. Creating fake revenge porn of real people is still to some degree revenge porn, and i would argue stealing someone's identity/impersonation.
To be clear, your example is a sketch of Johnny Depp; i'm talking about a video of a person that resembles the likeness of another person, where the entire video is manufactured. Those are, fundamentally, two different things.
Again you're talking about distribution
-
Again you're talking about distribution
sort of. There are arguments that private ownership of these videos is also weird and shitty; however, i think impersonation and identity theft are going to be the two most broadly applicable areas of relevant law here. Otherwise i can see issues cropping up.
Other people do not have any inherent right to your likeness; you should not simply be able to pretend to be someone else. That's considered identity theft/fraud when we do it with legally identifying papers, and it's a similar case here, i think.
-
Am I reading this right? You're for prosecuting people who have broken no laws?
I'll add this: I have sexual fantasies (not involving children) that would be repugnant to me IRL. Should I be in jail for having those fantasies, even though I would never act on them?
This sounds like some Minority Report hellscape society.
Am I reading this right? You’re for prosecuting people who have broken no laws?
No, I'm for making it against the law to simulate pedophile shit, as the net effect is fewer abused kids than if such images were legal.
-
sort of. There are arguments that private ownership of these videos is also weird and shitty; however, i think impersonation and identity theft are going to be the two most broadly applicable areas of relevant law here. Otherwise i can see issues cropping up.
Other people do not have any inherent right to your likeness; you should not simply be able to pretend to be someone else. That's considered identity theft/fraud when we do it with legally identifying papers, and it's a similar case here, i think.
But the thing is, it's not a relevant law here at all, as nothing is being distributed and no one is being harmed. Would you say the same thing if AI were not involved? Sure, it can be creepy and weird and whatnot, but it's not inherently harmful, or at least it's not obvious how it would be.
-
Correct. This quickly approaches thought crime.
What about an AI gen of a violent rape and murder? Shouldn't that also be illegal?
But we have movies that have portrayed that sort of thing graphically for years. Do those then become illegal after the fact?
And we also have movies of children being victimized, so do those likewise become illegal?
We already have studies showing that watching violence does not make one violent, and while some refuse to accept that, it is well-established science.
There is no reason to believe the same isn't true for watching sexual assault. There have been many, many movies that contain such scenes.
But ultimately the issue will become that there is no way to prevent it. The hardware to generate this stuff is already in our pockets. It may not be efficient, but it's possible, and efficiency will increase.
The prompts to generate this stuff are easily shared, and there is no way to stop that without monitoring all communication; even then, I'm sure workarounds would occur.
Prohibition requires society to sacrifice freedoms, and we have to decide what we're willing to sacrifice here, because as we've seen with other prohibitions, once we unleash the law on one, it can be impossible to undo.
OK, watch adult porn, then watch a movie in which women or children are abused. Note how the abuse is in no way sexualized, the exact opposite of porn. It often takes place off screen, and when rape does appear on screen, little to no nudity co-occurs. For children it basically always happens off screen.
Simulated child abuse has been federally illegal in the US for roughly 20 years, and we appear to have very little trouble telling the difference between prosecuting pedos and prosecuting cinema, even whilst we have struggled plenty with sexuality in general.
But ultimately the issue will become that there is no way to prevent it.
This argument works well enough for actual child porn. We certainly don't catch it all but every prosecution takes one more pedo off the streets. The net effect is positive. We don't catch most car thieves either and nobody suggests we legalize car theft.
-
I think another major point to consider going forward is whether it is problematic that people can generate all sorts of illegal stuff. If it is AI-generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several of these things being legal, but I can't logically argue for making them illegal without a victim.
I've been thinking about this recently too, and I have similar feelings.
I'm just gonna come out and say it without beating around the bush: what is the law's position on AI-generated child porn?
More importantly, what should it be?
It probably goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it didn't?
If we're basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour, or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour.
And to know that, we'd need extensive and extremely controversial studies. Beyond that, even in the event that allowing this stuff to be generated is an overall positive (and I don't know whether it would or wouldn't be), will many politicians actually call for it to be allowed? Seems like the kind of thing that could ruin a political career. Nobody's touching that with a ten-foot pole.
what is the law's position on AI-generated child porn?
Already illegal here in the UK https://metro.co.uk/2025/02/02/makers-ai-child-abuse-images-jailed-uk-introduces-world-first-law-22481459/
-
Am I reading this right? You’re for prosecuting people who have broken no laws?
No, I'm for making it against the law to simulate pedophile shit, as the net effect is fewer abused kids than if such images were legal.
Lol, how can you say that so confidently? How would you know that with less AI CP you'd get fewer abused kids? And what is the logic behind it?
Demand doesn't really drop when something is illegal (the same goes for drugs). The only thing you reduce is supply, which just makes the now-illegal thing more valuable (this attracts shady money grabbers who hate regulation, don't give a shit about law enforcement, and therefore do illegal stuff to get money), and you have to pay a shitton of government money maintaining all the prisons.
-
Watching videos of rape doesn't create a new victim. But we consider it additional abuse of an existing victim.
So take that video and modify it a bit. Color correct or something. That's still abuse, right?
So the question is, at what point in modifying the video does it become not abuse? When you can't recognize the person? But I think simply blurring the face wouldn't suffice. So when?
That's the gray area. AI is trained on images of abuse (we know it's in there somewhere). So at what point can we say the modified images are okay because the abused person has been removed enough from the data?
I can't make that call. And because I can't make that call, I can't support the concept.
With this logic, any output of any pic gen AI is abuse. I mean, we can be 100% sure that there is CP in the training data (it would be a very big surprise if not), and all output is a result of all the training data, as far as I understand the statistical behaviour of pic gen AI.
-
With this logic, any output of any pic gen AI is abuse. I mean, we can be 100% sure that there is CP in the training data (it would be a very big surprise if not), and all output is a result of all the training data, as far as I understand the statistical behaviour of pic gen AI.
We could be sure there wasn't if AI companies curated their inputs, which really isn't too much to ask.
-
It's not just AI that can create content like that though. 3d artists have been making victimless rape slop of your vidya waifu for well over a decade now.
Yeah, I'm ok with that.
AI doesn't create, it modifies. You might argue that humans are the same, but I think that'd be a dismal view of human creativity. But then we're getting weirdly philosophical.
-
I see the issue with deciding how much of a crime is enough for it to be okay, and the gray area. I can't make that call either, but I kinda disagree with the black-and-white conclusion. I don't need something to be perfectly ethical; few things are. I do, however, want to act in an ethical manner and strive to be better.
Where do you draw the line?
It sounds like you mean no AI can be used in any case unless all the material has been carefully vetted? I highly doubt there isn't illegal content in most AI models of any size by big tech.
I am not sure where I draw the line. I do want to use AI services, just not for porn.
It just means I don't use AI to create porn. I figure that's as good as it gets.
-
I mean, there’s another side to this.
Assume you have exacting control of training data. You give it consensual sexual play, including rough play, bdsm play, and cnc play. We are 100% certain the content is consensual in this hypothetical.
Is the output a grey area, even if it seems like real rape?
Now another hypothetical. A person closes their eyes and imagines raping someone. “Real” rape. Is that a grey area?
Let’s build on that. Let’s say this person is a talented artist, and they draw out their imagined rape scene, which we are 100% certain is a non-consensual scene imagined by the artist. Is this a grey area?
We can build on that further. What if they take the time to animate this scene? Is that a grey area?
When does the above cross into a problem? Is it the AI making something that seems like rape but is built on consensual content? The thought of a person imagining a real rape? The putting of that thought onto a still image? The animating?
Or is it none of them?
Consensual training data makes it ok. I think AI companies should be accountable for curating inputs.
Any art is ok as long as the artist consents. Even if they're drawing horrible things, it's just a drawing.
Now the real question is, should we include rapes of people who have died and have no family? Because then you can't even argue increased suffering of the victim.
But maybe this just gets solved by curation and the "don't be a dick" rule. Because the above sounds kinda dickish.
-
Identity theft only makes sense for businesses. I can sketch a naked Johnny Depp in my sketchbook and do whatever I want with it, and no one can stop me. Why should an AI tool be any different if distribution is not involved?
-
I think another major point to consider going forward is whether it is problematic that people can generate all sorts of illegal stuff. If it is AI-generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several of these things being legal, but I can't logically argue for making them illegal without a victim.
I've been thinking about this recently too, and I have similar feelings.
I'm just gonna come out and say it without beating around the bush: what is the law's position on AI-generated child porn?
More importantly, what should it be?
It probably goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it didn't?
If we're basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour, or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour.
And to know that, we'd need extensive and extremely controversial studies. Beyond that, even in the event that allowing this stuff to be generated is an overall positive (and I don't know whether it would or wouldn't be), will many politicians actually call for it to be allowed? Seems like the kind of thing that could ruin a political career. Nobody's touching that with a ten-foot pole.
what is the law’s position on AI-generated child porn?
Simulated ("pretend") underage porn is illegal in the EU and some other countries. I believe that in the US it is protected by the First Amendment.
Mind that when people talk about child porn or CSAM, that means anything underage, as far as politics is concerned. When two 17-year-olds exchange nude selfies, that is child porn. There have been some publicized cases of teens in the US being convicted as pedophile sex offenders for sexting.
-
First off, I am sex positive, pro porn, and pro sex work; I don't believe sex work should be shameful, and there is nothing inherently wrong with buying intimacy from a willing seller.
That said, the current state of the industry and the conditions for many professionals raise serious ethical issues, coercion being the biggest.
I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand it might be trained on other people's work and take people's jobs.
I think another major point to consider going forward is whether it is problematic that people can generate all sorts of illegal stuff. If it is AI-generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several of these things being legal, but I can't logically argue for making them illegal without a victim.
I've found that a lot of things on the Internet went wrong because they were ad-supported and "free". Porn is one of them.
There is ethically produced porn out there, but you're going to have to pay for it. Incidentally, it also tends to be better porn overall. The versions of the videos they put up on tube sites are usually cut down, and are only part of their complete library. Up through 2012 or so, the tube sites were mostly pirated content, but then they came to an agreement with the mainstream porn industry. Now it's mostly the studios putting up their own content (plus independent, verified creators), and anything pirated gets taken down fast.
Anyway, sites like Crash Pad Series, Erika Lust, Vanessa Cliff, and Adulttime (the most mainstream of this list) are worth a subscription fee.
-
Again you're talking about distribution
I guess the point is that this enables the mass production of revenge porn and the like, which makes it much harder to punish and prevent distribution. When relatively few sources produce the unwanted product, punishing only the distribution might be a viable method. But when the production method becomes available to the masses, the only feasible control mechanism is to try to regulate the production method itself. It is all a matter of where the most efficient place to put the bottleneck is.