First off, I am sex positive, pro porn, pro sex work, and don't believe sex work should be shameful; there is nothing **inherently** wrong with buying intimacy from a willing seller.
-
Therapy is well and good and I think we need far more avenues available for people to get help (for all issues). That said, sexuality and attraction are complicated.
Let me start by saying I am not claiming a 1:1 equivalence; this is just a comparison. We have long abandoned conversion therapy for homosexuals because we've found these preferences are core to a person and not easily overwritten. The same is true for me as a straight person: I don't think therapy would help me find men attractive. I have to imagine the same is true for pedophiles.
The question is: if AI can produce pornography that satisfies the urges of someone with pedophilia without harming any minors, is that a net positive? Remember, the attraction is not the crime; the actions that harm others are. Therapy should always be on the table.
This is a tricky subject because we don't want to become thought police, so all our laws are built in that manner. However, there are big exceptions for sexual crimes due to the gravity of their impact on society. It's very hard to "stand up" for pedophilia, because if acted upon it has monstrous effects, but AI is forcing open a can of worms that I don't believe we ever really thought through beyond criminalizing and demonizing (which, arguably, was the correct approach given the technology at the time).
That said, sexuality and attraction are complicated.
There's nothing particularly complicated about it not being ok to rape kids, or to distribute depictions of kids being raped.
-
i have no problem with ai porn assuming it's not based on any real identities; if it is, i think that should be considered identity theft or impersonation or something.
Outside of that, it's more complicated, but i don't think it's a net negative. people will still thrive in the porn industry; it's been around for as long as it's been possible, and i don't see why it wouldn't continue.
i have no problem with ai porn assuming it’s not based on any real identities
With any model in use, currently, that is impossible to meet. All models are trained on real images.
-
With this logic, any output of any pic gen AI is abuse. I mean, we can be 100% sure that there is CP in the training data (it would be a very big surprise if not), and all output is a result of all the training data, as far as I understand the statistical behaviour of photo gen AI.
With this logic, any output of any pic gen AI is abuse
Yes?
-
It's so much simpler than that: it can be created now, so it will be. They will use narrative twists to post it on the clearnet, just like they do with anime ("she's really a 1000-year-old vampire," etc.). Creating laws to allow it is simply setting the rules for a phenomenon that is already going to be happening.
The only question is whether politicians will stop mudslinging long enough to have an adult conversation, or whether we will just shove everything into the more obscure parts of the internet and let it police itself.
No adult conversation required, just a quick "looks like we don't get internet privacy after all, everyone." And the erosion of more civil liberties. Again.
-
First off, I am sex positive, pro porn, pro sex work, and don't believe sex work should be shameful; there is nothing inherently wrong with buying intimacy from a willing seller.
That said, the current state of the industry and the conditions for many professionals raise serious ethical issues, coercion being the biggest.
I am torn about AI porn. On one hand it can produce porn without suffering; on the other hand it might be trained on other people's work and take people's jobs.
I think another major point to consider going forward is whether it is problematic if people can generate all sorts of illegal stuff. If it is AI generated it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several things being legal, but I can't logically argue for them being illegal without a victim.
What's illegal in real porn should be illegal in AI porn, since eventually we won't know whether it's AI.
-
With this logic, any output of any pic gen AI is abuse. I mean, we can be 100% sure that there is CP in the training data (it would be a very big surprise if not), and all output is a result of all the training data, as far as I understand the statistical behaviour of photo gen AI.
There is no ethical consumption while living a capitalist way of life.
-
That said, sexuality and attraction are complicated.
There's nothing particularly complicated about it not being ok to rape kids, or to distribute depictions of kids being raped.
Nobody here has at all suggested it's ok to rape kids. I hope you can understand the difference between thinking something and doing something.
-
There is no ethical consumption while living a capitalist way of life.
MLs, always there to say irrelevant things.
-
Basically every pedo in prison is one who isn't abusing kids. Every pedo on a list is one who won't be left alone with a young family member. Reducing AI CP doesn't, by itself, actually do anything.
Wrong. Every pedo in prison is one WHO HAS ALREADY ABUSED A CHILD, whether directly or indirectly. There is an argument to be made, and some studies show, that reaching Minor Attracted People before they cross the line can be effective. Unfortunately, to do this we need to be able to have a logical and civil conversation about the topic, and the current political climate does not allow for that conversation. The consequence is that preventable crimes are not being prevented, and more children are suffering for it in the long run.
-
What's illegal in real porn should be illegal in AI porn, since eventually we won't know whether it's AI.
That's the same as saying we shouldn't be able to make videos with murder in them because there is no way to tell if they're real or not.
-
without a victim
You are wrong.
AI media models have to be trained on real media. The illegal content would mean illegal media, and benefiting from/supporting/profiting from a crime at minimum.
-
what is the law's position on AI-generated child porn?
Already illegal here in the UK https://metro.co.uk/2025/02/02/makers-ai-child-abuse-images-jailed-uk-introduces-world-first-law-22481459/
It's illegal in most of the West already, as creating sexual abuse material of minors is already illegal regardless of the method.
-
this one's a classic.
-
i have no problem with ai porn assuming it’s not based on any real identities
With any model in use, currently, that is impossible to meet. All models are trained on real images.
yes, but if i go to thispersondoesnotexist.com and generate a random person, is it going to resemble the likeness of any given real person closely enough to perceptibly be them?
You are literally using the schizo argument right now: "If an artist creates a piece depicting no specific person, but his understanding of people is based inherently on the facial structures of other people he knows and recognizes, therefore he must be stealing their likeness."
-
Wrong. Every pedo in prison is one WHO HAS ALREADY ABUSED A CHILD, whether directly or indirectly. There is an argument to be made, and some studies show, that reaching Minor Attracted People before they cross the line can be effective. Unfortunately, to do this we need to be able to have a logical and civil conversation about the topic, and the current political climate does not allow for that conversation. The consequence is that preventable crimes are not being prevented, and more children are suffering for it in the long run.
People are locked up all the time for just possessing child porn without having abused anyone. This isn't a bad thing, because they are a danger to society.
-
People are locked up all the time for just possessing child porn without having abused anyone. This isn't a bad thing, because they are a danger to society.
No, they are not locked up because they're a danger to society. They're locked up because possessing CP is indirectly contributing to the abuse of the child involved.
-
I believe in the US it is protected by the first amendment.
CSAM, artificial or not, is illegal in the United States.
I see. I've looked up the details. Obscenity - whatever that means - is not protected by the first amendment. So where the material is obscene, it is still illegal.
https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition
-
With any model in use, currently, that is impossible to meet. All models are trained on real images.
yes but if i go to thispersondoesnotexist.com and generate a random person, is that going to resemble the likeness of any given real person close enough to perceptibly be them?
You are literally using the schizo argument right now: "If an artist creates a piece depicting no specific person, but his understanding of people is based inherently on the facial structures of other people he knows and recognizes, therefore he must be stealing their likeness."
No, the problem is a lack of consent of the person being used.
And now, being used to generate depictions of rape and CSAM.
-
without a victim
You are wrong.
AI media models have to be trained on real media. The illegal content would mean illegal media, and benefiting from/supporting/profiting from a crime at minimum.
Excuse me? I am very offended by your insinuations here. It honestly makes me not want to share my thoughts and opinions at all. I am not in any way interested in this kind of content.
I encourage you to read my other posts in the different threads here and see. I am not an apologist, and I do not condone it either.
I do genuinely believe AI can generate content it is not trained on; that's why I claimed it can generate illegal content without a victim. It can combine stuff from the things it is trained on and end up with something original.
I am interested in learning and discussing the consequences of an emerging and novel technology on society. This is a part of that. Even if it is uncomfortable to discuss.
You made me wish I hadn't.