First off, I am sex positive, pro porn, pro sex work; I don't believe sex work should be shameful, and there is nothing **inherently** wrong with buying intimacy from a willing seller.
-
We could be sure of it if AI curated its inputs, which really isn't too much to ask.
-
Yeah, I'm ok with that.
AI doesn't create, it modifies. You might argue that humans are the same, but I think that'd be a dismal view of human creativity. But then we're getting weirdly philosophical.
-
It just means I don't use AI to create porn. I figure that's as good as it gets.
-
Consensual training data makes it ok. I think AI companies should be accountable for curating inputs.
Any art is ok as long as the artist consents. Even if they're drawing horrible things, it's just a drawing.
Now the real question is: should we include depictions of the rape of people who have died and have no family? Because then you can't even argue increased suffering of the victim.
But maybe this just gets solved by curation and the "don't be a dick" rule. Because the above sounds kinda dickish.
-
-
what is the law’s position on AI-generated child porn?
Pretend (fictional) underage porn is illegal in the EU and some other countries. I believe that in the US it is protected by the First Amendment.
Mind that when people talk about child porn or CSAM, that means anything underage as far as politics is concerned. When two 17-year-olds exchange nude selfies, that is child porn. There have been some publicized cases of teens in the US being convicted as pedophile sex offenders for sexting.
-
I've found that a lot of things on the Internet went wrong because they were ad-supported and "free". Porn is one of them.
There is ethically produced porn out there, but you're going to have to pay for it. Incidentally, it also tends to be better porn overall. The versions of the videos they put up on tube sites are usually cut down, and are only part of their complete library. Up through 2012 or so, the tube sites were mostly pirated content, but then they came to an agreement with the mainstream porn industry. Now it's mostly the studios putting up their own content (plus independent, verified creators), and anything pirated gets taken down fast.
Anyway, sites like Crash Pad Series, Erika Lust, Vanessa Cliff, and Adulttime (the most mainstream of this list) are worth a subscription fee.
-
I guess the point is that this enables the mass production of revenge porn and the like, which makes it much harder to punish and prevent distribution. When relatively few sources produce the unwanted product, punishing only the distribution might be a viable method. But when the production method becomes available to the masses, the only feasible control mechanism is to regulate the production method itself. It is all a matter of where the most efficient place to put the bottleneck is.
-
I think that's a fair point, and I wonder how this will affect freedom of expression on the internet. If you can't find the distributor, then it'll be really tough to get a handle on this.
On the other hand, the sheer overabundance could simply destroy the entire value of revenge porn, as in a "nothing is real anyway, so it doesn't matter" sort of thing, which I hope would be the case. No one will be watching revenge porn because they can generate any porn they want in a heartbeat. That's the ideal scenario, anyway.
-
It is indeed a complicated problem with many intertwined variables; I wouldn't wanna be in the shoes of policy makers (assuming they actually are searching for an honest solution and not trying to turn this into profit lol). For instance, too much regulation in a field like this would essentially kill high-quality open source AI tools and make most of them proprietary software, leaving the field at the mercy of tech monopolies. This is probably what these monopolies want, and they will surely try to push things this way to kill competition (talk about capitalism spurring competition and innovation!). They might even don the cloak of some of these bad actors to speed up the process. Given the possible application range of AI, this is probably even more dangerous than flooding the internet with revenge porn.
100% freedom, with no regulations, will essentially lead to a mix of creative and possibly groundbreaking uses of the tech vs. many bad actors using it for things like scamming, disinformation, etc. How it will balance out in the long run is probably very hard to predict.
I think two things are clear: 1) neither extreme is ideal, 2) between the two extremes, 100% freedom is still the better option (the regulated extreme just trades many small bad actors for a couple of giant ones and chokes off any possible good outcomes).
Based on these, starting with a solution closer to the "freedom" end and improving it step by step based on results is probably the most sensible approach.
-
I think the concern is that although it's victimless, if it's legal it could... normalise (within certain circles) the practice. This might make users more confident to do something that does create a victim.
Additionally, how do you tell if it's real or generated? If AI does get better, how do you tell?
-
Therapy is well and good and I think we need far more avenues available for people to get help (for all issues). That said, sexuality and attraction are complicated.
Let me start by saying I am not trying to claim a 1:1 equivalence; this is just a comparison. But we have long abandoned conversion therapy for homosexuals because we've found these preferences are core to a person and not easily overwritten. The same is true for me as a straight person: I don't think therapy would help me find men attractive. I have to imagine the same is true for pedophiles.
The question is, if AI can produce pornography that can satisfy the urges of someone with pedophilia without harming any minors, is that a net positive? Remember the attraction is not the crime, it's the actions that harm others that are. Therapy should always be on the table.
This is a tricky subject because we don't want to become thought police, so all our laws are built in that manner. However, there are big exceptions for sexual crimes due to the gravity of their impact on society. It's very hard to "stand up" for pedophilia because, if acted upon, it has monstrous effects, but AI is making us open a can of worms that I don't believe we ever really thought through beyond criminalizing and demonizing (which, it could be argued, was the correct approach with the technology of the time).
-
I really don't know enough about the subject or how that therapy works. I doubt that it is conversion therapy, but I really don't know. I would assume it involves handling childhood trauma, medications, etc.
Whether therapy helps, and whether satisfying urges through AI-generated content helps, are both questions that should be answered scientifically. If there is research, that should be the basis for whatever decisions are taken; if there is a lack of research, then more research should be the next step.
-
If the pictures are not, forced therapy is probably the best option.
This is true, but it really depends on how "therapy" is defined. And forced therapy could mean anything, including things like the old conversion therapy approach used on gay people.
You might argue that these days we wouldn't do anything so barbaric, but considering that the nature of a pedophile's attraction is deeply unsavory and, unlike homosexuality, will never be acceptable, people would be far more willing to abuse or exploit said "therapy".
-
Basically every pedo in prison is one who isn't abusing kids. Every pedo on a list is one who won't be left alone with a young family member. Reducing AI CP, by itself, doesn't actually do anything.
-
I believe that in the US it is protected by the First Amendment.
CSAM, artificial or not, is illegal in the United States.
-
That said, sexuality and attraction are complicated.
There's nothing particularly complicated about it not being ok to rape kids, or to distribute depictions of kids being raped.
-
i have no problem with ai porn assuming it’s not based on any real identities
With any model currently in use, that condition is impossible to meet. All models are trained on real images.
-
With this logic, any output of any pic gen AI is abuse
Yes?
-
No adult conversation required, just a quick "looks like we don't get internet privacy after all, everyone." And more erosion of civil liberties. Again.