GenAI website goes dark after explicit fakes exposed
-
No CP should be acceptable. But I argue AI-generated isn't CP.
This is no different than someone cutting out a child's head from a Target catalog, sticking it onto a body in a Playboy magazine, and masturbating to it.
Or someone using Photoshop to paste a kid's head onto a pornographic photo.
It's just a more accessible version of those examples.
At the end of the day, what you do in your own home is your thing. It's not my business what you do. As long as it doesn't hurt/affect anyone, go ahead.
I almost respect you for taking a stance so blatantly against what most people believe.
Almost.
-
Pictures of clothed children and naked adults.
Nobody trained them on what things made out of spaghetti look like, but they can generate them because smushing multiple things together is precisely what they do.
Well, that's somewhat reassuring.
Still reprehensible that it's being used that way, of course.
-
I don't need to. It's just the way gen AI works. It takes images of things it knows and then generates NEW content based on what it thinks you want from your prompts.
If I'm looking for an infant flying an airplane, gen AI knows what a pilot looks like and what a child looks like, and it creates something new.
Also, kids' face data doesn't mean they take the actual face of an actual child and paste it on a body. It might take an eyebrow and a freckle from one kid, use a hairstyle from another, and eyes from someone else.
Lastly, the kids' parents consented when they uploaded images of their kids on social media.
If you think that AI is only trained on legal images, I can't convince you otherwise.
-
When my dad worked for the DoD, he was assigned a laptop for work that had explicit photos of children on it.
-
Do you think I want to consent to an arbitration agreement?
In many countries, mandatory arbitration agreements in a B2C context are invalid. They have no legal power.
Ngl this feels like arguing semantics.
-
AI CP seems like a promising way to destroy demand for the real thing. How many people would risk a prison sentence making or viewing the real thing when they could push a button and have a convincing likeness for free with no children harmed? Flood the market with cheap fakes and makers of the real thing may not find it profitable enough to take the risk.
I think it would boost the market for the real thing more.
It's possible that there are people who would get into AI-generated CP if it were allowed to be advertised on NSFW websites.
And that would lead some to seek out the real thing. I think it's best to condemn it entirely.
-
If you think that AI is only trained on legal images, I can't convince you otherwise.
I mean, you're not giving a very convincing argument.
-
If you think that AI is only trained on legal images, I can't convince you otherwise.
What AI are you talking about? Are you suggesting the commercial models from OpenAI are trained using CP? Or just that there are some models out there that were trained using CP? Because yeah, anyone can create a model at home and train it with whatever. But suggesting that OpenAI has a DB of tagged CP is a different story.
-
Wow, so it's from the Duh region of France? Here I thought it was just sparkling dumbass.
Nope, it's the fully alcoholic dumbass, not that shitty grape juice variety!
-
This, above any other reason, is why I'm most troubled by AI CSAM. I don't care what anyone gets off to if no one is harmed, but the fact that real CSAM could be created and be indistinguishable from AI-created material is a real harm.
And I instinctively ask, who would bother producing it for real when AI is cheap and harmless? But people produce it for reasons other than money and there are places in this world where a child's life is probably less valuable than the electricity used to create images.
I fundamentally think AI should be completely uncensored. Because I think censorship limits and harms uses for it that might otherwise be good. I think if 12 year old me could've had an AI show me where the clitoris is on a girl or what the fuck a hymen looks like, or answer questions about my own body, I think I would've had a lot less confusion and uncertainty in my burgeoning sexuality. Maybe I'd have had less curiosity about what my classmates looked like under their clothes, leading to questionable decisions on my part.
I can find a million arguments why AI shouldn't be censored. Like, do you know ChatGPT can be convinced that describing vaginal and oral sex in romantic fiction is fine, but if it's anal sex, it has a much higher refusal rate? Is that subtle anti-gay encoding in the training data? It also struggles with polyamory when it's two men and a woman, but less when it's two women and a man. What's the long-term impact when these biases are built into everyday tools? These are concerns I consider all the time.
But at the end of the day, the idea that there are children out there being abused and consumed and no one will even look for them because "it's probably just AI" isn't something I can bear no matter how firm my convictions are about uncensored AI. It's something I struggle to reconcile.
Maybe the weird extra fingers and appendage issues in AI images are features, not bugs. Maybe it's a naturally occurring, unintended consequence of the learning and feedback process, sabotaging the output in order to make it obvious the image is fake.
/s (sort of)
-
FYI, the currently accepted term is CSAM: child sexual abuse material. The reason the term CP is wrong is that porn implies, or should imply, that there's consent to the sexual act, and children cannot consent.
You are right, it's a disgusting merger exactly because it implies something that's absolutely incorrect and wrong.
If we're being pedantic, I'm not sure "children cannot consent" is correct. Children at 16 are mature enough to give consent in a legal context; we as a society just frown upon older adults mingling with them.
-
Pictures of clothed children and naked adults.
Nobody trained them on what things made out of spaghetti look like, but they can generate them because smushing multiple things together is precisely what they do.
Given the "we spared no expense" attitude to the rest of the data these things are trained on, I fear that may be wishful thinking...
-
What AI are you talking about? Are you suggesting the commercial models from OpenAI are trained using CP? Or just that there are some models out there that were trained using CP? Because yeah, anyone can create a model at home and train it with whatever. But suggesting that OpenAI has a DB of tagged CP is a different story.
OpenAI just scours the Internet. 100% chance it's come across something illegal and horrible. They don't pre-approve its training data.
-
I mean, you're not giving a very convincing argument.
AI models are trained on the open Internet. Not curated. Open Internet has horrible things.
-
OpenAI just scours the Internet. 100% chance it's come across something illegal and horrible. They don't pre-approve its training data.
But you have to describe it. It doesn't just suck in images at random. I imagine someone will remove CP when the images are reviewed. Or do you think they just download all images and add them to the training set without even looking at them?
-
If we're being pedantic, I'm not sure "children cannot consent" is correct. Children at 16 are mature enough to give consent in a legal context; we as a society just frown upon older adults mingling with them.
Buddy, there's a whole range of ages between 0 and 16, and I use that number 0 advisedly.
-
If we're being pedantic, I'm not sure "children cannot consent" is correct. Children at 16 are mature enough to give consent in a legal context; we as a society just frown upon older adults mingling with them.
When I was 16 I would have totally posed for porn and I would have been completely consenting. But it would have been illegal. I wonder where we should draw the line, and if the current one is the best one.
-
When my dad worked for the DoD, he was assigned a laptop for work that had explicit photos of children on it.
For what purpose would they do that?
-
If we're being pedantic, I'm not sure "children cannot consent" is correct. Children at 16 are mature enough to give consent in a legal context; we as a society just frown upon older adults mingling with them.
Legally speaking, children can't consent, which is why it's illegal and the basis of my statement. I wasn't being pedantic; I was pointing out the newer terminology.
-
AI-generated CP should be made illegal even if its creation did not technically harm anyone. The reason is that it presumably looks so close to real CP that it: 1) normalizes consumption of CP, 2) grows a market for CP, and 3) lets real CP get off the hook by claiming it is AI.
While there are similar reasons to be against clearly unreal CP (e.g. hentai), that type at least does not have problem #3. For example, there doesn't need to be an investigation into whether a picture is real or not.
Fun fact: it's already illegal. If it's indistinguishable from the real thing, it's a crime.