GenAI website goes dark after explicit fakes exposed
-
Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.
So, regardless of direct harm or not, harm is done at some point in the process and it needs to be stopped before it slips and gets worse because people "get used to" it.
I wouldn't think it needs to have child porn in the training data to be able to generate it. It has porn in its data, it knows what kids look like, and it merges the two. I think that works for anything the AI knows about: make this resemble that.
-
Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.
So, regardless of direct harm or not, harm is done at some point in the process and it needs to be stopped before it slips and gets worse because people "get used to" it.
AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.
This is the same reason it can do something like Godzilla with Sailor Moon's hair, not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.
-
AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.
This is the same reason it can do something like Godzilla with Sailor Moon's hair, not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.
Fair enough. I still think it shouldn't be allowed though.
-
I wouldn't think it needs to have child porn in the training data to be able to generate it. It has porn in its data, it knows what kids look like, and it merges the two. I think that works for anything the AI knows about: make this resemble that.
That's fair, but I still think it shouldn't be accepted or allowed.
-
Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.
So, regardless of direct harm or not, harm is done at some point in the process and it needs to be stopped before it slips and gets worse because people "get used to" it.
needs to be stopped before it slips and gets worse because people "get used to" it.
Ah, right, almost forgot the killer games rhetoric.
-
needs to be stopped before it slips and gets worse because people "get used to" it.
Ah, right, almost forgot the killer games rhetoric.
I also don't agree with the killer games thing, but humans are very adaptable as a species.
Normally that's a good thing, but in a case like this, exposure to something shocking or upsetting can make it less shocking or upsetting over time (obviously not in every case). So, if AI is being used for something like this and is being reported on, isn't it possible that people might slowly get desensitized to it over time?
-
AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.
This is the same reason it can do something like Godzilla with Sailor Moon's hair, not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.
Only the real things are actual humans who have likely not consented to ever being in this database at all, let alone having parts of their likeness used for this horrific shit. There is no moral argument for this garbage.
-
I don't like saying the full phrase, it's a disgusting merger of words that shouldn't exist.
FYI, the currently accepted term is CSAM: child sexual abuse material. The reason "CP" is wrong is that porn implies, or should imply, that there's consent to the sexual act, and children cannot consent.
You are right, it's a disgusting merger exactly because it implies something that's absolutely incorrect and wrong.
-
I also don't agree with the killer games thing, but humans are very adaptable as a species.
Normally that's a good thing, but in a case like this, exposure to something shocking or upsetting can make it less shocking or upsetting over time (obviously not in every case). So, if AI is being used for something like this and is being reported on, isn't it possible that people might slowly get desensitized to it over time?
But what if pedophiles in therapy are less likely to commit a crime if they have access to such material? Even better then, if it can be AI-generated, no?
-
That's fair, but I still think it shouldn't be accepted or allowed.
It seems pretty understandable that companies wouldn't allow it; it's more that if it's illegal (like in some places), then that gets into really sketchy territory imo.
-
Fair enough. I still think it shouldn't be allowed though.
Why? Not pressing but just curious what the logic is
-
The craziest thing to me is that there were movements advocating the creation of CP through AI to help those addicted to it, as it "wasn't real" and there were no victims involved. But no comments regarding how the AI models were trained to generate those images, or the damage that will come when such things get normalized.
It just should never be normalized or exist.
Nuanced take coming, take a breath:
I agree that Child Sexual Abuse is a horrible practice along with all other violence and oppression, sexual or not. But the attraction de facto exists and has done for thousands of years, even through intense taboos. It seems our current strategy of shaming and ignoring it has been ineffective. The definition of insanity being repeating the same thing expecting different results and all that.
Short of eugenics (and from previous trials maybe not even then) we might not be able to get rid of it.
So when do we try other ways of dealing with it?
I'm not saying generative AI is the solution, but I'm pretty sure denying harder isn't it either.
-
I don't like saying the full phrase, it's a disgusting merger of words that shouldn't exist.
Very true, thanks for your sensitivity @dumbass
-
I don't like saying the full phrase, it's a disgusting merger of words that shouldn't exist.
And it's wrong, too. It's not pornography, it's rape.
-
Who actually gets hurt in AI-generated CP? The servers?
All the little girls it learned from.
-
All the little girls it learned from.
Gen AI doesn't take CP content and recreate it. There wouldn't be a point to gen AI if that were the case. It knows what regular porn looks like and what a child looks like, and it generates an image. With those inputs it can create something new and at the same time hurt nobody.
-
Only the real things are actual humans who have likely not consented to ever being in this database at all, let alone having parts of their likeness used for this horrific shit. There is no moral argument for this garbage.
Technically speaking, if you post images of your child on social media, you have consented. If you never uploaded an image of your child online, you never need to worry.
-
Who actually gets hurt in AI-generated CP? The servers?
Are you suggesting that this particular type of CP should be acceptable? (And suddenly "but I used AI" becomes a popular defence.)
-
Nuanced take coming, take a breath:
I agree that Child Sexual Abuse is a horrible practice along with all other violence and oppression, sexual or not. But the attraction de facto exists and has done for thousands of years, even through intense taboos. It seems our current strategy of shaming and ignoring it has been ineffective. The definition of insanity being repeating the same thing expecting different results and all that.
Short of eugenics (and from previous trials maybe not even then) we might not be able to get rid of it.
So when do we try other ways of dealing with it?
I'm not saying generative AI is the solution, but I'm pretty sure denying harder isn't it either.
I've been kind of on the fence about this, but then research found that people who physically or verbally express their anger tend to get angrier or take longer to calm down. I wonder if there could be a similar pattern here, so now I'm hesitating.
-
Technically speaking, if you post images of your child on social media, you have consented. If you never uploaded an image of your child online, you never need to worry.
Social media has been around a long time. It is not reasonable to expect people to think of technology they can’t imagine even existing ten years in the future when “consenting” to use a platform. Legally you are correct. Morally this is obviously terrible. Everything about how terms and conditions are communicated is designed to take advantage of people who won’t or are unable to parse its meaning. Consent needs to be informed.