25 arrested in global hit against AI-generated child sexual abuse material
-
On one hand, I don't think this kind of thing can be consequence-free (from a practical standpoint). On the other hand... how old were the subjects? You can't look at a person and determine their age, and someone who looks like a child but is actually an adult wouldn't be charged as a child pornographer. The whole reason age limits are set is to give reasonable assurance that the subject is not being exploited or otherwise harmed by the act.
This is a massive grey area, and I just hope sentences are proportional to the crime. I could live with this kind of thing being classified as a misdemeanor, provided the creator didn't use underage subjects to train or influence the output.
-
Ehhhhh...
It also borders on real CSAM
-
> Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.
I get how fucking creepy and downright sickening this all feels, but I'm genuinely surprised that it's illegal or criminal when there are no actual children involved.
It mentions sexual extortion, and that's definitely something that should be illegal; same for spreading AI-generated explicit material of real people without their consent, whether children or adults. But idk about the case mentioned here.
-
It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18, then it's child porn. It doesn't matter whether any minor was exploited; that's simply not what these laws are about.
Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It's not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.
17-year-olds who exchange nude selfies are producing child porn. I know there have been convictions in the US; I'm not sure about Europe. I know that teachers have been prosecuted when minors sought help because their selfies were being passed around at school: the minors sent the images in question to the teacher, and that's possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.
Anyway, what I'm saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.
-
There's not an epidemic of child porn.
There's an epidemic of governments wanting greater surveillance powers over the Internet and it is framed as being used to "fight child porn".
So you're going to hear about every single case and conviction until your perception is that there is an epidemic of child porn.
"You can't possibly oppose these privacy destroying laws, after all you're not on the side of child porn are you?"
-
Paracetamol "borders on" poison, but isn't.
Slippery slope is a logical fallacy, and there are actual consequences here. We need to do better.
-
It's certainly creepy and disgusting.
It also seems like we're half a step away from thought police regulating any thought or expression that those in power don't like.
-
Followed swiftly by Operation Jizzberworld.
-
It would depend on the country. In the UK, even drawn depictions are illegal. I assume they have to at least be realistic, and that stick figures don't count.
-
It sounds like a very iffy thing to police. Since drawn characters don't have an actual age, how do you determine it? By looks? That wouldn't be great.
-
> It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18, then it's child porn.
So an 18-year-old, depicted in the nude, still counts as child porn if they don't look their age? This gives judges and prosecutors too much leeway, and I'd guarantee there are right-wing judges who would charge a 25-year-old because they could plausibly be believed to be 17.
> In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.
Is it, though? I don't know about the penalties in Germany, but in the US a 17-year-old who takes a nude selfie is likely to be put on a sex offender list for life and have their freedom significantly limited. I'm not against penalties, but they should be proportional to the harm. A day in court followed by a fair amount of community service should be enough of an embarrassment to deter them, not jail.
-
Not going to read the article, but I will say that I understand making hyper-realistic fictional CP illegal: if realistic fakes were legal, prosecutors could never prove a given image involved a real child, which would make limiting actual CP impossible.
As long as it's clearly fictional, though, let people get off to whatever imaginary stuff they want to. We might find it disgusting, but there are plenty of sexual genres that most people would find disgusting yet shouldn't be illegal.
-
Imagine having to argue to a jury that a wolf-human hybrid with bright neon fur is underage because it isn’t similar enough to a wolf for dog years to apply.
-
The only way to generate something like that is to train the model on something like that from real images.
-
I don’t think this is actually true. Pretty sure if you feed it naked adults and clothed children it can figure out the rest.
-
I mean, that's the same issue with any AI-generated content. It's all trained on a wide range of real people, so how do you know what's generated isn't depicting an underage person? That's why laws like this are really dangerous.