25 arrested in global hit against AI-generated child sexual abuse material
-
Followed swiftly by operation jizzberworld
-
It would depend on the country. In the UK even drawn depictions are illegal. I assume it has to at least be realistic and stick figures don't count.
-
It sounds like a very iffy thing to police. Since drawn stuff doesn't have an actual age, how do you determine it? Looks? That wouldn't be great.
-
It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn.
So a person that is 18 years old, depicted in the nude, is still a child pornographer if they don't look their age? This gives judges and prosecutors too much leeway, and I can guarantee there are right-wing judges that would charge a 25yo because it could be believed they were 17.
In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.
Is it though? I don't know about the penalties in Germany but in the US a 17yo that takes a nude selfie is likely to be put on a sex offender list for life and have their freedom significantly limited. I'm not against penalties, but they should be proportional to the harm. A day in court followed by a fair amount of community service should be enough of an embarrassment to deter them, not jail.
-
Not going to read the article, but I will say that I understand making hyper-realistic fictional CP illegal, because it would make limiting actual CP impossible.
As long as it’s clearly fictional though, let people get off to whatever imaginary stuff they want to. We might find it disgusting, but there are plenty of sexual genres that most people would find disgusting yet shouldn’t be illegal.
-
Imagine having to argue to a jury that a wolf-human hybrid with bright neon fur is underage because it isn’t similar enough to a wolf for dog years to apply.
-
The only way to generate something like that is to teach it something like that from real images.
-
I don’t think this is actually true. Pretty sure if you feed it naked adults and clothed children it can figure out the rest.
-
I mean, that's the same thing with AI generated content. It's all trained on a wide range of real people, so how do you know what's generated isn't depicting an underage person? That's why laws like this are really dangerous.
-
That's not how these image generators work.
How would it know what an age-appropriate penis looks like without, you know, seeing one?
-
Considering style transfer models, you could probably just draw or 3D model the unknown details and feed it that.
-
I totally agree with these guys being arrested. I want to get that out of the way first.
But what crime did they commit? They didn't abuse children...the images are AI generated and the children do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it's flat, but where's the line here? If they draw pictures of non-existent children, is that also a crime?
Does that open artists to the interpretation of the law when it comes to art? Can they be put in prison because they did a professional painting of a child? Like, what if they did a painting of their own child in the bath or something? Sure, the content's questionable, but it's not exactly predatory. And if you add safeguards for these people, could the predators not then just claim artistic expression?
It just seems entirely unenforceable and an entire goddamn can of worms...
-
Again, that's not how image generators work.
You can't just make up some wishful thinking and assume that's how it must work.
It takes thousands upon thousands of unique photos to make an image generator.
Are you going to draw enough child genitalia to train these generators? Are you actually comfortable doing that task?
-
Exactly, which is why I'm against your first line, I don't want them arrested specifically because of artistic expression. I think they're absolutely disgusting and should stop, but they're not harming anyone so they shouldn't go to jail.
In my opinion, you should only go to jail if there's an actual victim. Who exactly is the victim here?
-
> only way
That's just not true.
That said, there's a decent chance that existing models use real images, and that is what we should be fighting against. The user of a model has plausible deniability because there's a good chance they don't understand how they work, but the creators of the model should absolutely know where they're getting the source data from.
Prove that the models use illegal material and go after the model creators for that, because that's an actual crime. Don't go after people using the models who are providing alternatives to abusive material.
-
Exactly. If there's no victim, there's no crime.
-
Exactly. Any time there's subjectivity, it's ripe for abuse.
The law should punish:
- creating images of actual underage people
- creating images of actual non-consenting people of legal age
- knowingly distributing one of the above
Each of those has a clearly identifiable victim. Creating a new work of a fictitious person doesn't have any clearly identifiable victim.
Don't make laws to make prosecution easier, make laws to protect actual people from becoming victims or at least punish those who victimize others.
-
I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.
So could I, but that doesn't make it just. It should only be a crime if someone is actually harmed, or intended to be harmed.
Creating a work about a fictitious individual shouldn't be illegal, regardless of how distasteful the work is.
-
I think all of them are unethical, and any service offering them should be shut down, yes.
I never said prosecute the users.
I said you can't make it ethically, because at some point someone is using or creating original art, and the odds of human exploitation somewhere in the chain are just too high.
-
It obviously depends on where they live and/or committed the crimes. But most countries have broad laws against anything, real or fake, that depicts child sexual abuse.
It's partly because, as technology gets better, it would be easy for offenders to claim anything they've been caught with is AI-created.
It’s also because there’s a belief that AI generated CSAM encourages real child abuse.
I shan’t say whether it does - I tend to believe so but haven’t seen data to prove me right or wrong.
Also, in the end, I think it's simply an ethical position.