25 arrested in global hit against AI-generated child sexual abuse material
-
Good to have data points as references to at least guide the discussion.
-
The thing is, banning is also a consequential action.
And based on what we know about similar behaviors, having an outlet is likely to be good.
Here, the EU takes an approach of "banning just in case" while also ignoring the potential implications of such bans.
-
What would stop someone from creating a tool that tagged real images as AI generated?
Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.
-
I hope they don't have access to a cloud computing provider somewhere; otherwise this is going to be a tough thing to enforce without a Great Firewall larger than China's.
It will be hilarious to see them attempt it though.
-
Some form of digital signatures for allowed services?
Sure, it will limit the choice of where to legally generate content, but it should work.
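A minimal sketch of how such a provenance scheme could work: an approved service attaches a signed tag to each image it generates, and anyone can later verify the tag against the image bytes. Everything below is a hypothetical illustration — the key handling and the use of HMAC as a stdlib stand-in for a real asymmetric signature scheme (e.g. Ed25519, where verifiers would only need the service's public key) are assumptions, not a description of any actual service:

```python
import hmac
import hashlib

# Hypothetical: a real deployment would use an asymmetric scheme (e.g.
# Ed25519) so that verifiers only need the service's published public key.
# HMAC is used here purely as a stdlib stand-in for the sign/verify round trip.
SERVICE_KEY = b"example-secret-key-held-by-the-service"

def sign_image(image_bytes: bytes) -> str:
    """Service-side: produce a provenance tag for a generated image."""
    return hmac.new(SERVICE_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, tag: str) -> bool:
    """Verifier-side: check that the tag matches the image bytes."""
    expected = hmac.new(SERVICE_KEY, image_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

image = b"\x89PNG...fake image bytes..."
tag = sign_image(image)
assert verify_image(image, tag)             # untampered image verifies
assert not verify_image(image + b"x", tag)  # any edit invalidates the tag
```

Note the catch raised elsewhere in this thread: a signature only proves which service emitted the bytes, not whether the content is synthetic — someone could just as easily run a real photo through the signing step.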
-
First off, I’ll say this topic is very nuanced. And as sick as any child porn is, I completely agree. This, in my gut, feels like a weird slippery slope that will somehow get used against any AI-generated images, or possibly any AI-generated content. It reminds me of those “online child protection” bills that seem on the surface like not-terrible ideas, but turn out to be horrific, dystopian ideas once you start thinking them through in detail.
-
I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.
-
Open-source models exist and can be forked.
-
In Germany, if 14-to-18-year-olds make nude selfies, nothing happens; if they share them with their intimate partner(s), neither; if someone distributes (that's the key word) the pictures on the schoolyard, then the law gets involved. For under-14-year-olds it works out similarly, except that criminal law won't get involved because children under 14 can't commit crimes; that's all child protective services' jurisdiction, and they will intervene as necessary. The general advice schools give kids is "just don't, it's not worth the possible headache." It's a bullet point in biology (sex ed) and/or social studies (media competency); you'd have to dig into the state curricula.
Not sure where that "majority of cases" figure comes from. It might very well be true, because when nudes leak on the schoolyard you suddenly have a whole school's worth of suspects, many of whom (those who deleted the pictures) will not be followed up on, and another significant portion (those who didn't send them on) might get away with writing an essay in exchange for terminating the proceedings. Yet another reason why you should never rely on police statistics. Ten people in an elevator, one farts, ten suspects.
We do have a general criminal register but it's not public. Employers generally are not allowed to demand certificates of good conduct unless there's very good reason (say, kindergarten teachers) and your neighbours definitely can't.
-
Sounds like some actual common sense was applied to German law. Good to hear.
-
...and then we're back at "someone can take that model and tag real images to appear AI-generated."
You would need a closed-source model run server-side in order to prevent that.
-
Yep, essentially. But that's for the hyperrealistic one.