25 arrested in global hit against AI-generated child sexual abuse material
-
Why would "thousands of photos of drawings of children in sexual contexts" be unethical?
-
It doesn't need to get more authentic, it just needs to get more arousing, and we have a perfectly ethical way to measure that. You tell the AI it was "right" if the pedos you show it to get aroused.
-
Because they're barely legal in certain places?
-
Plenty of moral things are illegal or barely legal in certain places. For example, homosexual adults having consensual sex with each other in their own home. I assume you don't think that's unethical or immoral?
-
I'm not saying legality is ethical.
I'm saying there's no practical way to assemble that much material without exploitation at some level.
-
This relies on the idea that an "outlet" is not harmful. It might even be encouraging, but who would ever study this to help us find out? Can you imagine the scientists who'd have to lead studies like this: an incredibly grim and difficult subject, with a high likelihood that no one would listen to you anyway.
-
Nah, the argument that this could feed a real "pedophile culture" and even encourage real abuse is really not that far-fetched, and could even be true. Without very convincing studies, do you take a chance where real kids could suffer? And I mean the studies would have to be really convincing.
-
There have been controversies about that sort of thing.
I know of the Oscar-winning movie The Tin Drum as an example. The book by Günter Grass is a very serious, highly celebrated piece of German post-war literature, set around WW2. The protagonist has the mind of an adult in the body of a child; I guess the idea is that he is the other way around from most people?
The movie was banned in Ontario and Oklahoma, for a time. https://en.wikipedia.org/wiki/The_Tin_Drum_(film)#Censorship
With European societies shifting right, I doubt such a movie could be made today, but we aren't at a point where it would be outright illegal.
-
Good to have data points as reference points to at least guide the discussion.
-
The thing is, banning is also a consequential action.
And based on what we know about similar behaviors, having an outlet is likely to be good.
Here, the EU takes an approach of "banning just in case" while also ignoring the potential implications of such bans.
-
What would stop someone from creating a tool that tagged real images as AI generated?
Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.
-
I hope they don't have access to a cloud computing provider somewhere, otherwise this is going to be a tough thing to enforce without a Great Firewall larger than China's.
It will be hilarious to see them attempt it though.
-
Some form of digital signatures for allowed services?
Sure, it will limit the choice of where to legally generate content, but it should work.
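A minimal sketch of what such a signing scheme could look like, assuming a shared key issued to an approved service (a real deployment would use asymmetric signatures such as Ed25519, so verifiers can't forge tags; all names and the key here are made up for illustration):

```python
import hmac
import hashlib

# Hypothetical shared key issued to an approved generation service.
SERVICE_KEY = b"example-key-issued-to-approved-service"

def sign_content(content: bytes, key: bytes = SERVICE_KEY) -> str:
    """Attach a tag proving the content came from an allowed service."""
    return hmac.new(key, content, hashlib.sha256).hexdigest()

def verify_content(content: bytes, tag: str, key: bytes = SERVICE_KEY) -> bool:
    """Check the tag; compare_digest avoids timing side channels."""
    expected = hmac.new(key, content, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

image = b"...generated image bytes..."
tag = sign_content(image)
assert verify_content(image, tag)            # genuine content verifies
assert not verify_content(b"tampered", tag)  # anything else fails
```

The catch, as the adjacent comments note, is that a signature only proves where content came from, not what it depicts, and anyone outside the scheme simply won't sign at all.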
-
First off, I’ll say this topic is very nuanced. And as sick as any child porn is, I completely agree. This, in my gut, feels like a weird slippery slope that will somehow get used against any AI-generated images, or possibly any AI-generated content. It reminds me of those “online child protection” bills that seem on the surface like not-terrible ideas, but turn out to be horrific, dystopian ideas once you start thinking about them in detail.
-
I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.
-
Open-source models exist and can be forked.
-
In Germany, if 14-to-18-year-olds make nude selfies, nothing happens; if they share them with their intimate partner(s), likewise nothing; but if someone distributes (that's the key word) the pictures on the schoolyard, then the law gets involved. For under-14s it technically works out similarly, except that criminal law won't get involved because children under 14 can't commit crimes; that is all child protective services' jurisdiction, and they will intervene as necessary. The general advice schools give kids is "just don't, it's not worth the possible headache". It's a bullet point in biology (sex ed) and/or social studies (media competency); you'd have to dig into the state curricula.
Not sure where that "majority of cases" claim comes from. It might very well be true, because when nudes leak on the schoolyard you suddenly have a whole school's worth of suspects, many of whom (those who deleted the pictures) will not be followed up on, while another significant portion (those who didn't send them on) might get away with writing an essay in exchange for terminating proceedings. Yet another reason why you should never rely on police statistics. Ten people in an elevator, one farts: ten suspects.
We do have a general criminal register, but it's not public. Employers are generally not allowed to demand certificates of good conduct unless there's a very good reason (say, for kindergarten teachers), and your neighbours definitely can't.