25 arrested in global hit against AI-generated child sexual abuse material
-
But to know if it's accurate, someone has to view and compare...
It doesn't matter if it's accurate or not as long as pedos can get off to it, so just keep going until they can. According to our definition of what a pedophile is, though, it would likely be accurate.
-
It doesn't matter if it's accurate or not as long as pedos can get off to it, so just keep going until they can. According to our definition of what a pedophile is, though, it would likely be accurate.
But if it's not accurate, will pedos jerk off to it?
-
Okay, and those drawings are my problem.
https://www.icenews.is/2010/07/28/unsavoury-cartoon-ruling-sparks-debate-in-sweden/
It's not clear cut that those are okay.
"Okay" in what sense? If you mean morally, then I think that's pretty clear cut. If you mean legally, then that's just a technicality.
-
But if it's not accurate, will pedos jerk off to it?
Probably not, but that's irrelevant. The point is that no one needs to harm a child to find out if the output is sufficiently arousing.
-
"Okay" in what sense? If you mean morally, then I think that's pretty clear cut. If you mean legally, then that's just a technicality.
totally ethical thousands of photos of drawings of children in sexual contexts
Legality is just a technicality
Okay there bud.
-
Probably not, but that's irrelevant. The point is that no one needs to harm a child to find out if the output is sufficiently arousing.
But how does it get more authentic without actual input of what's accurate?
It's not enough to tell an AI that something's wrong. You also have to tell it what was right.
-
I think it's pretty stupid. Borders on Thought Crime kind of stuff.
I'd rather see that kind of enforcement and effort go towards actually finding people who are harming children.
There’s a few in the White House.
-
totally ethical thousands of photos of drawings of children in sexual contexts
Legality is just a technicality
Okay there bud.
Why would "thousands of photos of drawings of children in sexual contexts" be unethical?
-
But how does it get more authentic without actual input of what's accurate?
It's not enough to tell an AI that something's wrong. You also have to tell it what was right.
It doesn't need to get more authentic, it just needs to get more arousing, and we have a perfectly ethical way to measure that. You tell the AI it was "right" if the pedos you show it to get aroused.
-
Why would "thousands of photos of drawings of children in sexual contexts" be unethical?
Because they're barely legal in certain places?
-
Because they're barely legal in certain places?
Plenty of moral things are illegal or barely legal in certain places. For example, homosexual adults having consensual sex with each other in their own home. I assume you don't think that's unethical or immoral?
-
Plenty of moral things are illegal or barely legal in certain places. For example, homosexual adults having consensual sex with each other in their own home. I assume you don't think that's unethical or immoral?
I'm not saying legality is ethical.
I'm saying there's no practical way to assemble that much material without exploitation at some level.
-
I'm afraid Europol is shooting themselves in the foot here.
What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.
Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as continuous investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of that is needed anyway, it's easier to escalate, and that's dangerous.
As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.
This relies on the idea that an "outlet" is not harmful. It might even be encouraging, but who do you think would ever study this to help us know? Can you imagine the scientists who'd have to lead studies like this? An incredibly grim and difficult subject, with a high likelihood that no one would listen to you anyway.
-
I actually do not agree with them being arrested.
While I recognize the issue of identification posed in the article, I hold a strong opinion it should be tackled in another way.
AI-generated CSAM might be a powerful tool to reduce demand for the content featuring real children. If we leave it legal to watch and produce, and keep the actual materials illegal, we can make more pedophiles turn to what is less harmful and impactful - a computer-generated image that was produced with no children being harmed.
By introducing actions against AI-generated materials, they make such materials as illegal as the real thing, and there's one less reason for an interested party not to go to a CSAM site and watch actual children getting abused, perpetuating the cycle and leading to more real-world victims.
Nah, the argument that this could fuel a real "pedophile culture" and even encourage real activities is really not that far-fetched and could even be true. Without very convincing studies, do you take a chance where real kids could soon suffer? And I mean the studies would have to be really convincing.
-
If an underage AI character, is portrayed in say a movie or games, is that wrong? Seems like a very slippery slope.
There have been controversies about that sort of thing.
The Oscar-winning movie The Tin Drum is one example I know of. The book by Günter Grass is a very serious, highly celebrated piece of German post-war literature. It takes place around WW2. The protagonist has the mind of an adult in the body of a child. I guess the idea is that he's the opposite of most people?
The movie was banned in Ontario and Oklahoma, for a time. https://en.wikipedia.org/wiki/The_Tin_Drum_(film)#Censorship
With European societies shifting right, I doubt such a movie could be made today, but we aren't at a point where it would be outright illegal.
-
There have been controversies about that sort of thing.
The Oscar-winning movie The Tin Drum is one example I know of. The book by Günter Grass is a very serious, highly celebrated piece of German post-war literature. It takes place around WW2. The protagonist has the mind of an adult in the body of a child. I guess the idea is that he's the opposite of most people?
The movie was banned in Ontario and Oklahoma, for a time. https://en.wikipedia.org/wiki/The_Tin_Drum_(film)#Censorship
With European societies shifting right, I doubt such a movie could be made today, but we aren't at a point where it would be outright illegal.
Good to have data points as reference points to at least guide the discussion.
-
This relies on the idea that an "outlet" is not harmful. It might even be encouraging, but who do you think would ever study this to help us know? Can you imagine the scientists who'd have to lead studies like this? An incredibly grim and difficult subject, with a high likelihood that no one would listen to you anyway.
IIRC there was actually a study and pedos with access to synthetic CSAM were less likely to victimize real children.
-
You can download the models and compile them yourself, that will be as effective as the US government was at banning encryption.
-
Nah, the argument that this could fuel a real "pedophile culture" and even encourage real activities is really not that far-fetched and could even be true. Without very convincing studies, do you take a chance where real kids could soon suffer? And I mean the studies would have to be really convincing.
The thing is, banning is also a consequential action.
And based on what we know about similar behaviors, having an outlet is likely to be good.
Here, the EU takes an approach of "banning just in case" while also ignoring the potential implications of such bans.
-
I'm afraid Europol is shooting themselves in the foot here.
What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.
Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as continuous investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of that is needed anyway, it's easier to escalate, and that's dangerous.
As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.
What would stop someone from creating a tool that tagged real images as AI generated?
Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.