25 arrested in global hit against AI-generated child sexual abuse material
-
How can it be made ethically?
Let's say you manually edit a bunch of legal pictures and feed that into a model to generate new images. Or maybe you pull some legal images from other regions (e.g. topless children), and label some young-looking adults as children for the rest.
I don't know, I'm not an expert. But just because I don't know of something doesn't mean it doesn't exist, it means I need to consult experts.
It can’t.
Then prove it. That's how things are done in courts of law. Each side provides experts to try to convince the judge/jury that something did or did not happen.
My point is merely that an image that looks like CSAM is only CSAM if it actually involves abuse of a child. It's not CSAM if it's generated some other way, such as hand-drawing (e.g. hentai) or a model that doesn't use CSAM in its training data.
-
You can't prove a negative. That's not how proving things works.
You also assume legal images exist. But that's limited by what's actually legal globally. What if someone wants a 5-year-old? How are there legal photos of that?
You assume it can, prove that it can.
-
You can’t prove a negative
You can show how existing solutions work and demonstrate that the solution used works like those other solutions. That takes a lot more work than "see, it looks like a child therefore it's CSAM," but it's necessary to protect innocent people.
You assume it can, prove that it can.
That's guilty until proven innocent. There's a reason courts operate on the assumption of innocence and force the prosecution to prove guilt. I am not interested in reversing that.
-
You'd better believe that when the cops come knocking, the burden of proving you acted ethically is wholly on you.
All existing solutions are based on real-life images. There's no ethical way to acquire thousands upon thousands of images of naked children to produce anything resembling real.
That's how existing solutions work.
So again, how can it be done ethically?
-
I haven't read any of this research because, like, the only feelings I have about pedophiles are outright contempt and a small amount of pity for the whole fucking destructive evilness of it all, but I've been told having access to drawings and images and whatnot makes people more likely to act on their impulses.
And like. I don't think images of CSAM in any form, no matter how far removed they are from real people, actually contribute anything worthwhile at all to the world, so like. I dunno.
Really couldn't give two squirts of piss about anything that makes a pedophile's life harder. Human garbage.
-
when the cops come knocking
When the cops come knocking, your best bet is to comply under duress (be clear that it's under duress). Fighting the police will just add more charges, the right place to fight is in the courts. If your country's justice system is corrupt, then I guess you might as well fight the police, but in most developed countries, the courts are much more reasonable than the police.
how can it be done ethically?
The burden of proof is on showing that it was done unethically, not that it was done ethically. Force the prosecution to actually do their job, don't just assume someone is guilty because the thing they made looks illegal.
-
It's all part of 'manufacturing consent'.
There's plenty of material out in academia about it (as always, check your sources), if you want to get into the weeds.
-
people are fucking weird. especially when it comes to porn.
-
As with most of modern AI, it's able to train without much human intervention.
My point is, even if the results are not perfectly accurate and don't resemble a child's body, they work. They are widely used, in fact, so widely that Europol made a giant issue out of it. People get off to whatever it manages to produce, and that's what matters.
I don't care about how accurate it is, because I'm not the one who consumes this content. I care about how effective it is at curbing worse desires in pedophiles, because I care about the safety of children.
-
It's strange to me that it is referred to as CSAM. No people are involved, so no one is being sexually assaulted. It's creepy, but calling it that implies a drawing is a person to me.
-
How can it be trained to produce something without human input?
It wasn’t trained to produce every specific image it produces. That would make it pointless. It “learns” concepts and then applies them.
No one trained AI on material of Donald Trump sucking on feet, but it can still generate it.
-
As an advocate for the online and offline safety of children, I did read into the research. None of the research I've found confirms with any sort of evidence that AI-generated CSAM increases the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with statements, but for the time being, we can rely on the studies of other forms of illegal behavior and the effects of their decriminalization, which paint a fairly positive picture. Generally, people will tend to opt for what is legal and more readily accessible, and we can make AI CSAM into exactly that.
For now, people are criminalized for a zero-evidence-it's-even-bad crime, while I tend to look quite positively on what it can bring to the table instead.
Also, pedophiles are not human trash, and this line of thinking is also harmful, making more of them hide and never get adequate help from a therapist, increasing their chances of offending. Which, well, harms children.
They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot do anything about it in itself, and can only figure out what to do with it going forward. You could be one, I could be one. What matters is the decisions they make based on their sexuality. The correct way is celibacy and refusal of any sources of direct harm towards children, including the consumption of real CSAM. This might be hard on many, and to aid them, we can provide fictional materials so they could let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even turn to actually abusing children IRL.
-
It was able to produce that because enough images of both feet and Donald Trump exist.
How would it know what young genitals look like?
-
That's a directive, not a regulation, and the directive calling anyone under 18 a child does not mean that everyone under 18 is treated the same way in actually applicable law, which directives very much aren't. Germany, for example, splits the whole thing into under 14 and 14-18.
We certainly don't arrest youth for sending each other nudes:
(4) Subsection (1) no. 3, also in conjunction with subsection (5), and subsection (3) do not apply to acts by persons relating to such youth pornographic content which they have produced exclusively for their personal use with the consent of the persons depicted.
...their own nudes, that is. Not that of classmates or whatnot.
-
You could probably make some semi-realistic drawings and feed those in, and then re-train the model with those same images over and over until the model is biased to use the child-like properties of the drawings but the realism of the adult pictures. You could also feed the most CP-looking images from a partially trained model as the training data of another model, which over time would make the outputs approach the desired result.
-
If you train a model on 1,000,000 images of dogs and 1,000,000 images of cats, your output isn't going to be a 50/50 split of purely dogs and purely cats, it's going to be (on average) somewhere between a cat and a dog. At no point did you have to feed in pictures of dog-cat hybrids to end up with that model.
-
Yes but you start with the basics of a cat and a dog. So you start with adult genitals and.......
-
But to know if it's accurate, someone has to view and compare....
-
Non-pornographic pictures of children and/or human-made pornographic drawings of children.
-
Okay, and those drawings are my problem.
https://www.icenews.is/2010/07/28/unsavoury-cartoon-ruling-sparks-debate-in-sweden/
It's not clear cut that those are okay.