25 arrested in global hit against AI-generated child sexual abuse material
-
How can it be trained to produce something without human input?
It wasn’t trained to produce every specific image it produces. That would make it pointless. It “learns” concepts and then applies them.
No one trained AI on material of Donald Trump sucking on feet, but it can still generate it.
-
As an advocate for the online and offline safety of children, I did read into the research. None of the research I've found confirms with any sort of evidence that AI-generated CSAM increases the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with statements, but for the time being we can rely on the studies of other forms of illegal behavior and the effects of their decriminalization, which paint a fairly positive picture. Generally, people will tend to opt for what is legal and more readily accessible - and we can make AI CSAM into exactly that.
For now, people are criminalized for a crime with zero evidence that it's even harmful, while I tend to look quite positively on what it can bring to the table instead.
Also, pedophiles are not human trash, and that line of thinking is itself harmful: it makes more of them hide and never get adequate help from a therapist, increasing their chances of offending. Which, well, harms children.
They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot do anything about it in itself, and can only figure out what to do with it going forward. You could be one, I could be one. What matters is the decisions they make based on their sexuality. The correct way is celibacy and refusal of any source of direct harm towards children, including the consumption of real CSAM. This might be hard on many, and to aid them, we can provide fictional materials so they could let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even turn to actually abusing children IRL.
-
It was able to produce that because enough images of both feet and Donald Trump exist.
How would it know what young genitals look like?
-
That's a directive, not a regulation, and the directive calling anyone under 18 a child does not mean that everything under 18 is treated the same way in actually applicable law, which directives very much aren't. Germany, for example, splits the whole thing into under 14 and 14-18.
We certainly don't arrest youth for sending each other nudes:
(4) Subsection (1) no. 3, also in conjunction with subsection (5), and subsection (3) do not apply to acts by persons relating to such youth pornographic content which they have produced exclusively for their personal use with the consent of the persons depicted.
...their own nudes, that is. Not that of classmates or whatnot.
-
You could probably make some semi-realistic drawings and feed those in, and then re-train the model with those same images over and over until the model is biased to use the child-like properties of the drawings but the realism of the adult pictures. You could also feed the most CP-looking images from a partially trained model as the training data of another model, which over time would make the outputs approach the desired result.
-
If you train a model on 1,000,000 images of dogs and 1,000,000 images of cats, your output isn't going to be a 50/50 split of purely dogs and purely cats, it's going to be (on average) somewhere between a cat and a dog. At no point did you have to feed in pictures of dog-cat hybrids to end up with that model.
-
Yes, but you start with the basics of a cat and a dog. So you start with adult genitals and.......
-
But to know if it's accurate, someone has to view and compare....
-
Non-pornographic pictures of children and/or human-made pornographic drawings of children.
-
Okay, and those drawings are my problem.
https://www.icenews.is/2010/07/28/unsavoury-cartoon-ruling-sparks-debate-in-sweden/
It's not clear cut that those are okay.
-
If an underage AI character is portrayed in, say, a movie or a game, is that wrong? Seems like a very slippery slope.
-
It doesn't matter if it's accurate or not as long as pedos can get off to it, so just keep going until they can. According to our definition of what a pedophile is, though, it would likely be accurate.
-
But if it's not accurate, will pedos jerk off to it?
-
"Okay" in what sense? If you mean morally, then I think that's pretty clear cut. If you mean legally, then that's just a technicality.
-
Probably not, but that's irrelevant. The point is that no one needs to harm a child to find out if the output is sufficiently arousing.
-
"Totally ethical": thousands of photos of drawings of children in sexual contexts.
"Legality is just a technicality."
Okay there, bud.
-
But how does it get more authentic without actual input of what's accurate?
It's not enough to tell an AI that something's wrong. You also have to tell it what was right.
-
There’s a few in the White House.
-
Why would "thousands of photos of drawings of children in sexual contexts" be unethical?
-
It doesn't need to get more authentic, it just needs to get more arousing, and we have a perfectly ethical way to measure that. You tell the AI it was "right" if the pedos you show it to get aroused.