25 arrested in global hit against AI-generated child sexual abuse material
-
It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn. It doesn't matter if any minor was exploited. That's simply not what these laws are about.
Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It's not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.
17-year-olds who exchange nude selfies are producing child porn. I know there have been convictions in the US; I'm not sure about Europe. I do know that teachers have been prosecuted when minors sought help after their selfies were passed around at school, because the minors sent the images in question to the teacher, and that counts as possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.
Anyway, what I'm saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.
That's a directive, not a regulation, and a directive calling anyone under 18 a child does not mean that everything under 18 is treated the same way in actually applicable law, which directives very much are not. Germany, for example, splits the whole thing into under 14 and 14 to 18.
We certainly don't arrest youth for sending each other nudes:
(4) Subsection (1) no. 3, also in conjunction with subsection (5), and subsection (3) do not apply to acts by persons relating to such youth pornographic content which they have produced exclusively for their personal use with the consent of the persons depicted.
...their own nudes, that is. Not that of classmates or whatnot.
-
It was able to produce that because enough images of both feet and Donald Trump exist.
How would it know what young genitals look like?
You could probably make some semi-realistic drawings and feed those in, and then re-train the model with those same images over and over until the model is biased to use the child-like properties of the drawings but the realism of the adult pictures. You could also feed the most CP-looking images from a partially trained model as the training data of another model, which over time would make the outputs approach the desired result.
-
It was able to produce that because enough images of both feet and Donald Trump exist.
How would it know what young genitals look like?
If you train a model on 1,000,000 images of dogs and 1,000,000 images of cats, your output isn't going to be a 50/50 split of purely dogs and purely cats, it's going to be (on average) somewhere between a cat and a dog. At no point did you have to feed in pictures of dog-cat hybrids to end up with that model.
-
If you train a model on 1,000,000 images of dogs and 1,000,000 images of cats, your output isn't going to be a 50/50 split of purely dogs and purely cats, it's going to be (on average) somewhere between a cat and a dog. At no point did you have to feed in pictures of dog-cat hybrids to end up with that model.
Yes but you start with the basics of a cat and a dog. So you start with adult genitals and.......
-
You could probably make some semi-realistic drawings and feed those in, and then re-train the model with those same images over and over until the model is biased to use the child-like properties of the drawings but the realism of the adult pictures. You could also feed the most CP-looking images from a partially trained model as the training data of another model, which over time would make the outputs approach the desired result.
But to know if it's accurate, someone has to view and compare....
-
Yes but you start with the basics of a cat and a dog. So you start with adult genitals and.......
Non-pornographic pictures of children and/or human-made pornographic drawings of children.
-
Non-pornographic pictures of children and/or human-made pornographic drawings of children.
Okay, and those drawings are my problem.
https://www.icenews.is/2010/07/28/unsavoury-cartoon-ruling-sparks-debate-in-sweden/
It's not clear cut that those are okay.
-
If an underage AI character is portrayed in, say, a movie or a game, is that wrong? Seems like a very slippery slope.
-
But to know if it's accurate, someone has to view and compare....
It doesn't matter if it's accurate or not as long as pedos can get off to it, so just keep going until they can. According to our definition of what a pedophile is, though, it would likely be accurate.
-
It doesn't matter if it's accurate or not as long as pedos can get off to it, so just keep going until they can. According to our definition of what a pedophile is, though, it would likely be accurate.
But if it's not accurate, will pedos jerk off to it?
-
Okay, and those drawings are my problem.
https://www.icenews.is/2010/07/28/unsavoury-cartoon-ruling-sparks-debate-in-sweden/
It's not clear cut that those are okay.
"Okay" in what sense? If you mean morally, then I think that's pretty clear cut. If you mean legally, then that's just a technicality.
-
But if it's not accurate, will pedos jerk off to it?
Probably not, but that's irrelevant. The point is that no one needs to harm a child to find out if the output is sufficiently arousing.
-
"Okay" in what sense? If you mean morally, then I think that's pretty clear cut. If you mean legally, then that's just a technicality.
totally ethical thousands of photos of drawings of children in sexual contexts
Legality is just a technicality
Okay there bud.
-
Probably not, but that's irrelevant. The point is that no one needs to harm a child to find out if the output is sufficiently arousing.
But how does it get more authentic without actual input of what's accurate?
It's not enough to tell an AI that something's wrong. You also have to tell it what's right.
-
I think it's pretty stupid. Borders on Thought Crime kind of stuff.
I'd rather see that kind of enforcement and effort go towards actually finding people who are harming children.
There’s a few in the White House.
-
totally ethical thousands of photos of drawings of children in sexual contexts
Legality is just a technicality
Okay there bud.
Why would "thousands of photos of drawings of children in sexual contexts" be unethical?
-
But how does it get more authentic without actual input of what's accurate?
It's not enough to tell an AI that something's wrong. You also have to tell it what's right.
It doesn't need to get more authentic, it just needs to get more arousing, and we have a perfectly ethical way to measure that. You tell the AI it was "right" if the pedos you show it to get aroused.
-
Why would "thousands of photos of drawings of children in sexual contexts" be unethical?
Because they're barely legal in certain places?
-
Because they're barely legal in certain places?
Plenty of moral things are illegal or barely legal in certain places. For example, homosexual adults having consensual sex with each other in their own home. I assume you don't think that's unethical or immoral?
-
Plenty of moral things are illegal or barely legal in certain places. For example, homosexual adults having consensual sex with each other in their own home. I assume you don't think that's unethical or immoral?
I'm not saying legality is ethical.
I'm saying there's no practical way to assemble that much material without exploitation at some level.