agnos.is Forums

25 arrested in global hit against AI-generated child sexual abuse material

Technology
92 Posts 32 Posters 51 Views
  • D [email protected]

    How can it be made ethically?

    That's my point.

    It can't.

    Some human has to sit and make many, many, many models of genitals to produce an artificial one.

    And that, IMO, is not ethically possible.

    [email protected]
    #42

    How can it be made ethically?

    Let's say you manually edit a bunch of legal pictures and feed that into a model to generate new images. Or maybe you pull some legal images from other regions (e.g. topless children), and label some young-looking adults as children for the rest.

    I don't know, I'm not an expert. But just because I don't know of something doesn't mean it doesn't exist, it means I need to consult experts.

    It can’t.

    Then prove it. That's how things are done in courts of law. Each side provides experts to try to convince the judge/jury that something did or did not happen.

    My point is merely that an image that looks like CSAM is only CSAM if it actually involves abuse of a child. It's not CSAM if it's generated some other way, such as hand-drawing (e.g. hentai) or a model that doesn't use CSAM in its training data.

      [email protected]
      #43

      You can't prove a negative. That's not how proving things works.

      You also assume legal images. But that puts limits on what's actually legal globally. What if someone wants a 5 year old? How are there legal photos of that?

      You assume it can, prove that it can.

        [email protected]
        #44

        You can’t prove a negative

        You can show how existing solutions work and demonstrate that the solution used works like those other solutions. That takes a lot more work than "see, it looks like a child therefore it's CSAM," but it's necessary to protect innocent people.

        You assume it can, prove that it can.

        That's guilty until proven innocent. There's a reason courts operate on the assumption of innocence and force the prosecution to prove guilt. I am not interested in reversing that.

          [email protected]
          #45

          You better believe when the cops come knocking, the burden of proof to be ethical is wholly on you.

          All existing solutions are based on real-life images. There's no ethical way to acquire thousands upon thousands of images of naked children to produce anything resembling the real thing.

          That's how existing solutions work.

          So again, how can it be done ethically?

          • A [email protected]

            I'm afraid Europol is shooting themselves in the foot here.

            What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.

            Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as continuous investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of that is needed anyway, it's easier to escalate, and that's dangerous.

            As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

            [email protected]
            #46

            I haven't read any of this research because, like, the only feelings I have about pedophiles are outright contempt and a small amount of pity for the whole fucking destructive evilness of it all, but I've been told having access to drawings and images and whatnot makes people more likely to act on their impulses.

            And like. I don't think images of CSAM in any form, no matter how far removed they are from real people, actually contribute anything worthwhile at all to the world, so like. I dunno.

            Really couldn't give two squirts of piss about anything that makes a pedophile's life harder. Human garbage.

              [email protected]
              #47

              when the cops come knocking

              When the cops come knocking, your best bet is to comply under duress (be clear that it's under duress). Fighting the police will just add more charges, the right place to fight is in the courts. If your country's justice system is corrupt, then I guess you might as well fight the police, but in most developed countries, the courts are much more reasonable than the police.

              how can it be done ethically?

              The burden of proof is on showing that it was done unethically, not that it was done ethically. Force the prosecution to actually do their job, don't just assume someone is guilty because the thing they made looks illegal.

              • T [email protected]

                Same with misinformation. Where anything they disagree with, in good faith or not, is misinformation.

                [email protected]
                #48

                It's all part of 'manufacturing consent'.

                There's plenty of material out in academia about it (as always, check your sources), if you want to get into the weeds.

                • D [email protected]

                  Okay, but my point still stands.

                  Someone has to make the genitals models to learn from. Some human has to be involved otherwise it wouldn't just exist.

                  And if you're not willing to get your hands dirty and do it, why would anyone else?

                  [email protected]
                  #49

                  people are fucking weird. especially when it comes to porn.

                  • D [email protected]

                    How can it be trained to produce something without human input?

                    To verify its models are indeed correct, some human has to sit and view it.

                    Will that be you?

                    [email protected]
                    #50

                    As with much of modern AI, it's able to train without much human intervention.

                    My point is, even if results are not perfectly accurate and resembling a child's body, they work. They are widely used, in fact, so widely that Europol made a giant issue out of it. People get off to whatever it manages to produce, and that's what matters.

                    I do not care about how accurate it is, because it's not me who consumes this content. I care about how efficient it is at curbing worse desires in pedophiles, because I care about safety of children.

                    • A [email protected]

                      I actually do not agree with them being arrested.

                      While I recognize the issue of identification posed in the article, I hold a strong opinion it should be tackled in another way.

                      AI-generated CSAM might be a powerful tool to reduce demand for the content featuring real children. If we leave it legal to watch and produce, and keep the actual materials illegal, we can make more pedophiles turn to what is less harmful and impactful - a computer-generated image that was produced with no children being harmed.

                      By introducing actions against AI-generated materials, they make such materials as illegal as the real thing, and there's one less reason for an interested party not to go to a CSAM site and watch actual children getting abused, perpetuating the cycle and leading to more real-world victims.

                      [email protected]
                      #51

                      It's strange to me that it is referred to as CSAM. No people are involved, so no one is being sexually assaulted. It's creepy, but calling it that implies to me that a drawing is a person.

                        [email protected]
                        #52

                        How can it be trained to produce something without human input?

                        It wasn’t trained to produce every specific image it produces. That would make it pointless. It “learns” concepts and then applies them.

                        No one trained AI on material of Donald Trump sucking on feet, but it can still generate it.

                          [email protected]
                          #53

                          As an advocate for online and offline safety of children, I did read into the research. None of the research I've found confirms, with any sort of evidence, that AI-generated CSAM increases the risk of other illicit behavior. We need more evidence, and I do recommend exercising caution with statements, but for the time being we can rely on studies of other forms of illegal behavior and the effects of their decriminalization, which paint a fairly positive picture. Generally, people will tend to opt for what is legal and more readily accessible - and we can make AI CSAM into exactly that.

                          For now, people are criminalized for a crime with zero evidence that it's even harmful, while I tend to look quite positively on what it could bring to the table instead.

                          Also, pedophiles are not human trash, and this line of thinking is also harmful, making more of them hide and never get adequate help from a therapist, increasing their chances of offending. Which, well, harms children.

                          They are regular people who, involuntarily, have their sexuality warped in a way that includes children. They never chose it, they cannot do anything about it in itself, and can only figure out what to do with it going forward. You could be one, I could be one. What matters is the decisions they take based on their sexuality. The correct way is celibacy and refusal of any sources of direct harm towards children, including the consumption of real CSAM. This might be hard on many, and to aid them, we can provide fictional materials so they could let off some steam. Otherwise, many are likely to turn to real CSAM as a source of satisfaction, or even to actually abusing children IRL.

                            [email protected]
                            #54

                            It was able to produce that because enough images of both feet and Donald Trump exist.

                            How would it know what young genitals look like?

                            • G [email protected]

                              It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn. It doesn't matter if any minor was exploited. That's simply not what these laws are about.

                              Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It's not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.

                              17 year-olds who exchange nude selfies engage in child porn. I know there have been convictions in the US; not sure about Europe. I know that teachers have been prosecuted when minors sought help when their selfies were being passed around in school, because they sent the images in question to the teacher, and that's possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

                              Anyway, what I'm saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.

                              [email protected]
                              #55

                              That's a directive, not a regulation, and a directive calling anyone under 18 a child does not mean that everything under 18 is treated the same way in actually applicable law, which directives very much aren't. Germany, for example, splits the whole thing into under 14 and 14-18.

                              We certainly don't arrest youth for sending each other nudes:

                              (4) Subsection (1) no. 3, also in conjunction with subsection (5), and subsection (3) do not apply to acts by persons relating to such youth pornographic content which they have produced exclusively for their personal use with the consent of the persons depicted.

                              ...their own nudes, that is. Not that of classmates or whatnot.

                                [email protected]
                                #56

                                You could probably make some semi-realistic drawings and feed those in, and then re-train the model with those same images over and over until the model is biased to use the child-like properties of the drawings but the realism of the adult pictures. You could also feed the most CP-looking images from a partially trained model as the training data of another model, which over time would make the outputs approach the desired result.

                                  [email protected]
                                  #57

                                  If you train a model on 1,000,000 images of dogs and 1,000,000 images of cats, your output isn't going to be a 50/50 split of purely dogs and purely cats, it's going to be (on average) somewhere between a cat and a dog. At no point did you have to feed in pictures of dog-cat hybrids to end up with that model.
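The cat/dog point above can be sketched as a toy computation. This is a hedged illustration, not a real image model: the two-dimensional "feature vectors" (ear pointiness, snout length) and the averaging "model" are invented stand-ins for a generative model's learned representation space.

```python
# Toy sketch: a "model" fit on two concepts can emit a blend of them,
# even though no blended example ever appeared in its training data.
# All features and numbers here are made up for illustration only.

def fit_concept(samples):
    """'Learn' a concept as the per-dimension average of its examples."""
    dims = len(samples[0])
    return tuple(sum(s[i] for s in samples) / len(samples) for i in range(dims))

# Invented (ear_pointiness, snout_length) measurements.
cat_samples = [(0.9, 0.2), (0.8, 0.3), (1.0, 0.1)]
dog_samples = [(0.2, 0.9), (0.3, 0.8), (0.1, 1.0)]

cat = fit_concept(cat_samples)
dog = fit_concept(dog_samples)

def generate(concept_a, concept_b, mix):
    """Produce a new point by interpolating between two learned concepts."""
    return tuple(a * (1 - mix) + b * mix for a, b in zip(concept_a, concept_b))

hybrid = generate(cat, dog, 0.5)

# The hybrid (roughly (0.55, 0.55)) matches no training example,
# yet it is a perfectly valid point in the model's feature space.
assert hybrid not in cat_samples + dog_samples
print(hybrid)
```

Real diffusion or GAN models interpolate and compose in far higher-dimensional learned spaces, but the principle is the same: novel outputs emerge from combinations of learned concepts, with no hybrid training examples required.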

                                    [email protected]
                                    #58

                                    Yes but you start with the basics of a cat and a dog. So you start with adult genitals and.......

                                      [email protected]
                                      #59

                                      But to know if it's accurate, someone has to view and compare....

                                        [email protected]
                                        #60

                                        Non-pornographic pictures of children and/or human-made pornographic drawings of children.

                                          [email protected]
                                          #61

                                          Okay, and those drawings are my problem.

                                          https://www.icenews.is/2010/07/28/unsavoury-cartoon-ruling-sparks-debate-in-sweden/

                                          It's not clear cut that those are okay.
