agnos.is Forums

25 arrested in global hit against AI-generated child sexual abuse material

Technology | 92 posts, 32 posters, 51 views
  • K [email protected]

    Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.

    I get how fucking creepy and downright sickening this all feels, but I'm genuinely surprised that it's illegal or criminal if there's no actual children involved.

    It mentions sexual extortion, and that's definitely something that should be illegal, same for spreading AI-generated explicit stuff about real people without their consent, involving children or adults, but idk about the case mentioned here.

#27 [email protected] wrote:

    Exactly. If there's no victim, there's no crime.

    • J [email protected]

      I mean, that's the same thing with AI-generated content. It's all trained on a wide range of real people; how do you know what's generated isn't depicting an underage person? That's why laws like this are really dangerous.

#28 [email protected] wrote:

      Exactly. Any time there's subjectivity, it's ripe for abuse.

      The law should punish:

      • creating images of actual underage people
      • creating images of actual non-consenting people of legal age
      • knowingly distributing one of the above

      Each of those has a clearly identifiable victim. Creating a new work of a fictitious person doesn't have any clearly identifiable victim.

      Don't make laws to make prosecution easier; make laws to protect actual people from becoming victims, or at least to punish those who victimize others.

      • B [email protected]

        On one hand, I don't think this kind of thing can be consequence-free (from a practical standpoint). On the other hand... how old were the subjects? You can't look at a person to determine their age, and someone who looks like a child but is actually an adult wouldn't be charged as a child pornographer. The whole reason age limits are set is to give reasonable assurance the subject is not being exploited or otherwise harmed by the act.

        This is a massive grey area and I just hope sentences are proportional to the crime. I could live with this kind of thing being classified as a misdemeanor provided the creator didn't use underage subjects to train or influence the output.

#29 [email protected] wrote:

        I could live with this kind of thing being classified as a misdemeanor provided the creator didn’t use underage subjects to train or influence the output.

        So could I, but that doesn't make it just. It should only be a crime if someone is actually harmed, or intended to be harmed.

        Creating a work about a fictitious individual shouldn't be illegal, regardless of how distasteful the work is.

        • S [email protected]

          only way

          That's just not true.

          That said, there's a decent chance that existing models use real images, and that is what we should be fighting against. The user of a model has plausible deniability because there's a good chance they don't understand how they work, but the creators of the model should absolutely know where they're getting the source data from.

          Prove that the models use illegal material and go after the model creators for that, because that's an actual crime. Don't go after people using the models who are providing alternatives to abusive material.

#30 [email protected] wrote:

          I think all are unethical, and any service offering it should be shut down, yes.

          I never said prosecute the users.

          I said you can't make it ethically, because at some point someone is using/creating original art, and the odds of human exploitation at some point in the chain are just too high.

          • X [email protected]

            I totally agree with these guys being arrested. I want to get that out of the way first.

            But what crime did they commit? They didn't abuse children...they are AI generated and do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it's flat, but where's the line here? If they draw pictures of non-existent children is that also a crime?

            Does that open artists to the interpretation of the law when it comes to art? Can they be put in prison because they did a professional painting of a child? Like, what if they did a painting of their own child in the bath or something? Sure, the content's questionable, but it's not exactly predatory. And if you add safeguards for these people, couldn't the predators then just claim artistic expression?

            It just seems entirely unenforceable and an entire goddamn can of worms...

#31 [email protected] wrote:

            It obviously depends on where they live and/or committed the crimes. But most countries have broad laws against anything, real or fake, that depicts CSAM.

            It's partly because, as technology gets better, it would be easy for offenders to claim anything they've been caught with is AI-created.

            It's also because there's a belief that AI-generated CSAM encourages real child abuse.

            I shan't say whether it does - I tend to believe so, but haven't seen data to prove me right or wrong.

            Also, at the end of the day, I think it's simply an ethical position.

            • D [email protected]

              I think all are unethical, and any service offering it should be shut down, yes.

              I never said prosecute the users.

              I said you can't make it ethically, because at some point someone is using/creating original art, and the odds of human exploitation at some point in the chain are just too high.

#32 [email protected] wrote:

              the odds of human exploitation at some point in the chain are just too high

              We don't punish people based on odds. At least in the US, the standard is that they're guilty "beyond a reasonable doubt." As in, there's virtually no possibility that they didn't commit the crime. If there's a 90% chance someone is guilty, but a 10% chance they're completely innocent, most would agree that there's reasonable doubt, so they shouldn't be convicted.

              If you can't prove that they made it unethically, and there are methods to make it ethically, then you have reasonable doubt. All the defense needs to do is demonstrate one such method of producing it ethically, and that creates reasonable doubt.

              Services should only be shut down if they're doing something illegal. Prove that the images are generated using CSAM as source material, and then shut down any service that refuses to remove it, or whose operators can be proved "beyond a reasonable doubt" to have known they were committing a crime. That's how the law works: you only punish people you can prove "beyond a reasonable doubt" were committing a crime.

              • D [email protected]

                Again, that's not how image generators work.

                You can't just make up some wishful thinking and assume that's how it must work.

                It takes thousands upon thousands of unique photos to make an image generator.

                Are you going to draw enough child genitalia to train these generators? Are you actually comfortable doing that task?

#33 [email protected] wrote:

                i'm not, no. but i'm also well-enough versed in stable diffusion and loras that i know that even a model with no training on a particular topic can be made to produce it with enough tweaking, and if the results are bad you can plug in an extra model trained on at minimum 10-50 images to significantly improve them.

                • G [email protected]
                  This post did not contain any content.
#34 [email protected] wrote:

                  I'm afraid Europol is shooting themselves in the foot here.

                  What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.

                  Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as continuous investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of that is needed anyway, it's easier to escalate, and that's dangerous.

                  As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

                  • X [email protected]

                    I totally agree with these guys being arrested. I want to get that out of the way first.

                    But what crime did they commit? They didn't abuse children...they are AI generated and do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it's flat, but where's the line here? If they draw pictures of non-existent children is that also a crime?

                    Does that open artists to the interpretation of the law when it comes to art? Can they be put in prison because they did a professional painting of a child? Like, what if they did a painting of their own child in the bath or something? Sure, the content's questionable, but it's not exactly predatory. And if you add safeguards for these people, couldn't the predators then just claim artistic expression?

                    It just seems entirely unenforceable and an entire goddamn can of worms...

#35 [email protected] wrote:

                    I actually do not agree with them being arrested.

                    While I recognize the issue of identification posed in the article, I hold a strong opinion it should be tackled in another way.

                    AI-generated CSAM might be a powerful tool to reduce demand for the content featuring real children. If we leave it legal to watch and produce, and keep the actual materials illegal, we can make more pedophiles turn to what is less harmful and impactful - a computer-generated image that was produced with no children being harmed.

                    By introducing actions against AI-generated materials, they make such materials as illegal as the real thing, and there's one less reason for an interested party not to go to a CSAM site and watch actual children getting abused, perpetuating the cycle and leading to more real-world victims.

                    • D [email protected]

                      That's not how these image generators work.

                      How would it know what an age-appropriate penis looks like without, you know, seeing one?

#36 [email protected] wrote:

                      That's exactly how they work. According to many articles I've seen in the past, one of the most common models used for this purpose is Stable Diffusion. For all we know, this model was never fed with any CSAM materials, but it seems to be good enough for people to get off - which is exactly what matters.

                      • lime@feddit.nuL [email protected]

                        i'm not, no. but i'm also well-enough versed in stable diffusion and loras that i know that even a model with no training on a particular topic can be made to produce it with enough tweaking, and if the results are bad you can plug in an extra model trained on at minimum 10-50 images to significantly improve them.

#37 [email protected] wrote:

                        Okay, but my point still stands.

                        Someone has to make the genital images for the model to learn from. Some human has to be involved; otherwise it wouldn't just exist.

                        And if you're not willing to get your hands dirty and do it, why would anyone else?

                        • S [email protected]

                          the odds of human exploitation at some point in the chain are just too high

                          We don't punish people based on odds. At least in the US, the standard is that they're guilty "beyond a reasonable doubt." As in, there's virtually no possibility that they didn't commit the crime. If there's a 90% chance someone is guilty, but a 10% chance they're completely innocent, most would agree that there's reasonable doubt, so they shouldn't be convicted.

                          If you can't prove that they made it unethically, and there are methods to make it ethically, then you have reasonable doubt. All the defense needs to do is demonstrate one such method of producing it ethically, and that creates reasonable doubt.

                          Services should only be shut down if they're doing something illegal. Prove that the images are generated using CSAM as source material, and then shut down any service that refuses to remove it, or whose operators can be proved "beyond a reasonable doubt" to have known they were committing a crime. That's how the law works: you only punish people you can prove "beyond a reasonable doubt" were committing a crime.

#38 [email protected] wrote:

                          How can it be made ethically?

                          That's my point.

                          It can't.

                          Some human has to sit and make many, many, many models of genitals to produce an artificial one.

                          And that, IMO, is not ethically possible.

                          • A [email protected]

                            That's exactly how they work. According to many articles I've seen in the past, one of the most common models used for this purpose is Stable Diffusion. For all we know, this model was never fed with any CSAM materials, but it seems to be good enough for people to get off - which is exactly what matters.

#39 [email protected] wrote:

                            How can it be trained to produce something without human input?

                            To verify its models are indeed correct, some human has to sit and view it.

                            Will that be you?

                            • A [email protected]

                              I'm afraid Europol is shooting themselves in the foot here.

                              What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.

                              Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as continuous investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of that is needed anyway, it's easier to escalate, and that's dangerous.

                              As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

#40 [email protected] wrote:

                              You can download the models and run them yourself; banning them will be about as effective as the US government was at banning encryption.

                              • F [email protected]

                                There's not an epidemic of child porn.

                                There's an epidemic of governments wanting greater surveillance powers over the Internet and it is framed as being used to "fight child porn".

                                So you're going to hear about every single case and conviction until your perception is that there is an epidemic of child porn.

                                "You can't possibly oppose these privacy destroying laws, after all you're not on the side of child porn are you?"

#41 [email protected] wrote:

                                Same with misinformation, where anything they disagree with, in good faith or not, gets labeled misinformation.

                                • D [email protected]

                                  How can it be made ethically?

                                  That's my point.

                                  It can't.

                                  Some human has to sit and make many, many, many models of genitals to produce an artificial one.

                                  And that, IMO, is not ethically possible.

#42 [email protected] wrote:

                                  How can it be made ethically?

                                  Let's say you manually edit a bunch of legal pictures and feed that into a model to generate new images. Or maybe you pull some legal images from other regions (e.g. topless children), and label some young-looking adults as children for the rest.

                                  I don't know, I'm not an expert. But just because I don't know of something doesn't mean it doesn't exist, it means I need to consult experts.

                                  It can’t.

                                  Then prove it. That's how things are done in courts of law. Each side provides experts to try to convince the judge/jury that something did or did not happen.

                                  My point is merely that an image that looks like CSAM is only CSAM if it actually involves abuse of a child. It's not CSAM if it's generated some other way, such as hand-drawing (e.g. hentai) or a model that doesn't use CSAM in its training data.

                                  • S [email protected]

                                    How can it be made ethically?

                                    Let's say you manually edit a bunch of legal pictures and feed that into a model to generate new images. Or maybe you pull some legal images from other regions (e.g. topless children), and label some young-looking adults as children for the rest.

                                    I don't know, I'm not an expert. But just because I don't know of something doesn't mean it doesn't exist, it means I need to consult experts.

                                    It can’t.

                                    Then prove it. That's how things are done in courts of law. Each side provides experts to try to convince the judge/jury that something did or did not happen.

                                    My point is merely that an image that looks like CSAM is only CSAM if it actually involves abuse of a child. It's not CSAM if it's generated some other way, such as hand-drawing (e.g. hentai) or a model that doesn't use CSAM in its training data.

#43 [email protected] wrote:

                                    You can't prove a negative. That's not how proving things works.

                                    You also assume legal images. But that puts limits on what's actually legal globally. What if someone wants a 5-year-old? How are there legal photos of that?

                                    You assume it can; prove that it can.

                                    • D [email protected]

                                      You can't prove a negative. That's not how proving things works.

                                      You also assume legal images. But that puts limits on what's actually legal globally. What if someone wants a 5-year-old? How are there legal photos of that?

                                      You assume it can; prove that it can.

#44 [email protected] wrote:

                                      You can’t prove a negative

                                      You can show how existing solutions work and demonstrate that the solution used works like those other solutions. That takes a lot more work than "see, it looks like a child therefore it's CSAM," but it's necessary to protect innocent people.

                                      You assume it can, prove that it can.

                                      That's guilty until proven innocent. There's a reason courts operate on the assumption of innocence and force the prosecution to prove guilt. I am not interested in reversing that.

                                      • S [email protected]

                                        You can’t prove a negative

                                        You can show how existing solutions work and demonstrate that the solution used works like those other solutions. That takes a lot more work than "see, it looks like a child therefore it's CSAM," but it's necessary to protect innocent people.

                                        You assume it can, prove that it can.

                                        That's guilty until proven innocent. There's a reason courts operate on the assumption of innocence and force the prosecution to prove guilt. I am not interested in reversing that.

#45 [email protected] wrote:

                                        You better believe when the cops come knocking, the burden of proof to be ethical is wholly on you.

                                        All existing solutions are based on real-life images. There's no ethical way to acquire thousands upon thousands of images of naked children to produce anything resembling the real thing.

                                        That's how existing solutions work.

                                        So again, how can it be done ethically?

                                        • A [email protected]

                                          I'm afraid Europol is shooting themselves in the foot here.

                                          What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.

                                          Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as continuous investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of that is needed anyway, it's easier to escalate, and that's dangerous.

                                          As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

#46 [email protected] wrote:

                                          I haven't read any of this research because, like, the only feelings I have about pedophiles are outright contempt and a small amount of pity for the whole fucking destructive evilness of it all, but I've been told having access to drawings and images and whatnot makes people more likely to act on their impulses.

                                          And like. I don't think images of CSAM in any form, no matter how far removed they are from real people, actually contribute anything worthwhile at all to the world, so like. I dunno.

                                          Really couldn't give two squirts of piss about anything that makes a pedophile's life harder. Human garbage.
