agnos.is Forums

25 arrested in global hit against AI-generated child sexual abuse material

Technology
92 Posts 32 Posters 51 Views
  • D [email protected]

    This relies on the idea that an "outlet" is not harmful. It might even be encouraging, but who do you think would ever study this to help us know? Can you imagine the scientists who'd have to lead studies like this: an incredibly grim and difficult subject, with a high likelihood that no one would listen to you anyway.

    [email protected] #79

    IIRC there was actually a study showing that pedophiles with access to synthetic CSAM were less likely to victimize real children.

    • T [email protected]

      You can download the models and compile them yourself; banning them will be about as effective as the US government was at banning encryption.

      [email protected] #80

      https://www.theverge.com/policy/621848/uk-killing-encryption-e2e-apple-adp-privacy

      • D [email protected]

        Nah, the argument that this could fuel a real "pedophile culture" and even encourage real activities is really not that far-fetched, and could even be true. Without very convincing studies, do you take a chance where real kids could soon suffer? And I mean the studies would have to be really convincing.

        [email protected] #81

        The thing is, banning is also a consequential action.

        And based on what we know about similar behaviors, having an outlet is likely to be good.

        Here, the EU takes an approach of "banning just in case" while also ignoring the potential implications of such bans.

        • A [email protected]

          I'm afraid Europol is shooting themselves in the foot here.

          What should be done is better ways to mark and identify AI-generated content, not a carpet ban and criminalization.

          Let whoever happens to crave CSAM (remember: sexuality, however perverted or terrible it is, is not a choice) use the most harmless outlet - otherwise, they may just turn to the real materials, and as continuous investigations suggest, there's no shortage of supply or demand on that front. If everything is illegal, and some of that is needed anyway, it's easier to escalate, and that's dangerous.

          As sickening as it may sound to us, these people often need something, or else things are quickly gonna go downhill. Give them their drawings.

          [email protected] #82

          What would stop someone from creating a tool that tagged real images as AI generated?

          Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.

          • A [email protected]

            https://www.theverge.com/policy/621848/uk-killing-encryption-e2e-apple-adp-privacy

            [email protected] #83

            I hope they don't have access to a cloud computing provider somewhere, otherwise this is going to be a tough thing to enforce without a great firewall larger than China has.

            It will be hilarious to see them attempt it though.

            • T [email protected]

              I hope they don't have access to a cloud computing provider somewhere, otherwise this is going to be a tough thing to enforce without a great firewall larger than China has.

              It will be hilarious to see them attempt it though.

              A This user is from outside of this forum
              A This user is from outside of this forum
              [email protected]
              wrote on last edited by
              #84

              Sadly, it seems like most of Europe, and potentially other "western" countries, will follow.

              • R [email protected]

                What would stop someone from creating a tool that tagged real images as AI generated?

                Have at it with drawings that are easily distinguished, but if anything is photorealistic I feel like it needs to be treated as real.

                A This user is from outside of this forum
                A This user is from outside of this forum
                [email protected]
                wrote on last edited by
                #85

                Some form of digital signatures for allowed services?

                Sure, it will limit the choice of where to legally generate content, but it should work.
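
A provenance scheme like this would work roughly the way content-credential systems (e.g. C2PA) do: the generating service hashes the output and signs the hash, and anyone can later verify that signature to confirm which service produced those exact bytes. Here is a minimal sketch in Python; the service name and key are invented for illustration, and HMAC stands in for the asymmetric signature a real system would use (the standard library has no public-key signing), so verification here shares the key rather than using a separate public key.

```python
import hashlib
import hmac

# Hypothetical signing key held by an "allowed service". In a real
# deployment this would be an asymmetric private key, and verifiers
# would hold only the corresponding public key.
SERVICE_KEY = b"example-allowed-service-secret"

def sign_image(image_bytes: bytes) -> str:
    """Return a provenance tag: an HMAC over the SHA-256 of the image."""
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(SERVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, tag: str) -> bool:
    """Check that the tag was produced by the service for these exact bytes."""
    return hmac.compare_digest(sign_image(image_bytes), tag)

image = b"\x89PNG...image bytes..."
tag = sign_image(image)
assert verify_image(image, tag)             # untouched image verifies
assert not verify_image(image + b"x", tag)  # any modification breaks it
```

Note the limitation raised elsewhere in the thread: a signature only proves who produced the bytes, not that the content is synthetic, so anyone running an uncontrolled model (or holding a leaked key) can tag whatever they like.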

                • X [email protected]

                  I totally agree with these guys being arrested. I want to get that out of the way first.

                  But what crime did they commit? They didn't abuse children...they are AI generated and do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it's flat, but where's the line here? If they draw pictures of non-existent children is that also a crime?

                  Does that open artists up to interpretation of the law when it comes to art? Can they be put in prison because they did a professional painting of a child? Like, what if they did a painting of their own child in the bath or something? Sure, the content's questionable, but it's not exactly predatory. And if you add safeguards for these people, could predators not then just claim artistic expression?

                  It just seems entirely unenforceable and an entire goddamn can of worms...

                  [email protected] #86

                  First off, I'll say this topic is very nuanced. And as sick as any child porn is, I completely agree. This, in my gut, feels like a weird slippery slope that will somehow get used against any AI-generated images, or possibly any AI-generated content. It makes me feel like those "online child protection" bills that seem on the surface like not-terrible ideas, but when you start thinking about them in detail are horrific, dystopian ideas.

                  • A [email protected]

                    Some form of digital signatures for allowed services?

                    Sure, it will limit the choice of where to legally generate content, but it should work.

                    R This user is from outside of this forum
                    R This user is from outside of this forum
                    [email protected]
                    wrote on last edited by
                    #87

                    I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.

                    • R [email protected]

                      I highly doubt any commercially available service is going to get in on officially generating photorealistic CSAM.

                      A This user is from outside of this forum
                      A This user is from outside of this forum
                      [email protected]
                      wrote on last edited by
                      #88

                      Open-source models exist and can be forked.

                      • B [email protected]

                        It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn.

                        So a person who is 18 years old, depicted in the nude, is still a child pornographer if they don't look their age? This gives judges and prosecutors too much leeway, and I could guarantee there are right-wing judges that would charge a 25yo because it could be believed they were 17.

                        In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

                        Is it though? I don't know about the penalties in Germany but in the US a 17yo that takes a nude selfie is likely to be put on a sex offender list for life and have their freedom significantly limited. I'm not against penalties, but they should be proportional to the harm. A day in court followed by a fair amount of community service should be enough of an embarrassment to deter them, not jail.

                        [email protected] #89

                        In Germany, if 14-18yolds make nude selfies then nothing happens; if they share them with their intimate partner(s), nothing happens either; if someone distributes (that's the key word) the pictures on the schoolyard, then the law gets involved. Under-14yolds technically work out similarly, just that criminal law won't get involved because under-14yolds can't commit crimes; that's all child protective services' jurisdiction, which will intervene as necessary. The general advice given to kids by schools is "just don't, it's not worth the possible headache". It's a bullet point in biology (sex ed) and/or social studies (media competency); you'd have to dig into state curricula.

                        Not sure where that "majority of cases" thing comes from. It might very well be true, because when nudes leak on the schoolyard you suddenly have a whole school's worth of suspects, many of whom (people who deleted) will not be followed up on, and another significant portion (those who didn't send it on) might have to write an essay in exchange for terminating proceedings. Yet another reason why you should never rely on police statistics. Ten people in an elevator, one farts, ten suspects.

                        We do have a general criminal register but it's not public. Employers generally are not allowed to demand certificates of good conduct unless there's very good reason (say, kindergarten teachers) and your neighbours definitely can't.

                        • B [email protected]

                          In Germany, if 14-18yolds make nude selfies then nothing happens, if they share it with their intimate partner(s) then neither, if someone distributes (that's the key word) the pictures on the schoolyard then the law is getting involved. Under 14yolds technically works out similar just that the criminal law won't get involved because under 14yolds can't commit crimes, that's all child protective services jurisdiction which will intervene as necessary. The general advise to kids given by schools is "just don't, it's not worth the possible headache". It's a bullet point in biology (sex ed) and/or social studies (media competency), you'd have to dig into state curricula.

                          Not sure where that "majority of cases" thing comes from. It might very well be true because when nudes leak on the schoolyard you suddenly have a whole school's worth of suspects many of which (people who deleted) will not be followed up on and another significant portion (didn't send on) might have to write an essay in exchange for terminating proceedings. Yet another reason why you should never rely on police statistics. Ten people in an elevator, one farts, ten suspects.

                          We do have a general criminal register but it's not public. Employers generally are not allowed to demand certificates of good conduct unless there's very good reason (say, kindergarten teachers) and your neighbours definitely can't.

                          B This user is from outside of this forum
                          B This user is from outside of this forum
                          [email protected]
                          wrote on last edited by
                          #90

                          Sounds like some actual common sense was applied to German law. Good to hear.

                          • A [email protected]

                            Open-source models exist and can be forked

                            R This user is from outside of this forum
                            R This user is from outside of this forum
                            [email protected]
                            wrote on last edited by
                            #91

                            ...and then we're back at "someone can take that model and tag real images to appear AI-generated."

                            You would need a closed-source model run server-side in order to prevent that.

                            • R [email protected]

                              ...and then we're back at "someone can take that model and tag real images to appear AI-generated."

                              You would need a closed-source model run server-side in order to prevent that.

                              A This user is from outside of this forum
                              A This user is from outside of this forum
                              [email protected]
                              wrote on last edited by
                              #92

                              Yep, essentially. But that's for the hyperrealistic one.
