agnos.is Forums


25 arrested in global hit against AI-generated child sexual abuse material

Technology
92 Posts 32 Posters 51 Views
• [email protected] wrote:

    I think it's pretty stupid. Borders on Thought Crime kind of stuff.

    I'd rather see that kind of enforcement and effort go towards actually finding people who are harming children.

[email protected] wrote (#4):

    Ehhhhh...

    It also borders on real CSAM

    • G [email protected]
      This post did not contain any content.
[email protected] wrote (#5):

      Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.

      I get how fucking creepy and downright sickening this all feels, but I'm genuinely surprised that it's illegal or criminal if there's no actual children involved.

It mentions sexual extortion, and that's definitely something that should be illegal, same for spreading AI-generated explicit stuff about real people without their consent, involving children or adults, but idk about the case mentioned here.

      • B [email protected]

        On one hand I don't think this kind of thing can be consequence free (from a practical standpoint). On the other hand... how old were the subjects? You can't look at a person to determine their age and someone that looks like a child but is actually adult wouldn't be charged as a child pornographer. The whole reason age limits are set is to give reasonable assurance the subject is not being exploited or otherwise harmed by the act.

        This is a massive grey area and I just hope sentences are proportional to the crime. I could live with this kind of thing being classified as a misdemeanor provided the creator didn't use underage subjects to train or influence the output.

[email protected] wrote (#6):

        It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn. It doesn't matter if any minor was exploited. That's simply not what these laws are about.

        Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It's not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.

        17 year-olds who exchange nude selfies engage in child porn. I know there have been convictions in the US; not sure about Europe. I know that teachers have been prosecuted when minors sought help when their selfies were being passed around in school, because they sent the images in question to the teacher, and that's possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

        Anyway, what I'm saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.

        • G [email protected]

          It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn. It doesn't matter if any minor was exploited. That's simply not what these laws are about.

          Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It's not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.

          17 year-olds who exchange nude selfies engage in child porn. I know there have been convictions in the US; not sure about Europe. I know that teachers have been prosecuted when minors sought help when their selfies were being passed around in school, because they sent the images in question to the teacher, and that's possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

          Anyway, what I'm saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.

Guest wrote (#7):

          Legality is not the same as morality.

          • G [email protected]

            It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn. It doesn't matter if any minor was exploited. That's simply not what these laws are about.

            Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It's not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.

            17 year-olds who exchange nude selfies engage in child porn. I know there have been convictions in the US; not sure about Europe. I know that teachers have been prosecuted when minors sought help when their selfies were being passed around in school, because they sent the images in question to the teacher, and that's possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

            Anyway, what I'm saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.

[email protected] wrote (#8):

            There's not an epidemic of child porn.

            There's an epidemic of governments wanting greater surveillance powers over the Internet and it is framed as being used to "fight child porn".

            So you're going to hear about every single case and conviction until your perception is that there is an epidemic of child porn.

            "You can't possibly oppose these privacy destroying laws, after all you're not on the side of child porn are you?"

            • L [email protected]

              Ehhhhh...

              It also borders on real CSAM

[email protected] wrote (#9):

              Paracetamol "borders on" poison, but isn't.

              Slippery slope is a logical fallacy, and there are actual consequences here. We need to do better.

• In reply to [email protected]:

[email protected] wrote (#10):

This is also my take: any person can set up an image generator and churn out any content they want. Focus should be on actual people being trafficked and abused.

                • K [email protected]

                  Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.

                  I get how fucking creepy and downright sickening this all feels, but I'm genuinely surprised that it's illegal or criminal if there's no actual children involved.

                  It mentions sexual extortion and that's definitely something that should be illegal, same for spreading AI generated explicit stuff about real people without their concent, involving children or adults, but idk about the case mentioned here.

[email protected] wrote (#11):

It's certainly creepy and disgusting.

It also seems like we're half a step away from thought police regulating any thought or expression a person has that those in power do not like.

                  • G [email protected]
                    This post did not contain any content.
[email protected] wrote (#12):

                    Followed swiftly by operation jizzberworld

                    • K [email protected]

                      Even in cases when the content is fully artificial and there is no real victim depicted, such as Operation Cumberland, AI-generated CSAM still contributes to the objectification and sexualisation of children.

                      I get how fucking creepy and downright sickening this all feels, but I'm genuinely surprised that it's illegal or criminal if there's no actual children involved.

                      It mentions sexual extortion and that's definitely something that should be illegal, same for spreading AI generated explicit stuff about real people without their concent, involving children or adults, but idk about the case mentioned here.

[email protected] wrote (#13):

                      It would depend on the country. In the UK even drawn depictions are illegal. I assume it has to at least be realistic and stick figures don't count.

                      • K [email protected]

                        It would depend on the country. In the UK even drawn depictions are illegal. I assume it has to at least be realistic and stick figures don't count.

[email protected] wrote (#14):

                        It sounds like a very iffy thing to police. Since drawn stuff doesn't have actual age, how do you determine it? Looks? Wouldn't be great.

                        • G [email protected]

                          It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn. It doesn't matter if any minor was exploited. That's simply not what these laws are about.

                          Bear in mind, there are many countries where consenting adults are prosecuted for having sex the wrong way. It's not so long ago that this was also the case in Europe, and a lot of people explicitly want that back. On the other hand, beating children has a lot of fans in the same demographic. Some people want to actually protect children, but a whole lot of people simply want to prosecute sexual minorities, and the difference shows.

                          17 year-olds who exchange nude selfies engage in child porn. I know there have been convictions in the US; not sure about Europe. I know that teachers have been prosecuted when minors sought help when their selfies were being passed around in school, because they sent the images in question to the teacher, and that's possession. In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

                          Anyway, what I'm saying is: We need harsher laws and more surveillance to deal with this epidemic of child porn. Only a creep would defend child porn and I am not a creep.

[email protected] wrote (#15):

                          It's not a gray area at all. There's an EU directive on the matter. If an image appears to depict someone under the age of 18 then it's child porn.

So a person that is 18 years old, depicted in the nude, is still a child pornographer if they don't look their age? This gives judges and prosecutors too much leeway, and I'd guarantee there are right-wing judges that would charge a 25-year-old because it could be believed they were 17.

                          In Germany, the majority of suspects in child porn cases are minors. Valuable life lesson for them.

Is it though? I don't know about the penalties in Germany, but in the US a 17-year-old that takes a nude selfie is likely to be put on a sex offender list for life and have their freedom significantly limited. I'm not against penalties, but they should be proportional to the harm. A day in court followed by a fair amount of community service should be enough of an embarrassment to deter them, not jail.

                          • G [email protected]
                            This post did not contain any content.
[email protected] wrote (#16):

                            Not going to read the article, but I will say that I understand making hyper-realistic fictional CP illegal, because it would make limiting actual CP impossible.

As long as it's clearly fictional though, let people get off to whatever imaginary stuff they want to. We might find it disgusting, but there are plenty of sexual genres that most people would find disgusting yet shouldn't be illegal.

                            • K [email protected]

                              It sounds like a very iffy thing to police. Since drawn stuff doesn't have actual age, how do you determine it? Looks? Wouldn't be great.

[email protected] wrote (#17):

                              Imagine having to argue to a jury that a wolf-human hybrid with bright neon fur is underage because it isn’t similar enough to a wolf for dog years to apply.

                              • J [email protected]

                                Not going to read the article, but I will say that I understand making hyper-realistic fictional CP illegal, because it would make limiting actual CP impossible.

                                As long as it’s clearly fictional though, let people get off to whatever imaginary stuff they want to. We might find it disgusting, but there are plenty of sexual genres that most people would find disgusting b yet shouldn’t be illegal.

[email protected] wrote (#18):

                                The only way to generate something like that is to teach it something like that from real images.

                                • D [email protected]

                                  The only way to generate something like that is to teach it something like that from real images.

[email protected] wrote (#19):

                                  I don’t think this is actually true. Pretty sure if you feed it naked adults and clothed children it can figure out the rest.

                                  • K [email protected]

                                    It sounds like a very iffy thing to police. Since drawn stuff doesn't have actual age, how do you determine it? Looks? Wouldn't be great.

[email protected] wrote (#20):

I mean, that's the same thing with AI-generated content. It's all trained on a wide range of real people, so how do you know what's generated isn't depicting an underage person? That's why laws like this are really dangerous.

                                    • I [email protected]

                                      I don’t think this is actually true. Pretty sure if you feed it naked adults and clothed children it can figure out the rest.

[email protected] wrote (#21):

That's not how these image generators work.

How would it know what an age-appropriate penis looks like without, you know, seeing one?

                                      • D [email protected]

                                        That's not how these image generators work.

                                        How would it know what an age appropriate penis looks like with our, you know, seeing one.

[email protected] wrote (#22):

Considering style-transfer models, you could probably just draw or 3D-model the unknown details and feed it that.

                                        • G [email protected]
                                          This post did not contain any content.
[email protected] wrote (#23):

                                          I totally agree with these guys being arrested. I want to get that out of the way first.

But what crime did they commit? They didn't abuse children... the children are AI-generated and do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it's flat, but where's the line here? If they draw pictures of non-existent children, is that also a crime?

Does that open artists up to interpretation of the law when it comes to art? Can they be put in prison because they did a professional painting of a child? Like, what if they did a painting of their own child in the bath or something? Sure, the content's questionable, but it's not exactly predatory. And if you add safeguards for these people, could not the predators just claim artistic expression?

                                          It just seems entirely unenforceable and an entire goddamn can of worms...
