agnos.is Forums

Kids are making deepfakes of each other, and laws aren’t keeping up

Technology · 172 posts · 77 posters

#1 · [email protected] · 227 votes

Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    • P [email protected]

      Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

      M This user is from outside of this forum
      M This user is from outside of this forum
      [email protected]
      wrote last edited by [email protected]
      #2

      The only defense is to train AI to draw guys with micropenises. As long as kids being kids is a defense for this shit (and to be fair, kids are pretty fucking stupid and need the freedom to grow out of that) rule makers have no power here. At least insofar as the AI to do this can be run locally on a potato.

      W 1 Reply Last reply
      5
      • M [email protected]

        The only defense is to train AI to draw guys with micropenises. As long as kids being kids is a defense for this shit (and to be fair, kids are pretty fucking stupid and need the freedom to grow out of that) rule makers have no power here. At least insofar as the AI to do this can be run locally on a potato.

        W This user is from outside of this forum
        W This user is from outside of this forum
        [email protected]
        wrote last edited by
        #3

        I think the micropenis thing wold just encourage this further

        M 1 Reply Last reply
        14
        • W [email protected]

          I think the micropenis thing wold just encourage this further

          M This user is from outside of this forum
          M This user is from outside of this forum
          [email protected]
          wrote last edited by
          #4

          It's obviously not a serious suggestion, but the reality is the tools are out there and Pandora's box can't be put back on the shelf. Kids can't be held accountable in a meaningful way. This is just an issue we are going to face basically forever now.

          There is a window of time during which most kids are little sociopaths and you can't appeal to any better nature. They have the means and often no internal or external restraint. And so mutually assured destruction is my tongue in cheek answer.

          arararagi@ani.socialA 1 Reply Last reply
          9
          • P [email protected]

            Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

            W This user is from outside of this forum
            W This user is from outside of this forum
            [email protected]
            wrote last edited by
            #5

            Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

            I would categorise it as sexual harassment, not abuse. Still serious, but a different level

            L L ladyautumn@lemmy.blahaj.zoneL A 4 Replies Last reply
            41
            • P [email protected]

              Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

              D This user is from outside of this forum
              D This user is from outside of this forum
              [email protected]
              wrote last edited by
              #6

              Lawmakers are grappling with how to address ...

              Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

              V J I S L 5 Replies Last reply
              90
              • M [email protected]

                It's obviously not a serious suggestion, but the reality is the tools are out there and Pandora's box can't be put back on the shelf. Kids can't be held accountable in a meaningful way. This is just an issue we are going to face basically forever now.

                There is a window of time during which most kids are little sociopaths and you can't appeal to any better nature. They have the means and often no internal or external restraint. And so mutually assured destruction is my tongue in cheek answer.

                arararagi@ani.socialA This user is from outside of this forum
                arararagi@ani.socialA This user is from outside of this forum
                [email protected]
                wrote last edited by
                #7

                AI can do penises just fine though, there's just no market demand for it so quick and easy deep fake sites are focused on female bodies.

                But I disagree with this anyway, this will be the "bullied kid brings a knife to class" of AI.

                M W 2 Replies Last reply
                0
                • W [email protected]

                  Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

                  I would categorise it as sexual harassment, not abuse. Still serious, but a different level

                  L This user is from outside of this forum
                  L This user is from outside of this forum
                  [email protected]
                  wrote last edited by
                  #8

                  Schools generally means it involves underage individuals, which makes any content using them csam. So in effect, the "AI" companies are generating a ton of csam and nobody is doing anything about it.

                  W L 2 Replies Last reply
                  20
                  • W [email protected]

                    Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

                    I would categorise it as sexual harassment, not abuse. Still serious, but a different level

                    L This user is from outside of this forum
                    L This user is from outside of this forum
                    [email protected]
                    wrote last edited by [email protected]
                    #9

                    I hope it might lead to a situation of dirty pics/vids not being a problem for the people in it any more, as it could be a deepfake. Like there were cases where a surfacing dirty pic was used for blackmail, ruined someones career or got them kicked out of some comittee, but since it could be fabrication now, I hope this will beva thing of the past, soon.

                    W 1 Reply Last reply
                    5
                    • L [email protected]

                      Schools generally means it involves underage individuals, which makes any content using them csam. So in effect, the "AI" companies are generating a ton of csam and nobody is doing anything about it.

                      W This user is from outside of this forum
                      W This user is from outside of this forum
                      [email protected]
                      wrote last edited by
                      #10

                      Disagree. Not CSAM when no abuse has taken place.

                      That's my point.

                      atomicorange@lemmy.worldA L zak@lemmy.worldZ L 4 Replies Last reply
                      11
                      • L [email protected]

                        I hope it might lead to a situation of dirty pics/vids not being a problem for the people in it any more, as it could be a deepfake. Like there were cases where a surfacing dirty pic was used for blackmail, ruined someones career or got them kicked out of some comittee, but since it could be fabrication now, I hope this will beva thing of the past, soon.

                        W This user is from outside of this forum
                        W This user is from outside of this forum
                        [email protected]
                        wrote last edited by
                        #11

                        That could be a socially healthy place to end up at. I don't see it anytime soon though. Just look at the other response I got.

                        bombomom@lemmy.worldB A 2 Replies Last reply
                        7
#12 · [email protected] · in reply to #7 · 3 votes

You're disagreeing with my unserious suggestion? I just... okay. No. Micropenises aren't a solution. I just don't think there is one.

If you want to disagree with that, let's hear it. I have 15- and 13-year-old daughters. Anyone can buy a $400 computer, install Linux, install AI, and undress people all day long. There is no legal restraint capable of stopping that, only punishing it.

Shut down model distribution and it'll move to torrents. Put the kids in the legal system and they are going to face lifelong consequences for 12-year-old assholery. (To be fair, victims often face long repercussions for being targeted, but that's not imposed by the state, which demands a higher standard.) Hold parents accountable and it will disproportionately impact families who spend more hours working and can't supervise their kids 24/7.

So I'm short on answers, but open to discussion.

                          • W [email protected]

                            Honestly I think we need to understand that this is no different to sticking a photo of someone's head on a porn magazine photo. It's not real. It's just less janky.

                            I would categorise it as sexual harassment, not abuse. Still serious, but a different level

                            ladyautumn@lemmy.blahaj.zoneL This user is from outside of this forum
                            ladyautumn@lemmy.blahaj.zoneL This user is from outside of this forum
                            [email protected]
                            wrote last edited by [email protected]
                            #13

                            Yes, finding out that your peers have been sharing deep fake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel entirely disgustingly dehumanized by every man or boy in their lives. Groups of men and boys reducing them and their bodies down to vivid sexual fantasies that they can quickly generate photo realistic images of.

                            If the person in the image is underaged then it should be classified as child pornography. If the woman who's photo is being used hasnt consented to this then it should be classified as sexual exploitation.

                            Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.

                            atomicorange@lemmy.worldA F R G 4 Replies Last reply
                            21
                            • L [email protected]

                              Schools generally means it involves underage individuals, which makes any content using them csam. So in effect, the "AI" companies are generating a ton of csam and nobody is doing anything about it.

                              L This user is from outside of this forum
                              L This user is from outside of this forum
                              [email protected]
                              wrote last edited by
                              #14

                              Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

                              S 1 Reply Last reply
                              5
                              • W [email protected]

                                Disagree. Not CSAM when no abuse has taken place.

                                That's my point.

                                atomicorange@lemmy.worldA This user is from outside of this forum
                                atomicorange@lemmy.worldA This user is from outside of this forum
                                [email protected]
                                wrote last edited by
                                #15

                                If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place so the kids would be unaware when they were photographed, is it still abuse?

                                If so, how is the psychological effect of a convincing deepfake any different?

                                bombomom@lemmy.worldB G 2 Replies Last reply
                                6
                                • D [email protected]

                                  Lawmakers are grappling with how to address ...

                                  Just a reminder that the government is actively voting against regulations on AI, because obviously a lot of these people are pocketing lobbyist money

                                  V This user is from outside of this forum
                                  V This user is from outside of this forum
                                  [email protected]
                                  wrote last edited by
                                  #16

                                  Even in countries a lot less corrupt than the US this is an issue.

                                  Especially because the US government/companies doesn't do jack shit for people

                                  1 Reply Last reply
                                  4
                                  • L [email protected]

                                    Do deepfake explicit images created from a non-explicit image actually qualify as CSAM?

                                    S This user is from outside of this forum
                                    S This user is from outside of this forum
                                    [email protected]
                                    wrote last edited by
                                    #17

                                    Drawing a sexy cartoon that looks like an adult, with a caption that says "I'm 12", counts. So yeah, probably.

                                    cole@lemdro.idC 1 Reply Last reply
                                    3
#18 · [email protected] · in reply to #13 · 4 votes

Thank you. Focusing on the harm to the victims is the right way to understand this issue. Too many people in here are hunting for a semantic loophole.

#19 · [email protected] · in reply to #15 (edited) · 11 votes

Taking secret nude pictures of someone is quite a bit different from... not taking nude pictures of them.

It's not CSAM to put a picture of someone's face on an adult model and show it to your friend. It's certainly sexual harassment, but it isn't CSAM.

                                        • W [email protected]

                                          That could be a socially healthy place to end up at. I don't see it anytime soon though. Just look at the other response I got.

                                          bombomom@lemmy.worldB This user is from outside of this forum
                                          bombomom@lemmy.worldB This user is from outside of this forum
                                          [email protected]
                                          wrote last edited by
                                          #20

                                          Anyone with half a brain will certainly claim as much. Even if people don't fully believe it, it will blunt the most serious of social consequences.

                                          1 Reply Last reply
                                          1