agnos.is Forums
Kids are making deepfakes of each other, and laws aren’t keeping up

Technology · 172 Posts · 77 Posters
  • S [email protected]

    For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

That's just, on its face, stupid. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That's not pedophilia. It's wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.

It shouldn't be treated the same as when an adult man generates it; there should be nuance. I'm not saying it's ok for a thirteen year old to generate said content: I'm saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

I'm so glad I went through puberty at a time when this kind of shit wasn't available. The thirteen year old version of me would absolutely have gotten himself into a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, because I'd have produced child pornography. God, some states have stupid laws.

[email protected] · #80

Punishment for an adult man doing this: prison.

Punishment for a 13 year old boy doing this: publish his browsing and search history in the school newsletter.

    • L [email protected]

      As a father of teenage girls, I don't necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

[email protected] · #81

      Yes, absolutely. But with recognition that a thirteen year old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we're ratcheting up the punishment, but still not adult prison.

      • J [email protected]

        Oh I just assumed that every Conservative jerks off to kids

[email protected] · #82

        Get some receipts and that will be a start.

        • P [email protected]

          Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

[email protected] · #83

Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.

It would of course be their choice to wear them, but I'd definitely look for ways to limit their time in areas with cameras present.

In reply to [email protected]:

            Get some receipts and that will be a start.

[email protected] · #84

            Receipts you say?

            We're at 56 pages of this now for a nice round count of 1400 charges

            So far as I am aware all of these are publicly searchable court cases

In reply to [email protected]:

              Receipts you say?

              We're at 56 pages of this now for a nice round count of 1400 charges

              So far as I am aware all of these are publicly searchable court cases

[email protected] · #85

Alright, now we just need the mainstream media to run the story.

I mean, with all the zealotry against drag shows, they should be ready to run with this one, right?

In reply to [email protected]:

Alright, now we just need the mainstream media to run the story.

I mean, with all the zealotry against drag shows, they should be ready to run with this one, right?

[email protected] · #86

                You'd think so, right?

                • F [email protected]

                  When someone makes child porn they put a child in a sexual situation - which is something that we have amassed a pile of evidence is extremely harmful to the child.

                  For all you have said - "without the consent" - "being sexualised" - "commodifies their existence" - you haven't told us what the harm is. If you think those things are in and of themselves harmful then I need to know more about what you mean because:

                  1. if someone thinks of me sexually without my consent I am not harmed
                  2. if someone sexualises me in their mind I am not harmed
3. I don't know what the "commodification of one's existence" can actually mean - I can't buy or sell "the existence of women" (does buying something's existence mean the same as buying the thing, or something else?) the same way I can aluminium, and I don't see how being able to (easily) make (realistic) nude images of someone changes this in any way

                  It is genuinely incredible to me that you could be so unempathetic,

I am not unempathetic, but I attribute the blame for what makes me feel bad about the situation to the fact that girls are being made to feel bad and ashamed, not to the fact that a particular technology is now being used in one step of that.

[email protected] · #87

I am just genuinely speechless that you seemingly do not understand how sickening and invasive it is for your peers to create and share sexual content of you without your consent. Yes, it's extremely harmful. It's not a matter of feeling ashamed; it's a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn't belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they've already decided that you're a sexual experience for them.

We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn't new, just a new streamlined way to spread it. It should be illegal. It should be against the law to turn someone's images into AI-generated pornography. It should also be illegal to share those images with others.

                  • D [email protected]

I don't fully understand how this technology works, but if people are using it to create sexual content of underage individuals, doesn't that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.

[email protected] · #88

                    I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.

                    Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.

Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that's really good at turning blurry faces into that particular person's face.

                    Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how, often with shockingly realistic results.
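The three steps above can be sketched numerically. This is a deliberately simplified toy, not a real deepfake pipeline: the "faces" are random vectors, the "blur" is patch averaging, and the trained network is stood in for by a nearest-neighbour lookup over the training set. What it does share with a real face-swap model is the key property the steps describe: a model trained to un-blur only one person's photos can only ever emit that person's face, whatever you feed it.

```python
import numpy as np

rng = np.random.default_rng(0)
D, N, K = 64, 50, 4                  # 8x8 "faces" flattened; 50 photos; 4-number blur

face_a = rng.normal(size=D)          # person A's underlying "identity"
photos_a = face_a + 0.1 * rng.normal(size=(N, D))   # step 1: many photos of A

def blur(imgs):
    # destructive blur: crush each 64-pixel image down to 4 patch averages
    return imgs.reshape(-1, K, D // K).mean(axis=-1)

blurred_a = blur(photos_a)

def unblur(blurred):
    # step 2, as a toy: return the training photo whose blur best matches.
    # A real deepfake autoencoder interpolates smoothly instead of looking
    # up, but it is likewise limited to faces from its training identity.
    i = int(np.argmin(np.linalg.norm(blurred_a - blurred, axis=1)))
    return photos_a[i]

face_b = rng.normal(size=D)          # a face the model has never seen
swapped = unblur(blur(face_b[None])) # step 3: "un-blur" B's photo with A's model

# the output is one of A's photos, so it lands far closer to A than to B
print(np.linalg.norm(swapped - face_a) < np.linalg.norm(swapped - face_b))  # True
```

The point of the sketch is that no understanding of person B is needed anywhere: the model's entire "knowledge" is person A's photos, which is why the technique needs so much less data than a large language model.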

                    • P [email protected]

                      Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

[email protected] · #89

                      God I'm glad I'm not a kid now. I never would have survived.

                      • S [email protected]

                        Yes, absolutely. But with recognition that a thirteen year old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we're ratcheting up the punishment, but still not adult prison.

[email protected] · #90

A written apology? They'll just use ChatGPT for that.

                        • L [email protected]

                          As a father of teenage girls, I don't necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

[email protected] · #91

There is a difference between ruining the life of a 13 year old boy, for the rest of his life, with no recourse and no expectations,

vs. scaring the shit out of him and making him work his ass off doing an ass-load of community service for a summer.

                          • L [email protected]

                            Hey so, at least in the US, drawings can absolutely be considered CSAM

[email protected] · #92

Well, US laws are all bullshit anyway, so that makes sense.

                            • K [email protected]

                              I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.

                              Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.

Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that's really good at turning blurry faces into that particular person's face.

                              Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how, often with shockingly realistic results.

[email protected] · #93

Cheers for the explanation; I had no idea that's how it works.

So it's even worse than @[email protected] thinks: the person creating the deepfake would then have to have access to CP if they want to deepfake it!

                              • S [email protected]

                                Yes, absolutely. But with recognition that a thirteen year old kid isn't a predator but a horny little kid. I'll let others determine what that punishment is, but I don't believe it's prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we're ratcheting up the punishment, but still not adult prison.

[email protected] · #94

                                I did say equitable punishment. Equivalent. Whatever.

                                A written apology is a cop-out for the damage this behaviour leaves behind.

                                Something tells me you don't have teenage daughters.

                                • S [email protected]

There is a difference between ruining the life of a 13 year old boy, for the rest of his life, with no recourse and no expectations,

vs. scaring the shit out of him and making him work his ass off doing an ass-load of community service for a summer.

[email protected] · #95

> ruining the life of a 13 year old boy for the rest of his life with no recourse

                                  And what about the life of the girl this boy would have ruined?

                                  This is not "boys will be boys" shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

                                  I don't think it's unreasonable to expect an equivalent punishment that has the potential to ruin his life.

In reply to [email protected]:

I am just genuinely speechless that you seemingly do not understand how sickening and invasive it is for your peers to create and share sexual content of you without your consent. Yes, it's extremely harmful. It's not a matter of feeling ashamed; it's a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn't belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they've already decided that you're a sexual experience for them.

We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn't new, just a new streamlined way to spread it. It should be illegal. It should be against the law to turn someone's images into AI-generated pornography. It should also be illegal to share those images with others.

[email protected] · #96

> It's not a matter of feeling ashamed; it's a matter of literally feeling like your value to the world is dictated by your role in the sexualities of heterosexual boys and men. It is feeling like your own body doesn't belong to you but can be freely claimed by others. It is losing trust in all your male friends and peers, because it feels like, without you knowing, they've already decided that you're a sexual experience for them.

                                    Why is it these things? Why does someone doing something with something which is not your body make it feel like your body doesn't belong to you? Why does it not instead make it feel like images of your body don't belong to you? Several of these things could equally be used to describe the situation when someone is fantasised about without their knowledge - why is that different?
                                    In Germany there's a legal concept called "right to one's own image" but there isn't in many other countries, and besides, what you're describing goes beyond this.

                                    My thinking behind these questions is that I cannot see anything inherent, anything necessary about the creation of fake sexual images of someone which leads to these harms, and that instead there is an aspect of our society which very explicitly punishes and shames people - woman far more so than men - for being in this situation, and that without that, we would be having a very different conversation.

                                    Starting from the position that the harm is in the creation of the images is like starting from the position that the harm of rape is in "defiling" the person raped. Rape isn't wrong because it makes you worthless to society - society is wrong for devaluing rape victims. Society is wrong for devaluing and shaming those who have fake images made of them.

> We do know the harm of this kind of sexualization. Women and girls have been talking about it for generations. This isn't new, just a new streamlined way to spread it. It should be illegal.

                                    Can you be more explicit about what it's the same as?

In reply to [email protected]:

                                      Are you OK with sexually explicit photos of children taken without their knowledge? They’re not being actively put in a sexual situation if you’re snapping photos with a hidden camera in a locker room, for example. You ok with that?

                                      The harm is:

                                      • Those photos now exist in the world and can lead to direct harm to the victim by their exposure
                                      • it normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images
                                      • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.
                                      • your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.
[email protected] · #97

> Are you OK with sexually explicit photos of children taken without their knowledge? They're not being actively put in a sexual situation if you're snapping photos with a hidden camera in a locker room, for example. You ok with that?

                                      No, but the harm certainly is not the same as CSAM and it should not be treated the same.

> • it normalizes pedophilia and creates a culture of trading images, leading to more abuse to meet demand for more images
> • The people sharing those photos learn to treat people like objects for their sexual gratification, ignoring their consent and agency. They are more likely to mistreat people they have learned to objectify.

As far as I know there is no good evidence that this is the case, and it is a big controversy in the topic of fake child porn, i.e. whether it leads to more child abuse (encouraging paedophiles), less (gives them a safe outlet), or no change.

> your body should not be used for the profit or gratification of others without your consent. In my mind this includes taking or using your picture without your consent.

                                      If someone fantasises about me without my consent I do not give a shit, and I don't think there's any justification for it. I would give a shit if it affected me somehow (this is your first bullet point, but for a different situation, to be clear) but that's different.

In reply to [email protected]:

Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.

It would of course be their choice to wear them, but I'd definitely look for ways to limit their time in areas with cameras present.

[email protected] · #98

That's just called the outside now. Assume you are on camera at all times the moment you step out the front door. Given the surveillance we live under today, to be safe you'd best act as though you are being recorded in your own home as well.

                                        • L [email protected]

                                          ruining the life of a 13 year old boy for the rest of his life with no recourse

                                          And what about the life of the girl this boy would have ruined?

                                          This is not "boys will be boys" shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

                                          I don't think it's unreasonable to expect an equivalent punishment that has the potential to ruin his life.

[email protected] · #99

It is not abnormal to see different punishments for people under the age of 18, along with good education about sex and about what sexual assault does to its victims (same with guns, drugs including alcohol, etc.).

You can still course-correct the behaviour of a 13 year old. There is also a difference between generating the porn and abusing it by sharing it, etc.

The girls should be helped and the boys should be punished, but mainly their behaviour needs to be corrected.
