agnos.is Forums

GenAI website goes dark after explicit fakes exposed

Technology · 102 Posts · 50 Posters
  • J [email protected]

    Who actually gets hurt in AI generated cp? The servers?

    B This user is from outside of this forum
    B This user is from outside of this forum
    [email protected]
    wrote on last edited by
    #11

    J 1 Reply Last reply
    0
    • B [email protected]

      J This user is from outside of this forum
      J This user is from outside of this forum
      [email protected]
      wrote on last edited by
      #12

      I'm no pedo, but what you do in your own home and hurts nobody is your own thing.

      reseller_pledge609@lemmy.dbzer0.comR 1 Reply Last reply
      0
      • J [email protected]

        I'm no pedo, but what you do in your own home and hurts nobody is your own thing.

        reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
        reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
        [email protected]
        wrote on last edited by
        #13

        Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.

        So, regardless of direct harm or not, harm is done at some point in the process and it needs to be stopped before it slips and gets worse because people "get used to" it.

        K M M 3 Replies Last reply
        0
• #14 — [email protected], replying to [email protected]:

  > Just for what it's worth, you don't need CSAM in the training material for a generative AI to produce CSAM. The models know what children look like, and what naked adults look like, so they can readily extrapolate from there.
  >
  > The fact that you don't need to actually supply any real CSAM to the training material is the reasoning being offered for supporting AI CSAM. It's gross, but it's also hard to argue with. We allow all types of illegal subjects to be presented in porn: incest, rape, murder, etc. While most mainstream sites won't allow those types of material, none of them are technically outlawed - partly because of freedom of speech and artistic expression and yadda yadda, but also partly because it all comes with the understanding that it's a fake, made-for-film production and that nobody involved had their consent violated, so it's okay because none of it was actually rape, incest, or murder. And if AI CSAM can be made without actually violating the consent of any real people, then what makes it different?
  >
  > I don't know how I feel about it, myself. The idea of "ethically-sourced" CSAM doesn't exactly sit right with me, but if it's possible to make it in a truly victimless manner, then I find it hard to argue for outright banning something just because I don't like it.

  There is also the angle that generated CSAM looking real adds difficulty to prosecuting real CSAM producers.
• #15 — [email protected]:

  It's a very difficult subject; both sides have merit. I can see the "CSAM created without abuse could be used in treatment/management of people with these horrible urges" argument, but I can also see "allowing people to create CSAM could normalise it and lead to more actual abuse".

  Sadly, it's incredibly difficult for academics to study this subject and see which of those two effects is more prevalent.
• #16 — [email protected], replying to [email protected]:

  > Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.
  >
  > So, regardless of direct harm or not, harm is done at some point in the process, and it needs to be stopped before it slips and gets worse because people "get used to" it.

  I wouldn't think it needs to have child porn in the training data to be able to generate it. It has porn as the data, it knows what kids look like; merge the two. I think that works for anything AI knows about: make this resemble this.
• #17 — [email protected], replying to [email protected]:

  > Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.
  >
  > So, regardless of direct harm or not, harm is done at some point in the process, and it needs to be stopped before it slips and gets worse because people "get used to" it.

  AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.

  This is the same reason it can do something like Godzilla with Sailor Moon's hair: not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.
                • M [email protected]

                  Ai can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.

                  This is the same reason it can do something like Godzilla with Sailor Moon's hair, not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.

                  reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                  reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                  [email protected]
                  wrote on last edited by
                  #18

                  Fair enough. I still think it shouldn't be allowed though.

                  F 1 Reply Last reply
                  0
                  • K [email protected]

                    I wouldn't think it needs to have child porn in the training data to be able to generate it. It has porn as the data, it knows what kids look like, merge the two. I think that works for anything AI knows about, make this resemble this.

                    reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                    reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                    [email protected]
                    wrote on last edited by
                    #19

                    That's fair, but I still think it shouldn't be accepted or allowed.

                    K J 2 Replies Last reply
                    0
• #20 — [email protected], replying to [email protected]:

  > needs to be stopped before it slips and gets worse because people "get used to" it.

  Ah, right, I almost forgot the killer games rhetoric.
                      • M [email protected]

                        needs to be stopped before it slips and gets worse because people "get used to" it.

                        Ah, right, almost finally forgot the killer games rhetoric.

                        reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                        reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                        [email protected]
                        wrote on last edited by
                        #21

                        I also don't agree with the killer games thing, but humans are very adaptable as a species.

                        Normally that's a good thing, but in a case like this exposure to something shocking or upsetting can make it less shocking or upsetting over time (obviously not in every case). So, if AI is being used for something like this and being reported on isn't it possible that people might slowly get desensitized to it over time?

                        M 1 Reply Last reply
                        0
                        • M [email protected]

                          Ai can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.

                          This is the same reason it can do something like Godzilla with Sailor Moon's hair, not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.

                          R This user is from outside of this forum
                          R This user is from outside of this forum
                          [email protected]
                          wrote on last edited by
                          #22

                          Only the real things are actual humans who have likely not consented to ever being in this database at all let alone having parts of their likeness being used for this horrific shit. There is no moral argument for this garbage:

                          J 1 Reply Last reply
                          0
• #23 — [email protected], replying to [email protected]:

  > I don't like saying the full phrase, it's a disgusting merger of words that shouldn't exist.

  FYI, the currently accepted term is CSAM: child sexual abuse material. The reason "CP" is wrong is that porn implies, or should imply, that there is consent to the sexual act, and children cannot consent.

  You are right, it's a disgusting merger exactly because it implies something that's absolutely incorrect and wrong.
• #24 — [email protected], replying to [email protected]:

  > So, if AI is being used for something like this and being reported on, isn't it possible that people might slowly get desensitized to it over time?

  But what if pedophiles in therapy are less likely to commit a crime if they have access to respective porn? Even better, then, if it can be AI generated, no?
• #25 — [email protected], replying to [email protected]:

  > That's fair, but I still think it shouldn't be accepted or allowed.

  It seems pretty understandable that companies wouldn't allow it; it's more that if it is illegal (like in some places), then that gets into really sketchy territory, imo.
• #26 — [email protected], replying to [email protected]:

  > Fair enough. I still think it shouldn't be allowed, though.

  Why? Not pressing, just curious what the logic is.
                                  • B [email protected]

                                    The craziest thing to me is there was movements to advocate the creation of CP through AI to help those addicted to it as it "wasn't real" and there were no victims involved in it. But no comments regarding how the LLM gained the models to generate those images or the damages that will come when such things get normalized.

                                    It just should never be normalized or exist.

                                    B This user is from outside of this forum
                                    B This user is from outside of this forum
                                    [email protected]
                                    wrote on last edited by
                                    #27

                                    Nuanced take coming, take a breath:

                                    I agree that Child Sexual Abuse is a horrible practice along with all other violence and oppression, sexual or not. But the attraction de facto exists and has done for thousands of years, even through intense taboos. It seems our current strategy of shaming and ignoring it has been ineffective. The definition of insanity being repeating the same thing expecting different results and all that.

                                    Short of eugenics (and from previous trials maybe not even then) we might not be able to get rid of it.

                                    So when do we try other ways of dealing with it?

                                    I'm not saying generative AI is the solution, but I'm pretty sure denying harder isn't it either.

                                    L 1 Reply Last reply
                                    0
• #28 — [email protected], replying to [email protected]:

  > I don't like saying the full phrase, it's a disgusting merger of words that shouldn't exist.

  Very true, thanks for your sensitivity @dumbass
• #29 — [email protected], replying to [email protected]:

  > I don't like saying the full phrase, it's a disgusting merger of words that shouldn't exist.

  And it's wrong, too. It's not pornography, it's rape.
                                        • J [email protected]

                                          Who actually gets hurt in AI generated cp? The servers?

                                          S This user is from outside of this forum
                                          S This user is from outside of this forum
                                          [email protected]
                                          wrote on last edited by
                                          #30

                                          All the little girls it learned from.

                                          J 1 Reply Last reply
                                          0