agnos.is Forums


GenAI website goes dark after explicit fakes exposed

Technology · 102 posts · 50 posters
  • C [email protected]

    This is the type of shit that radicalizes me against generative AI. It's done so much more harm than good.

    B This user is from outside of this forum
    B This user is from outside of this forum
    [email protected]
    wrote on last edited by
    #4

    The craziest thing to me is there was movements to advocate the creation of CP through AI to help those addicted to it as it "wasn't real" and there were no victims involved in it. But no comments regarding how the LLM gained the models to generate those images or the damages that will come when such things get normalized.

    It just should never be normalized or exist.

    C ? chozo@fedia.ioC B 4 Replies Last reply
    0
    • B [email protected]

      The craziest thing to me is there was movements to advocate the creation of CP through AI to help those addicted to it as it "wasn't real" and there were no victims involved in it. But no comments regarding how the LLM gained the models to generate those images or the damages that will come when such things get normalized.

      It just should never be normalized or exist.

      C This user is from outside of this forum
      C This user is from outside of this forum
      [email protected]
      wrote on last edited by
      #5

      Anything like that involving children or child like individuals is a hard fucking no from me. It's like those mfs who have art of a little anime girl and go "actually shes a 5000 vampire." They know exactly what the fuck they're doing. I also hate the argument of "it's not real" like mf the sentiment is still there.

      1 Reply Last reply
      0
      • B [email protected]

        The craziest thing to me is there was movements to advocate the creation of CP through AI to help those addicted to it as it "wasn't real" and there were no victims involved in it. But no comments regarding how the LLM gained the models to generate those images or the damages that will come when such things get normalized.

        It just should never be normalized or exist.

        ? Offline
        ? Offline
        Guest
        wrote on last edited by
        #6

        Probably got all the data to train for it from the pentagon. They're known for having tons of it and a lot of their staff (more than 25%) are used to seeing it frequently.

        Easily searchable, though I don't like to search for that shit, but here's 1 post if you literally add pentagon to c____ p___ in a search a million articles on DIFFERENT subjects (than this house bill) come up https://thehill.com/policy/cybersecurity/451383-house-bill-aims-to-stop-use-of-pentagon-networks-for-sharing-child/

        swelter_spark@reddthat.comS 1 Reply Last reply
        0
[email protected] (#7), replying to [email protected]:

> AI CP. They found AI-generated CP that had been generated on their service...
>
> "Explicit fakes" makes it sound less bad.
>
> They were allowing AI CP to be made.

Is "CP" so you don't get flagged, or is it for sensitivity?
[email protected] (#8), replying to [email protected] (#7):

I don't like saying the full phrase; it's a disgusting merger of words that shouldn't exist.
            • B [email protected]

              The craziest thing to me is there was movements to advocate the creation of CP through AI to help those addicted to it as it "wasn't real" and there were no victims involved in it. But no comments regarding how the LLM gained the models to generate those images or the damages that will come when such things get normalized.

              It just should never be normalized or exist.

              chozo@fedia.ioC This user is from outside of this forum
              chozo@fedia.ioC This user is from outside of this forum
              [email protected]
              wrote on last edited by
              #9

              Just for what it's worth, you don't need CSAM in the training material for a generative AI to produce CSAM. The models know what children look like, and what naked adults look like, so they can readily extrapolate from there.

              The fact that you don't need to actually supply any real CSAM to the training material is the reasoning being offered for supporting AI CSAM. It's gross, but it's also hard to argue with. We allow for all types of illegal subjects to be presented in porn; incest, rape, murder, etc. While most mainstream sites won't allow those types of material, none of them are technically outlawed - partly because of freedom of speech and artistic expression and yadda yadda, but also partly because it all comes with the understanding that it's a fake, made-for-film production and that nobody involved had their consent violated, so it's okay because none of it was actually rape, incest, or murder. And if AI CSAM can be made without actually violating the consent of any real people, then what makes it different?

              I don't know how I feel about it, myself. The idea of "ethically-sourced" CSAM doesn't exactly sit right with me, but if it's possible to make it in a truly victimless manner, then I find it hard to argue outright banning something just because I don't like it.

              C 1 Reply Last reply
              0
[email protected] (#10), replying to the original post by [email protected] (which contained no text):

Who actually gets hurt by AI-generated CP? The servers?
                • J [email protected]

                  Who actually gets hurt in AI generated cp? The servers?

                  B This user is from outside of this forum
                  B This user is from outside of this forum
                  [email protected]
                  wrote on last edited by
                  #11

                  J 1 Reply Last reply
                  0
                  • B [email protected]

                    J This user is from outside of this forum
                    J This user is from outside of this forum
                    [email protected]
                    wrote on last edited by
                    #12

                    I'm no pedo, but what you do in your own home and hurts nobody is your own thing.

                    reseller_pledge609@lemmy.dbzer0.comR 1 Reply Last reply
                    0
                    • J [email protected]

                      I'm no pedo, but what you do in your own home and hurts nobody is your own thing.

                      reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                      reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                      [email protected]
                      wrote on last edited by
                      #13

                      Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.

                      So, regardless of direct harm or not, harm is done at some point in the process and it needs to be stopped before it slips and gets worse because people "get used to" it.

                      K M M 3 Replies Last reply
                      0
[email protected] (#14), replying to [email protected] (#9):

There is also the angle that generated CSAM looking real adds difficulty to prosecuting producers of real CSAM.
[email protected] (#15):

It's a very difficult subject; both sides have merit. I can see "CSAM created without abuse could be used in the treatment/management of people with these horrible urges," but I can also see "allowing people to create CSAM could normalise it and lead to more actual abuse."

Sadly, it's incredibly difficult for academics to study this subject and see which of those two is more prevalent.
[email protected] (#16), replying to [email protected] (#13):

I wouldn't think it needs to have child porn in the training data to be able to generate it. It has porn as data, and it knows what kids look like; merge the two. I think that works for anything the AI knows about: make this resemble that.
[email protected] (#17), replying to [email protected] (#13):

AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.

This is the same reason it can do something like Godzilla with Sailor Moon's hair: not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.
                              • M [email protected]

                                Ai can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.

                                This is the same reason it can do something like Godzilla with Sailor Moon's hair, not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.

                                reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                                reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                                [email protected]
                                wrote on last edited by
                                #18

                                Fair enough. I still think it shouldn't be allowed though.

                                F 1 Reply Last reply
                                0
                                • K [email protected]

                                  I wouldn't think it needs to have child porn in the training data to be able to generate it. It has porn as the data, it knows what kids look like, merge the two. I think that works for anything AI knows about, make this resemble this.

                                  reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                                  reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                                  [email protected]
                                  wrote on last edited by
                                  #19

                                  That's fair, but I still think it shouldn't be accepted or allowed.

                                  K J 2 Replies Last reply
                                  0
[email protected] (#20), replying to [email protected] (#13):

> needs to be stopped before it slips and gets worse because people "get used to" it.

Ah, right, I almost forgot the "killer games" rhetoric.
                                    • M [email protected]

                                      needs to be stopped before it slips and gets worse because people "get used to" it.

                                      Ah, right, almost finally forgot the killer games rhetoric.

                                      reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                                      reseller_pledge609@lemmy.dbzer0.comR This user is from outside of this forum
                                      [email protected]
                                      wrote on last edited by
                                      #21

                                      I also don't agree with the killer games thing, but humans are very adaptable as a species.

                                      Normally that's a good thing, but in a case like this exposure to something shocking or upsetting can make it less shocking or upsetting over time (obviously not in every case). So, if AI is being used for something like this and being reported on isn't it possible that people might slowly get desensitized to it over time?

                                      M 1 Reply Last reply
                                      0
                                      • M [email protected]

                                        Ai can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.

                                        This is the same reason it can do something like Godzilla with Sailor Moon's hair, not because it trained on images of Godzilla with Sailor Moon's hair, but because it can combine those two separate things.

                                        R This user is from outside of this forum
                                        R This user is from outside of this forum
                                        [email protected]
                                        wrote on last edited by
                                        #22

                                        Only the real things are actual humans who have likely not consented to ever being in this database at all let alone having parts of their likeness being used for this horrific shit. There is no moral argument for this garbage:

                                        J 1 Reply Last reply
                                        0
[email protected] (#23), replying to [email protected] (#8):

FYI, the currently accepted term is CSAM: child sexual abuse material. The reason "CP" is the wrong term is that "porn" implies, or should imply, consent to the sexual act, and children cannot consent.

You're right: it's a disgusting merger exactly because it implies something that's absolutely incorrect and wrong.