agnos.is Forums

GenAI website goes dark after explicit fakes exposed

102 Posts, 50 Posters
  • S [email protected]

    If you think that AI is only trained on legal images, I can't convince you otherwise.

    E This user is from outside of this forum
    E This user is from outside of this forum
    [email protected]
    wrote on last edited by
    #72

    What AI are you talking about? Are you suggesting the commercial models from OpenAI are trained using CP? Or just that there are some models out there that were trained using CP? Because yeah, anyone can create a model at home and train it with whatever. But suggesting that OpenAI has a DB of tagged CP is a different story.

    S 1 Reply Last reply
    0
    • K [email protected]

      Wow so its from the duh region I'm france, here I thought it was just sparkling dumbass

      M This user is from outside of this forum
      M This user is from outside of this forum
      [email protected]
      wrote on last edited by
      #73

      Nope, it's the fully alcoholic dumbass, not that shitty grape juice variety!

      1 Reply Last reply
      0
      • M [email protected]

        This, above any other reason, is why I'm most troubled with AI CSAM. I don't care what anyone gets off to if no one is harmed, but the fact that real CSAM could be created and be indistinguishable from AI created, is a real harm.

        And I instinctively ask, who would bother producing it for real when AI is cheap and harmless? But people produce it for reasons other than money and there are places in this world where a child's life is probably less valuable than the electricity used to create images.

        I fundamentally think AI should be completely uncensored. Because I think censorship limits and harms uses for it that might otherwise be good. I think if 12 year old me could've had an AI show me where the clitoris is on a girl or what the fuck a hymen looks like, or answer questions about my own body, I think I would've had a lot less confusion and uncertainty in my burgeoning sexuality. Maybe I'd have had less curiosity about what my classmates looked like under their clothes, leading to questionable decisions on my part.

        I can find a million arguments why AI shouldn't be censored. Like, do you know ChatGPT can be convinced to describe vaginal and oral sex in a romantic fiction is fine, but if it's anal sex, it has a much higher refusal rate? Is that subtle anti-gay encoding in the training data? It also struggles with polyamory when it's two men and a woman but less when it's two women and a man. What's the long-term impact when these biases are built into everyday tools? These are concerns I consider all the time.

        But at the end of the day, the idea that there are children out there being abused and consumed and no one will even look for them because "it's probably just AI" isn't something I can bear no matter how firm my convictions are about uncensored AI. It's something I struggle to reconcile.

        M This user is from outside of this forum
        M This user is from outside of this forum
        [email protected]
        wrote on last edited by
        #74

        Maybe the weird, extra human finger and appendage issues in AI images are a feature, not bugs. Maybe it's a naturally occurring, unintended consequences of their learning and feedback process to sabotage the output they generate in order to make it obvious the image is fake.

        /s (sort of)

        1 Reply Last reply
        0
        • F [email protected]

          FYI, the current accepted term is csam. Children sexual abuse material. The reason why CP is wrong is that porn implies, or should imply, that there's consent on the sexual act, and children cannot consent.

          You are right, it's a disgusting merger exactly because it implies something that's absolutely incorrect and wrong.

          T This user is from outside of this forum
          T This user is from outside of this forum
          [email protected]
          wrote on last edited by
          #75

          If we are pedantic, I'm not sure if "children cannot consent" is correct. Children at 16 are mature enough to give consent in legal context, we as a society just frown upon older adults mingling with them.

          B S F 3 Replies Last reply
          0

  • In reply to [email protected]:

    Pictures of clothed children and naked adults.

    Nobody trained them on what things made out of spaghetti look like, but they can generate them because smushing multiple things together is precisely what they do.

    [email protected] wrote (#76):

    Given the "we spared no expense" attitude to the rest of the data these things are trained on, I fear that may be wishful thinking...
            • E [email protected]

              What AI are you talking about? Are you suggesting the commercial models from OpenAI are trained using CP? Or just that there are some models out there that were trained using CP? Because yeah, anyone can create a model at home and train it with whatever. But suggesting that OpenAI has a DB of tagged CP is a different story.

              S This user is from outside of this forum
              S This user is from outside of this forum
              [email protected]
              wrote on last edited by
              #77

              Open AI just scours the Internet. 100% chance it's come across someone illegal and horrible. They don't pre-approve its training data.

              E 1 Reply Last reply
              0
              • J [email protected]

                I mean, you're not giving a very convincing argument.

                S This user is from outside of this forum
                S This user is from outside of this forum
                [email protected]
                wrote on last edited by
                #78

                AI models are trained on the open Internet. Not curated. Open Internet has horrible things.

                J 1 Reply Last reply
                0
                • S [email protected]

                  Open AI just scours the Internet. 100% chance it's come across someone illegal and horrible. They don't pre-approve its training data.

                  E This user is from outside of this forum
                  E This user is from outside of this forum
                  [email protected]
                  wrote on last edited by
                  #79

                  But you have to describe it. It doesn't just suck in images at random. I imagine someone will remove CP when the images are reviewed. Or do you think they just download all images and add them to the training set without even looking at them?

                  S 1 Reply Last reply
                  0
                  • T [email protected]

                    If we are pedantic, I'm not sure if "children cannot consent" is correct. Children at 16 are mature enough to give consent in legal context, we as a society just frown upon older adults mingling with them.

                    B This user is from outside of this forum
                    B This user is from outside of this forum
                    [email protected]
                    wrote on last edited by
                    #80

                    buddy, theres a whole range of ages between 0 and 16, and i use that number 0 advisedly.

                    1 Reply Last reply
                    0
                    • T [email protected]

                      If we are pedantic, I'm not sure if "children cannot consent" is correct. Children at 16 are mature enough to give consent in legal context, we as a society just frown upon older adults mingling with them.

                      S This user is from outside of this forum
                      S This user is from outside of this forum
                      [email protected]
                      wrote on last edited by
                      #81

                      When I was 16 I would have totally posed for porn and I would have been completely consenting. But it would have been illegal. I wonder where we should draw the line, and if the current one is the best one.

                      1 Reply Last reply
                      0

  • In reply to [email protected]:

    When my dad worked for the DoD, he was assigned a laptop for work that had explicit photos of children on it.

    [email protected] wrote (#82):

    For what purpose would they do that?
                        • T [email protected]

                          If we are pedantic, I'm not sure if "children cannot consent" is correct. Children at 16 are mature enough to give consent in legal context, we as a society just frown upon older adults mingling with them.

                          F This user is from outside of this forum
                          F This user is from outside of this forum
                          [email protected]
                          wrote on last edited by
                          #83

                          Legally speaking children can't consent, which is why it's illegal and the basis of my statement. I wasn't being pedantic, I was showing a new terminology.

                          1 Reply Last reply
                          0
                          • M [email protected]

                            Generated AI CP should be illegalized even if its creation did not technically harm anyone. The reason is, presumably it looks too close to real CP, so close that it: 1) normalizes consumption of CP, 2) grows a market for CP, and 3) Real CP could get off the hook by claiming it is AI.

                            While there are similar reasons to be against clearly not real CP (e.g. hentai), this type at least does not have problem #3. For example, there doesnt need to be an investigation into whether a picture is real or not.

                            N This user is from outside of this forum
                            N This user is from outside of this forum
                            [email protected]
                            wrote on last edited by
                            #84

                            Fun fact it's already illegal. If it's indistinguishable from the real thing it's a crime.

                            R 1 Reply Last reply
                            0
                            • S [email protected]

                              AI models are trained on the open Internet. Not curated. Open Internet has horrible things.

                              J This user is from outside of this forum
                              J This user is from outside of this forum
                              [email protected]
                              wrote on last edited by
                              #85

                              So is that the Gen AI problem or the open internets problem. It sounds like you hate the open internet and awful people who put real cp online and not Gen AI.

                              S 1 Reply Last reply
                              0
                              • I [email protected]

                                What the fuck is AI being trained on to produce the stuff?

                                F This user is from outside of this forum
                                F This user is from outside of this forum
                                [email protected]
                                wrote on last edited by
                                #86

                                if you have a soup of all liquids and a sieve that only lets coffee and ice cream through it produces coffee ice cream (metaphor, don't think too hard about it)

                                that's how gen ai works. each step sieves out raw data to get closer to the prompt.

                                1 Reply Last reply
                                0
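
To make the "sieve" metaphor above a bit more concrete, here is a toy sketch of the idea in Python. It is not how any real image generator is implemented: numpy, the toy_score function, the step count, and the vector standing in for an embedded text prompt are all invented for illustration. It only shows the general loop of starting from noise and nudging a sample a little closer to the prompt at every step.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_score(sample: np.ndarray, prompt_vec: np.ndarray) -> np.ndarray:
    # Stand-in for a trained denoising network: it simply points from the
    # current sample toward the prompt vector.
    return prompt_vec - sample

def generate(prompt_vec: np.ndarray, steps: int = 50, step_size: float = 0.1) -> np.ndarray:
    # Start from pure noise ("the soup") and refine it step by step.
    sample = rng.normal(size=prompt_vec.shape)
    for _ in range(steps):
        # Each pass "sieves" the sample a little closer to what the prompt asks for.
        sample = sample + step_size * toy_score(sample, prompt_vec)
    return sample

prompt = np.array([1.0, -2.0, 0.5])  # pretend this is an embedded text prompt
print(generate(prompt))              # ends up close to the prompt vector
```

The point of the loop is only that nothing is "looked up" whole: the output is whatever survives many small filtering steps toward the prompt.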
                                • M [email protected]

                                  But what if pedophiles in therapy are less likely to commit a crime if they have access to respective porn? Even better then, if it can be AI generated, no?

                                  J This user is from outside of this forum
                                  J This user is from outside of this forum
                                  [email protected]
                                  wrote on last edited by
                                  #87

                                  Japan is country that has legal drawn cp. It's available in physical store and online. Yet Japan is much lower than most developed country in the world in terms of actual sexual child abuse.

                                  1 Reply Last reply
                                  0
                                  • S [email protected]

                                    For what scope would they do that?

                                    swelter_spark@reddthat.comS This user is from outside of this forum
                                    swelter_spark@reddthat.comS This user is from outside of this forum
                                    [email protected]
                                    wrote on last edited by
                                    #88

                                    All laptops are supposed to be formatted and have the necessary software freshly installed before being assigned to someone. Either it wasn't wiped by accident, or the person whose job it was found the CP and left it, hoping my dad would report it. He deleted it, though, because was afraid he'd be blamed.

                                    1 Reply Last reply
                                    0
                                    • J [email protected]

                                      But who is actually getting hurt? No kid has gotten hurt using Gen AI.

                                      O This user is from outside of this forum
                                      O This user is from outside of this forum
                                      [email protected]
                                      wrote on last edited by
                                      #89

                                      A child whose abuse images are used to generate AI CP can be re-victimized by it, without even getting at the issues with normalizing it.

                                      J 1 Reply Last reply
                                      0
                                      • J [email protected]

                                        The biggest issue with this line of thinking is, how do you prove it's CP without a victim. I suppose at a certain threshold it becomes obvious, but that can be a very blurry line (there was a famous case where a porn star had to be flown to a court case to prove the video wasn't CP, but can't find the link right now).

                                        So your left with a crime that was committed with no victim and no proof, which can be really easy to abuse.

                                        C This user is from outside of this forum
                                        C This user is from outside of this forum
                                        [email protected]
                                        wrote on last edited by
                                        #90

                                        This sort of reminds myself on the discussion on "what is a women". Is Siri a women? Many might say so, but t the same time Siri is not even human.

                                        The question on how old the person on a specific generated image might be and if it even depicts a person at all, can only be answered through society. There is no scientific or any logical answer for this.

                                        So this will always have grey areas and differing opinions and can be rulings in different cultures.

                                        In the end it is about discussions about ethics not logic.

                                        J 1 Reply Last reply
                                        0
                                        • C [email protected]

                                          This sort of reminds myself on the discussion on "what is a women". Is Siri a women? Many might say so, but t the same time Siri is not even human.

                                          The question on how old the person on a specific generated image might be and if it even depicts a person at all, can only be answered through society. There is no scientific or any logical answer for this.

                                          So this will always have grey areas and differing opinions and can be rulings in different cultures.

                                          In the end it is about discussions about ethics not logic.

                                          J This user is from outside of this forum
                                          J This user is from outside of this forum
                                          [email protected]
                                          wrote on last edited by
                                          #91

                                          Definitely, and that's why hard/strict laws or rules can be dangerous. Much like the famous "I know it when I see it" judgment on obscenity.

                                          C 1 Reply Last reply
                                          0