agnos.is Forums

First off, I am sex positive, pro porn, pro sex work; I don't believe sex work should be shameful, and there is nothing **inherently** wrong with buying intimacy from a willing seller.

Technology · 83 Posts · 27 Posters

This topic has been deleted. Only users with topic management privileges can see it.
  • K [email protected]

    With any model in use, currently, that is impossible to meet. All models are trained on real images.

    yes, but if I go to thispersondoesnotexist.com and generate a random person, is that going to resemble the likeness of any given real person closely enough to perceptibly be them?

    You are literally using the schizo argument right now. "If an artist creates a piece depicting no specific person, but his understanding of persons is based inherently on the facial structures of other people that he knows and recognizes, therefore he must be stealing their likeness."

    [email protected] · #65

    No, the problem is a lack of consent of the person being used.

    And now, being used to generate depictions of rape and CSAM.

    • W [email protected]

      without a victim

      You are wrong.

      AI media models have to be trained on real media. The illegal content would mean illegal media, and benefiting/supporting/profiting from a crime at minimum.

      [email protected] · #66

      Excuse me? I am very offended by your insinuations here. It honestly makes me not want to share my thoughts and opinions at all. I am not in any way interested in this kind of content.

      I encourage you to read my other posts in the different threads here and see. I am not an apologist, and do not condone it either.

      I do genuinely believe AI can generate content it is not trained on, that's why I claimed it can generate illegal content without a victim. Because it can combine stuff from things it is trained on and end up with something original.

      I am interested in learning and discussing the consequences of an emerging and novel technology on society. This is a part of that. Even if it is uncomfortable to discuss.

      You made me wish I didn't..

      • M [email protected]

        Excuse me? I am very offended by your insinuations here. It honestly makes me not want to share my thought and opinions at all. I am not in any way interested in this kind of content.

        I encourage you to read my other posts in the different threads here and see. I am not an apologist, and do not condone it either.

        I do genuinely believe AI can generate content it is not trained on, that's why I claimed it can generate illegal content without a victim. Because it can combine stuff from things it is trained on and end up with something original.

        I am interested in learning and discussing the consequences of an emerging and novel technology on society. This is a part of that. Even if it is uncomfortable to discuss.

        You made me wish I didn't..

        Y This user is from outside of this forum
        Y This user is from outside of this forum
        [email protected]
        wrote on last edited by
        #67

        Don't pay any attention to that kind of stupid comment. Anyone posting that kind of misinformation about AI is either trolling or incapable of understanding how generative AI works.

        You are right that it is a victimless crime (for the creation of the content). I could create porn with Minions without using real Minion porn, to pick the most random example I could think of. There's the whole defamation issue of publishing content without someone's permission, but I feel that discussion is independent of AI (we could already create nasty images of someone before AI; AI just makes it easier). But using such content for personal use... it is victimless. I have a hard time arguing against it.

        Would the availability of AI-created content with unethical themes allow people to get that out of their system without creating victims? Would it make the far riskier and more horrible business of creating illegal content with real, unwilling people disappear, or at the very least become much more uncommon? Or would it make people more willing to consume the content, creating a feeling of false safety towards content that was previously illegal? There are a lot of implications that we should really be thinking about, and how they would affect society, for better or worse...

        • Y [email protected]

          Don't pay any attention to that kinda stupid comment. Anyone posting that kind of misinformation about AI is either trolling or incapable of understanding how generative AI works.

          You are right it is a victimless crime (for the creation of content). I could create porn with minions without using real minion porn to put the randomnest example I could think of. There's the whole defamation thing of publishing content without someone's permission but that I feel is a discussion irrelevant of AI (we could already create nasty images of someone before AI, AI just makes it easier). But using such content for personal use... It is victimless. I have a hard time thinking against it. Would availability of AI created content with unethical themes allow people to get that out of their system without creating victims? Would that make the far riskier and horrible business of creating illegal content with real unwilful people disappear? Or at the very least much more uncommon? Or would make people more willing to consume thw content creating a feelibg of fake safety towards content previously illegal? There's a lot of implications that we should really be thinking about and how it would affect society, for better or worse...

          M This user is from outside of this forum
          M This user is from outside of this forum
          [email protected]
          wrote on last edited by
          #68

          Don't pay any attention to that kinda stupid comment. Anyone posting that kind of misinformation about AI is either trolling or incapable of understanding how generative AI works

          Thank you 😊

          • J [email protected]

            That's the same as saying we shouldn't be able to make videos with murder in them because there is no way to tell if they're real or not.

            [email protected] · #69

            That’s a good point, but there’s much less of a market for a murder-video industry.

            • F [email protected]

              That’s a good point, but there’s much less of a market for murder video industry

              J This user is from outside of this forum
              J This user is from outside of this forum
              [email protected]
              wrote on last edited by
              #70

              I mean, a lot of TV has murders in it. There is a huge market for showing realistic murder.

              But I get the feeling you're saying that there isn't a huge market for showing real people dying realistically without their permission. But that's more of a technicality. The question is: is the content, or the production of the content, illegal? If it's not a real person, who is the victim of the crime?

              • J [email protected]

                I mean, a lot of TV has murders in it. There is a huge market for showing realistic murder.

                But I get the feeling your saying that there isn't a huge market for showing real people dying realistically without their permission. But that's more a technicality. The question is, is the content or the production of the content illegal. If it's not a real person, who is the victim of the crime.

                F This user is from outside of this forum
                F This user is from outside of this forum
                [email protected]
                wrote on last edited by
                #71

                Yeah, the latter. Also, murder in films is for the most part storytelling. It’s not murder simulations for serial killers to get off to, you know what I mean?

  • [email protected] wrote:

                  There is no ethical consumption while living a capitalist way of life.

                  [email protected] · #72

                  😆 As if this has something to do with that.

                  But to your argument: it is perfectly possible to tune capitalism using laws to make it very social.

                  I mean, every “actually existing communist country” is at its core still a capitalist system; or how would you argue against that?

                  • S [email protected]

                    We could be sure of it if AI curated its inputs, which really isn't too much to ask.

                    [email protected] · #73

                    Well, AI is by design not able to curate its own training data, but the companies training the models would in theory be able to. It is just not feasible to sanitise such a huge stack of data.
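                    For concreteness, exact-match curation against a blocklist of known-bad content is mechanically simple; the infeasible part is building and maintaining coverage over billions of items. Below is a minimal sketch under that framing; the function names are illustrative, and the SHA-256 fingerprint is a toy stand-in for the perceptual hashes a real pipeline would need (those must survive re-encoding and resizing, which an exact hash does not):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 only catches exact byte copies.
    return hashlib.sha256(data).hexdigest()

def curate(corpus: list[bytes], blocklist: set[str]) -> list[bytes]:
    # Keep only items whose fingerprint is not on the blocklist.
    return [item for item in corpus if fingerprint(item) not in blocklist]

# Toy corpus of three "images"; one is on the blocklist.
corpus = [b"image-a", b"image-b", b"image-c"]
blocklist = {fingerprint(b"image-b")}
clean = curate(corpus, blocklist)
print(len(clean))  # 2 of 3 items survive
```

                    Real hash-matching systems work on this principle, but the hard parts this sketch skips (robust perceptual hashing, maintaining the blocklist, and the sheer volume of a web-scale crawl) are exactly why full sanitisation is considered infeasible.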

                    • U [email protected]

                      No, the problem is a lack of consent of the person being used.

                      And now, being used to generate depictions of rape and CSAM.

                      [email protected] · #74

                      yeah but like, legally, is this even a valid argument? Sure, there is technically probably like 0.0001% of the average person being used in any given result of an AI-generated image. I don't think that gives anyone explicit rights to that portion, however.

                      That's like arguing that a photographer who captured you in a random photo in public that became super famous is now required to pay you royalties for being in that image, even though you are literally just a random fucking person.

                      You can argue about consent all you want, but at the end of the day, if you're posting images of yourself online, you are consenting to other people looking at them, at a minimum. Arguably, you're implicitly consenting to other people being able to use those images (because you can't stop people from doing that, except via copyright, and that's not very strict in most cases).

                      And now, being used to generate depictions of rape and CSAM.

                      I don't see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim; otherwise it's no different than me editing an image of someone to make it look like they got shot in the face. Is that shitty? Sure. But I don't know of any laws that prevent you from doing that, unless it's explicitly to do with something like blackmail, extortion, or harassment.

                      • D [email protected]

                        But the thing is, it's not a relevant law here at all, as nothing is being distributed and no one is being harmed. Would you say the same thing if AI were not involved? Sure, it can be creepy and weird and whatnot, but it's not inherently harmful, or at least it's not obvious how it would be.

                        [email protected] · #75

                        The only perceivable reason to create these videos is either private consumption, in which case, who gives a fuck, or public distribution; otherwise you wouldn't create them. And you'd have to be a bit of a weird breed to create AI porn of specific people for private consumption.

                        If AI isn't involved, the same general principles would apply, except it might include more people now.

                        • K [email protected]

                          the only perceivable reason to create these videos is either for private consumption, in which case, who gives a fuck. Or for public distribution, otherwise you wouldn't create them. And you'd have to be a bit of a weird breed to create AI porn of specific people for private consumption.

                          If AI isn't involved, the same general principles would apply, except it might include more people now.

                          D This user is from outside of this forum
                          D This user is from outside of this forum
                          [email protected]
                          wrote on last edited by
                          #76

                          I've been thinking about this more, and I think one interesting argument here is "toxic culture growth". As in: even if the thing is not distributed, it might grow undesired toxic cultures through indirect exposure (like social media or forum discussions), even without direct sharing.

                          I think this is a slippery slope to the point of government mind control, but maybe there's something valuable to research here either way.

                          • K [email protected]

                            yeah but like, legally, is this even a valid argument? Sure there is techically probably like 0.0001% of the average person being used in any given result of an AI generated image. I don't think that gives anyone explicit rights to that portion however.

                            That's like arguing that a photographer who captured you in a random photo in public that became super famous is now required to pay you royalties for being in that image, even though you are literally just a random fucking person.

                            You can argue about consent all you want, but at the end of the day if you're posting images of yourself online, you are consenting to other people looking at them, at a minimum. Arguably implicitly consenting to other people being able to use those images. (because you can't stop people from doing that, except for copyright, but that's not very strict in most cases)

                            And now, being used to generate depictions of rape and CSAM.

                            i dont see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim, otherwise it's no different than me editing an image of someone to make it look like they got shot in the face. Is that shitty? Sure. But i don't know of any laws that prevent you from doing that, unless it's explicitly to do with something like blackmail, extortion, or harassment.

                            U This user is from outside of this forum
                            U This user is from outside of this forum
                            [email protected]
                            wrote on last edited by
                            #77

                            yeah but like, legally, is this even a valid argument?

                            Personally, legal is only what the law allows the wealthy to do, and provides punishments for the working class.

                            Morally, that's what you're doing when you use AI to generate CSAM. It's the same reason we ban all previously created CSAM as well: because you are victimizing the person every single time.

                            i dont see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim,

                            It makes them a victim.

                            But i don’t know of any laws that prevent you from doing that, unless it’s explicitly to do with something like blackmail, extortion, or harassment.

                            The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.

                            Does a facial structure recognition model use the likeness of other people?

                            Yes.

                            Even though it can detect any person that meets the requirements established by its training data? There is no suitable method to even begin to break down at what point that person's likeness begins and at what point it ends. It's simply an impossible task.

                            Exactly. So, without consent, it shouldn't be used. Periodt.

                            • U [email protected]

                              yeah but like, legally, is this even a valid argument?

                              Personally, legal is only what the law allows the wealthy to do, and provides punishments for the working class.

                              Morally, that's what you're doing when you use AI to generate CSAM. Its the same idea why we ban all pre-created CSAM, as well, because you are victimizing the person every single time.

                              i dont see how this is even relevant, unless the person in question is a minor, a victim, or becoming a victim,

                              It makes them a victim.

                              But i don’t know of any laws that prevent you from doing that, unless it’s explicitly to do with something like blackmail, extortion, or harassment.

                              The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.

                              Does a facial structure recognition model use the likeness of other people?

                              Yes.

                              Even though it can detect any person that meets the requirements established by its training data? There is no suitable method to begin to breakdown at what point that persons likeness begins, and at what point it ends. it’s simply an impossible task.

                              Exactly. So, without consent, it shouldn't be used. Periodt.

                              K This user is from outside of this forum
                              K This user is from outside of this forum
                              [email protected]
                              wrote on last edited by
                              #78

                              Personally, legal is only what the law allows the wealthy to do, and provides punishments for the working class.

                              if you have schizophrenia, sure. Legal is what the law defines as OK. Whether or not people get charged for it is another thing. The question is: "do you have the legal right to do it or not?"

                              Morally, that’s what you’re doing when you use AI to generate CSAM. Its the same idea why we ban all pre-created CSAM, as well, because you are victimizing the person every single time.

                              legally, the reasoning behind this is that it's just extremely illegal; there are almost no cases, if not zero, where it would be OK or reasonable, and therefore the moral framework tends to be developed around that. I don't necessarily agree with it always being victimization, because there are select instances where it just doesn't really make sense to consider it that; there are acts you commit that would make it victimization. However, I like to subscribe to the philosophy that it is "abusive" material, and therefore innately wrong. Like blackmail, I find that to be a little bit more strict and conducive to that sort of definition.

                              It makes them a victim.

                              At one point in time, yes; perpetually, in some capacity, they will exist as having been a victim, or as having been victimized at one point. I also don't really consider it healthy or productive to engage in a "once a victim, always a victim" mentality, because I think it sets a questionable precedent for mental health care. Semantically, someone who was a victim once is still a victim of that specific event; however, it's a temporally bounded victimization. I just think people are getting a little loose with the usage of that word recently.

                              I'm still not sure how it makes that person a victim, unless it meets one of the criteria I laid out, in which case it very explicitly becomes an abusive work. Otherwise it's questionable how you would even attribute victimization to the victim in question, because there is no explicit victim to even consider. I guess you could consider everybody even remotely tangentially relevant to be a victim, but that opens a massive black hole of logical reasoning which can't trivially be closed.

                              To propose a hypothetical here: let's say there is a person who we will call Bob. Bob has created a depiction of "abuse" in such a horrendous manner that even laying your eyes upon the work will forever ruin you. We will define the work in question to be a piece of art, depicting no person in particular, arguably barely resembling a person at all; the specific definition is left to the reader. You could hypothetically, in this instance, argue that even viewing the work is capable of making people a "victim" of it. However you want to work that one out.

                              The problem here is that Bob hasn't created this work in complete isolation, because he's just a person; he interacts with people, has a family, friends, acquaintances. He's a normal person, aside from the horrors beyond human comprehension he has created. Therefore, in some capacity, the influence of these people in his life has to have influenced the work. Are the people who know/knew Bob victims of this work as well, regardless of whether or not they have seen it? Does the very act of being socially related to Bob make them a victim of the work? For the purposes of the hypothetical, we'll assume they haven't seen the work, and that he has only shown it to people he doesn't personally know.

                              I would argue, and I think most people would agree with me, that there is no explicit tie between the work that Bob has created and the people he knows personally. Therefore, it would be a stretch to argue that the people who were tangentially relevant to Bob are now victims, even though they have not been influenced by the work. Could it influence them in some external way, possibly causing some sort of external reaction? Yeah, but that's a different story. We're not worried about that.

                              This is essentially the problem we have with AI: there is no explicit resemblance to any given person (unless defined, which I have already explicitly opted out of), nor any person the image is inherently based on via training (which I have also somewhat explicitly opted out of). There are two fundamental questions here that need to be answered. First, how are these people being victimized? By posting images publicly on the internet? Seems like they consented to people at least being aware of them, if not to some degree manipulating images of them, because there is nothing to stop that from happening (as everyone already knows from NFTs). Second, how are we defining these victims? What mechanism do we use to determine the identity of these people? Otherwise, we're just schizophrenically handwaving the term around, calling people victims when we have no explicit way of determining that. You cannot begin to call someone a victim if it's not even known whether they were victimized or not. You're setting an impossible precedent here.

                              Even if you can summarily answer those two questions in a decidedly explicit manner, it's still questionable whether that would even matter, because now you would have to demonstrate some form of explicit victimization and damage resulting from that victimization. Otherwise you're just making the argument of "it's mine because I said so."

                              The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.

                              again if you're schizo, sure.

                              Yes.

                              on a loosely defined basis, yeah, in some capacity it uses the likeness of that person, but to what degree? How significantly? If the woman in the Mona Lisa was 4% some lady the artist saw three times a week due to their habits/routine, would that make her suddenly entitled to some of that art piece? What about the rest of it? You're running down an endless corridor of infinitely unfalsifiable and falsifiable statements. There is no clear answer here.

                              Exactly. So, without consent, it shouldn’t be used. Periodt.

                              you need to explicitly define "consent" and "use", because without defining those, it's literally impossible to even begin determining the end position here.

                              • D [email protected]

                                I've been thinking about this more and I think one interesting argument here is "toxic culture growth". As in even if the thing is not distributed it might grow undesired toxic cultures througu indirect exposures (like social media or forums discussions) even without the direct sharing.

                                I think this is slippery to the point of government mind control but maybe there's something valuable to research here either way.

                                K This user is from outside of this forum
                                K This user is from outside of this forum
                                [email protected]
                                wrote on last edited by
                                #79

                                I’ve been thinking about this more and I think one interesting argument here is “toxic culture growth”. As in even if the thing is not distributed it might grow undesired toxic cultures through indirect exposures (like social media or forums discussions) even without the direct sharing.

                                this is another big potential issue as well. Does it perpetuate cultural behaviors that you want to see in society at large? Similar dynamics have played out with misogyny and the relevant history of feminism.

                                It's a whole thing.

                                I think this is slippery to the point of government mind control but maybe there’s something valuable to research here either way.

                                I think there is probably a level of government regulation that is productive; I'm just curious how we even define where that line starts and ends, because there is not really an explicitly clear point unless you have solid external inferences to start from.

                                • K [email protected]

                                  I’ve been thinking about this more and I think one interesting argument here is “toxic culture growth”. As in even if the thing is not distributed it might grow undesired toxic cultures through indirect exposures (like social media or forums discussions) even without the direct sharing.

                                  this is another big potential as well. Does it perpetuate cultural behaviors that you want to see in society at large? Similar things like this have resulted from misogyny and the relevant history of feminism.

                                  It's a whole thing.

                                  I think this is slippery to the point of government mind control but maybe there’s something valuable to research here either way.

                                  i think there is probably a level of government regulation that is productive, i'm just curious about how we even define where that line starts and ends, because there is not really an explicitly clear point, unless you have solid external inferences to start from.

                                  D This user is from outside of this forum
                                  D This user is from outside of this forum
                                  [email protected]
                                  wrote on last edited by
                                  #80

                                  Honestly, I'm quite happy with the "social justice warrior" approach. Sure, it's flawed and weak to manipulation for now, but as a strategy for society to self-correct it's quite brilliant.

                                  I'm optimistic that society should be able to correct itself on this issue as well, though considering the current climate, the correction might be very chaotic.

                                  • K [email protected]

                                    Personally, legal is only what the law allows the wealthy to do, and provides punishments for the working class.

                                    if you have schizoprehnia, sure. Legal is what the law defines as ok. Whether or not people get charged for it is another thing. The question is "do you have the legal right to do it or not"

                                    Morally, that’s what you’re doing when you use AI to generate CSAM. Its the same idea why we ban all pre-created CSAM, as well, because you are victimizing the person every single time.

                                    legally, the reasoning behind this is because it's just extremely illegal, there are almost no if not zero, where it would be ok or reasonable, and therefore the moral framework tends to be developed around that. I don't necessarily agree with it always being victimization, because there are select instances where it just doesn't really make sense to consider it that, there are acts you commit that would make it victimization. However i like to subscribe to the philosophy that it is "abusive" material, and therefore innately wrong. Like blackmail, i find that to be a little bit more strict and conducive to that sort of definition.

                                    It makes them a victim.

at one point in time, yes; perpetually, in some capacity, they will exist as having been a victim, or having been victimized at one point. I also don't consider it healthy or productive to engage in a "once a victim, always a victim" mentality, because i think it sets a questionable precedent for mental health care. Semantically, someone who was a victim once is still a victim of that specific event, but it's a temporally bounded victimization; i just think people have been getting a little loose with the usage of that word recently.

I'm still not sure how it makes that person a victim, unless it meets one of the criteria i laid out, in which case it very explicitly becomes an abusive work. Otherwise it's questionable how you would even attribute victimization, because there is no explicit victim to consider. I guess you could consider everybody even remotely tangentially relevant to be a victim, but that opens a massive black hole of logical reasoning which can't trivially be closed.

To propose a hypothetical: let's say there is a person we will call bob. Bob has created a depiction of "abuse" so horrendous that even laying your eyes upon the work will forever ruin you. We will define the work to be a piece of art depicting no person in particular, arguably barely resembling a person at all; the specific definition is left to the reader. You could hypothetically argue that even viewing the work is capable of making people a "victim" of it, however you want to work that one out.

The problem here is that bob hasn't created this work in complete isolation, because he's just a person: he interacts with people, has a family, friends, acquaintances. He's a normal person, aside from the horrors beyond human comprehension he has created. Therefore, in some capacity, the influence of these people in his life must have influenced the work. Are the people who know bob victims of this work as well, regardless of whether or not they have seen it? Does the very act of being socially related to bob make them victims of the work? For the purposes of the hypothetical we'll assume they haven't seen it, and that he has only shown it to people he doesn't personally know.

I would argue, and i think most people would agree with me, that there is no explicit tie between the work bob has created and the people he knows personally. Therefore it would be a stretch to argue that those people, merely by being tangentially relevant to bob, are now victims, even though they have not been influenced by the work. Could it influence them in some external way, possibly causing some sort of external reaction? Yeah, that's a different story. We're not worried about that.

This is essentially the problem we have with AI: there is no explicit resemblance to any given person (unless defined, which i have already explicitly opted out of), nor to any person it inherently based the image on via training (which i have also explicitly opted out of). There are two fundamental problems here that need to be answered.

First: how are these people being victimized? By posting images publicly on the internet? It seems they consented to people at least being aware of them, if not to some degree manipulating images of them, because there is nothing to stop that from happening (as everyone already knows from NFTs).

Second: how are we defining these victims? What mechanism do we use to determine the identity of these people? Otherwise we're just schizophrenically handwaving the term around, calling people victims when we have no explicit way of determining that. You cannot begin to call someone a victim if it's not even known whether they were victimized or not. You're setting an impossible precedent here.

Even if you can summarily answer those two questions in a decidedly explicit manner, it's still questionable whether that would even matter, because now you would have to demonstrate some form of explicit victimization, and damage resulting from that victimization. Otherwise you're just making the argument of "it's mine because i said so".

                                    The law exists to protect the ruling class while not binding them, and to bind the working class without protecting them.

                                    again if you're schizo, sure.

                                    Yes.

on a loosely defined basis, yeah, in some capacity it uses the likeness of that person, but to what degree? How significantly? If the woman in the Mona Lisa were 4% based on some lady the artist saw three times a week as part of his routine, would that make her suddenly entitled to some part of that piece? What about the rest of it? You're running down an endless corridor of unfalsifiable and falsifiable statements. There is no clear answer here.

                                    Exactly. So, without consent, it shouldn’t be used. Periodt.

you need to explicitly define "consent" and "use", because without defining those it's literally impossible to even begin determining the end position here.

[email protected]
                                    #81

                                    I refuse to debate ideas on how to make ethical CSAM with you.

Go find a pedo to figure out how you want to try to make CSAM, and you can well akshully all you want.

                                    • U [email protected]

                                      I refuse to debate ideas on how to make ethical CSAM with you.

Go find a pedo to figure out how you want to try to make CSAM, and you can well akshully all you want.

[email protected]
                                      #82

all of my arguments have explicitly excluded anything even closely resembling CSAM that would be illegal under existing law, or at the very least extremely questionable.

The only thing i haven't excluded is the potential for an AI trained explicitly on humans, with no children in the dataset, to be used to generate porn of someone "under the age of 18", something it has zero basis in reality for and cannot functionally do. That is the only scenario i can think of that wouldn't already be illegal, or at least extremely questionable. Everything else i've provided a sufficient exclusion for.

                                      Have fun calling me a pedo for no reason though, i guess.

                                      • D [email protected]

                                        Honestly I'm quite happy with "social justice warrior" approach. Sure it's flawed and weak to manipulation for now but as a strategy for society to self correct its quite brilliant.

                                        I'm optimistic society itself should be able to correct itself for this issue as well though considering the current climate the correction might be very chaotic.

[email protected]
                                        #83

i mean, i'm not sure modern social justice is working as intended given the political landscape, but historically small communities do manage to self-regulate very effectively, that much is for sure. I will give you that.

The only effective way to mandate something at a societal level is going to be laws, i.e. government; otherwise you're going to have an extremely disjointed and culturally diverse society, which isn't necessarily a bad thing.
