agnos.is Forums

Discord Popcorn Picture

Asklemmy · 12 Posts · 9 Posters
#1 · [email protected]

I got baited into posting a picture of a child eating popcorn on Discord, not knowing it was associated with CSAM. The account got banned, but I don't care about that so much as the legal consequences. Has anyone heard of legal action against people posting it?

#2 · [email protected] (replying to #1)

No. I see no way to prosecute over popcorn. If a country actually did, I would find the exit. Fast!

      Not condoning child abuse

      But popcorn?

#3 · [email protected] (replying to #1)

        How does eating popcorn == CSAM?

#4 · [email protected] (replying to #2)

          Yeah.

          Discord needs to moderate, so they ban and therefore conclude their legal obligations.

          If it was CSAM and discord thinks it's bad enough, they will probably forward the information to the authorities.

Now if the authorities think it's worth an investigation and give it the proper priority, they will start one. If the investigation concludes and they still think you goofed badly enough, they will pursue you under criminal law.

See how many ifs there are and how many people have to sign off on it? There's quadruple human review at minimum in there, and there's no way they think they can win on those charges when the evidence is goddamn popcorn.

          Also, you can appeal a ban. I got auto banned on discord about 2 months ago and I appealed because I know for a fact I did nothing wrong - I was literally asleep and my last messages did not even contain profanity. I was so mad cause that account is important to me. They reinstated it - to their credit - in a matter of hours. Still, could've done without the heart attack.

          TL;DR you're more than safe as long as it wasn't actual CSAM.

#5 · Guest (replying to #1)

            Helpful video: https://www.youtube.com/watch?v=Kyc_ysVgBMs

But basically, the picture was a cropped frame from CSAM content, so their systems concluded you were posting CSAM when you were not.

As for the legal consequences: I am not a lawyer, but I don't think the police will be visiting you anytime soon, since the picture you posted isn't CSAM by itself, just a cropped portion that doesn't contain the material itself.

#6 · [email protected] (replying to #3)

From what other people have said, and from the occasional video that's popped up on YouTube, Discord has a library of CSAM content that its automated systems match against, and there are certain individuals who try to bait people into posting seemingly innocent pictures that are actually frames from said videos. Discord's systems see that the image is a frame from such material and will auto-ban the account.
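The matching described above is typically done with perceptual hashes rather than stored images, so that minor edits (recompression, cropping artifacts) still match. PhotoDNA itself is proprietary; the sketch below is a toy average-hash over a tiny grayscale "image", with made-up pixel values and a made-up threshold, purely to illustrate the idea:

```python
# Toy sketch of hash-based image matching, loosely in the spirit of
# perceptual hashing. NOT PhotoDNA: an illustrative average-hash only.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A "known bad" image and a slightly altered copy (e.g. recompressed).
known_bad = [[10, 200], [220, 30]]
candidate = [[12, 198], [221, 29]]

known_hash = average_hash(known_bad)
cand_hash = average_hash(candidate)

# A small Hamming distance means "same image" despite minor edits.
THRESHOLD = 1
is_match = hamming(known_hash, cand_hash) <= THRESHOLD
print(is_match)  # True: the altered copy still matches the stored hash
```

The point of hashing this way is that the platform only needs to store compact fingerprints of known material, not the material itself, and near-duplicates still collide with the stored hash.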

#7 · [email protected] (replying to #6)

This is fascinating and I have a bunch of questions, basically all centered around the fact that possession of such content is outlawed. I don't expect OP to know, but maybe someone else does:

Isn't it illegal to have a library of such content? Is there a legal carveout for that, like Coca-Cola importing cocaine?

                How is the library compiled, maintained, and added to?

                Is the library specific to Discord or is it a shared library maintained by some centralized "authority" or developer? If it's specific to Discord then can we assume there are many different libraries of illegally produced and possessed content compiled and maintained by various social media companies? Who's got that job? Do they get therapy in their benefits package?

#8 · [email protected] (replying to #4)

Yeah, like I found the exact same image posted on Twitter two years ago, and it is still up.

#9 · [email protected] (replying to #7)

As far as I understand, they use a tool called PhotoDNA (Microsoft's image-hashing technology, which platforms like Discord license) to scan pictures.

#10 · [email protected] (replying to #1)

Websites have false positives all the time, and while it sucks, it's infeasible for them to have human reviewers checking everything, and it's better to have false positives than false negatives. What isn't acceptable is that the appeals process uses the exact same models as the flagging process, so it gets the exact same false positives and false negatives.

                      Pic related as it was one of the first to reveal how broken the appeals process in most social media platforms was.

[image attachment]
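The broken-appeal point above can be shown with a toy sketch (the function, hash values, and banned set are all made up for illustration): if the appeal simply re-runs the same deterministic classifier on the same input, it necessarily reproduces the original false positive.

```python
# Toy illustration: an "appeal" that re-runs the identical flagging model
# on the identical input can never overturn that model's false positives.

def classifier(image_hash, banned_hashes):
    """Flag content when its hash appears in the banned set."""
    return image_hash in banned_hashes

banned = {"0110"}          # hash database containing a bad entry
innocent_popcorn = "0110"  # innocent image whose hash happens to collide

first_verdict = classifier(innocent_popcorn, banned)   # flagged: ban
appeal_verdict = classifier(innocent_popcorn, banned)  # same model, same input

print(first_verdict, appeal_verdict)  # True True: the appeal changes nothing
```

An appeal only adds information if it introduces something the first pass lacked, e.g. a human reviewer or a different model.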

#11 · Guest (replying to #1)

Can someone explain this to me? Because... huh?

#12 · [email protected] (replying to #7)

> Do they get therapy in their benefits package?

                          https://www.theguardian.com/media/2024/dec/18/kenya-facebook-moderators-sue-after-diagnoses-of-severe-ptsd

                          This kind of moderation is generally outsourced to people in the global south paid pennies. And no, they don't get therapy.
