agnos.is Forums

How to combat large amounts of Ai scrapers

[email protected] wrote:

> Well, someone had a great idea to use zipbombs. I saw it somewhere, but I don't remember where.

[email protected] replied (#29):

Anubis has this built in: if it detects bots, it turns the difficulty up to impossible.
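
For context, Anubis decides what to do per request through a bot policy file. A minimal sketch of such a policy, assuming the `bots`/`action` schema from the Anubis docs (verify the field names against the current upstream documentation before using):

```yaml
# botPolicies.yaml (sketch, not a drop-in config; field names
# follow the Anubis policy schema as documented upstream)
bots:
  # Hard-deny well-known AI crawlers by user agent.
  - name: ai-crawlers
    user_agent_regex: "GPTBot|ClaudeBot|CCBot|Bytespider"
    action: DENY
  # Everyone else gets the proof-of-work challenge.
  - name: everyone-else
    user_agent_regex: ".*"
    action: CHALLENGE
```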

    • M [email protected]

      In my case I use https://www.bunkerweb.io/ as my proxy for that, but there are other tools like for example https://github.com/TecharoHQ/anubis

[email protected] replied (#30):

BunkerWeb looks interesting.
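
For anyone comparing the two: BunkerWeb is configured through environment variables on the container. A rough compose sketch enabling its antibot challenge; the setting names (USE_ANTIBOT, SERVER_NAME, the REVERSE_PROXY_* group) are from the BunkerWeb docs as I remember them, and the upstream host is a hypothetical placeholder, so double-check everything there:

```yaml
# docker-compose.yml (illustrative only; verify image tag and
# setting names against the BunkerWeb documentation)
services:
  bunkerweb:
    image: bunkerity/bunkerweb:latest
    ports:
      - "80:8080"
    environment:
      SERVER_NAME: "example.org"               # placeholder site name
      USE_ANTIBOT: "javascript"                # JS challenge before proxying
      USE_REVERSE_PROXY: "yes"
      REVERSE_PROXY_URL: "/"
      REVERSE_PROXY_HOST: "http://myapp:3000"  # hypothetical upstream
```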

[email protected] wrote:

> You probably don't need me to tell you, but keep good backups. A friend of mine recently had his account nuked without any reason given, and without the possibility of recourse:
>
> > a mail from Oracle, informing about the immediate termination of service, and deletion of all data

[email protected] replied (#31):

From what I've heard, that's pretty common at Oracle, but it's good to spread the word.
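
On the backup advice above: an off-site copy is what saves you when a cloud account vanishes overnight. A minimal sketch using restic (the repository URL and paths are placeholders, not from the thread):

```sh
# Off-provider backup sketch with restic; adjust repo and paths.
export RESTIC_REPOSITORY=sftp:[email protected]:/srv/restic  # placeholder
export RESTIC_PASSWORD_FILE=/root/.restic-pass

restic init                        # once, to create the repository
restic backup /srv/appdata         # back up the data you can't lose
restic forget --keep-daily 7 --keep-weekly 4 --prune  # rotate old snapshots
```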

        • F [email protected]

          I understand, but the shift in user behaviour is significant and I think websites are not taking it into account. If the users move more and more to AI, and since Google introduced AI mode it's only a question of time until it becomes the default, we will see more and more of what we thing are AI crawlers and less and less organic users.

          AI seems to be the new middleman between you and the user, and if you block the middleman, you block the user. For people with hobby websites or established sites it may make sense because people either know of them, or getting more exposure is not a wish or requirement, but for everyone else, it will be painful.

[email protected] replied (#32):

So, what I'm reading is: if your "users" are bad (or bots), just get better users.

Sounds like a net win.

[email protected] wrote:

> Every time I check my nginx logs it's more scrapers than I can count, and I could not find any good open-source solutions.

[email protected] replied (#33):

What's bothering you?

• Is it giving out data for AI training? I guess you can't fundamentally protect against that, except by limiting how much content is provided to each address.
• Or is it the resource strain it causes on your server? In that case I recommend limiting how much a single client / IP address can request in a day (see the nginx sketch below).
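
Since the poster is already on nginx, per-IP throttling can be done with the stock limit_req module. A minimal sketch; the zone name, rates, and upstream are arbitrary choices, not from the thread:

```nginx
# /etc/nginx/conf.d/ratelimit.conf (sketch; tune rate and burst to taste)
# Track clients by IP in a 10 MB shared zone, allow 5 requests/second each.
limit_req_zone $binary_remote_addr zone=perip:10m rate=5r/s;

server {
    listen 80;
    server_name example.org;   # placeholder

    location / {
        # Permit short bursts of 20 requests, then reject excess ones.
        limit_req zone=perip burst=20 nodelay;
        limit_req_status 429;   # return 429 instead of the default 503
        proxy_pass http://127.0.0.1:3000;   # hypothetical upstream
    }
}
```
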
[email protected] wrote:

> Every time I check my nginx logs it's more scrapers than I can count, and I could not find any good open-source solutions.

[email protected] replied (#34):

Does Anubis not work?

              • P [email protected]

                does anubis not work?

[email protected] replied (#35):

I can only get it to protect one container. I have three that I need protected, and I can't figure out how to run more than one instance of it.
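
One way around that: Anubis proxies a single upstream, so you run one instance per protected service and point your main reverse proxy at each. A compose sketch, assuming the BIND/TARGET environment variables from the Anubis README (verify against upstream) and hypothetical service names:

```yaml
# docker-compose.yml (one Anubis per backend; service names, ports,
# and env var names BIND/TARGET are assumptions to verify upstream)
services:
  anubis-app1:
    image: ghcr.io/techarohq/anubis:latest
    environment:
      BIND: ":8081"
      TARGET: "http://app1:3000"   # first protected container
  anubis-app2:
    image: ghcr.io/techarohq/anubis:latest
    environment:
      BIND: ":8082"
      TARGET: "http://app2:3000"   # second protected container
  anubis-app3:
    image: ghcr.io/techarohq/anubis:latest
    environment:
      BIND: ":8083"
      TARGET: "http://app3:3000"   # third protected container
# The main reverse proxy then routes each vhost to its anubis-appN.
```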

[email protected] wrote:

> What's bothering you?
>
> • Is it giving out data for AI training? I guess you can't fundamentally protect against that, except by limiting how much content is provided to each address.
> • Or is it the resource strain it causes on your server? In that case I recommend limiting how much a single client / IP address can request in a day.

[email protected] replied (#36):

It's the strain of it. I mostly run instances and frontends, so the training is not a huge problem.

                  • F [email protected]

                    If nginx, here's an open-source blocker/honeypot: https://github.com/raminf/RoboNope-nginx

                    If you have it set up to be proxied or hosted by Cloudflare, they have their own solution: https://blog.cloudflare.com/declaring-your-aindependence-block-ai-bots-scrapers-and-crawlers-with-a-single-click/

[email protected] replied (#37):

I'll check RoboNope out, it seems promising.
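
For background: honeypot tools in this family key off robots.txt, serving junk to crawlers that ignore the disallow rules. A sketch of the kind of rules involved (the /trap/ path is a hypothetical example, not from RoboNope's docs):

```text
# robots.txt: well-behaved crawlers honor this; honeypots target the rest.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Disallow: /trap/    # hypothetical honeypot path; fetching it flags a bot
```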

                    • F [email protected]

                      Anubis is the name of the tool. Also, Cloudflare just announced they have something against AI scrapers.

[email protected] replied (#38):

I've been using Anubis; my only issue is that I would have to run more than one instance, and I don't like Cloudflare personally.

[email protected] wrote:

> It's the strain of it. I mostly run instances and frontends, so the training is not a huge problem.

[email protected] replied (#39):

The keyword you need is "DDoS protection", I guess.

It keeps the server from getting overloaded due to too many requests.
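
To make the rate limiting above bite harder, fail2ban can ban IPs that repeatedly trip nginx's limits. A sketch of a jail, assuming the stock nginx-limit-req filter that ships with fail2ban (log path and thresholds are illustrative):

```ini
# /etc/fail2ban/jail.local (sketch; verify the filter name and log
# path for your distro's fail2ban package before enabling)
[nginx-limit-req]
enabled  = true
filter   = nginx-limit-req          ; stock filter matching nginx "limiting requests" errors
logpath  = /var/log/nginx/error.log
findtime = 600                      ; look back 10 minutes
maxretry = 20                       ; 20 rate-limit hits in that window
bantime  = 3600                     ; then ban the IP for 1 hour
```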
