agnos.is Forums

lads

Programmer Humor · programmerhumor
58 Posts · 23 Posters
#1 · [email protected] · 726 votes

This post did not contain any content.
    • mod_pp@lemmy.worldM [email protected]
      This post did not contain any content.
      M This user is from outside of this forum
      M This user is from outside of this forum
      [email protected]
      wrote last edited by
      #2

      I heard this picture

      1 Reply Last reply
      8
      • mod_pp@lemmy.worldM [email protected]
        This post did not contain any content.
        R This user is from outside of this forum
        R This user is from outside of this forum
        [email protected]
        wrote last edited by
        #3

        Hail Anubis-chan.

        R J xylight@lemdro.idX 3 Replies Last reply
        66
        • R [email protected]

          Hail Anubis-chan.

          R This user is from outside of this forum
          R This user is from outside of this forum
          [email protected]
          wrote last edited by
          #4

          What's this about?

          R 1 Reply Last reply
          12
          • R [email protected]

            What's this about?

            R This user is from outside of this forum
            R This user is from outside of this forum
            [email protected]
            wrote last edited by [email protected]
            #5

            Anubis is a simple anti-scraper defense that weighs a web client's soul by giving it a tiny proof-of-work workload (some calculation that doesn't have an efficient solution, like cryptography) before letting it pass through to the actual website. The workload is insignificant for human users, but very taxing for high-volume scrapers. The calculations are done on the client's side using Javascript code.

            (edit) For clarification: this works because the computation workload takes a relatively long time, not because it bogs down the CPU. Halting each request at the gate for only a few seconds adds up very quickly.

            Recently, the FSF published an article that likened Anubis to malware because it's basically arbitrary code that the user has no choice but to execute:

            [...] The problem is that Anubis makes the website send out a free JavaScript program that acts like malware. A website using Anubis will respond to a request for a webpage with a free JavaScript program and not the page that was requested. If you run the JavaScript program sent through Anubis, it will do some useless computations on random numbers and keep one CPU entirely busy. It could take less than a second or over a minute. When it is done, it sends the computation results back to the website. The website will verify that the useless computation was done by looking at the results and only then give access to the originally requested page.

            Here's the article, and here's aussie linux man talking about it.

            J quill7513@slrpnk.netQ D I C 5 Replies Last reply
            65
            • R [email protected]

              Anubis is a simple anti-scraper defense that weighs a web client's soul by giving it a tiny proof-of-work workload (some calculation that doesn't have an efficient solution, like cryptography) before letting it pass through to the actual website. The workload is insignificant for human users, but very taxing for high-volume scrapers. The calculations are done on the client's side using Javascript code.

              (edit) For clarification: this works because the computation workload takes a relatively long time, not because it bogs down the CPU. Halting each request at the gate for only a few seconds adds up very quickly.

              Recently, the FSF published an article that likened Anubis to malware because it's basically arbitrary code that the user has no choice but to execute:

              [...] The problem is that Anubis makes the website send out a free JavaScript program that acts like malware. A website using Anubis will respond to a request for a webpage with a free JavaScript program and not the page that was requested. If you run the JavaScript program sent through Anubis, it will do some useless computations on random numbers and keep one CPU entirely busy. It could take less than a second or over a minute. When it is done, it sends the computation results back to the website. The website will verify that the useless computation was done by looking at the results and only then give access to the originally requested page.

              Here's the article, and here's aussie linux man talking about it.

              J This user is from outside of this forum
              J This user is from outside of this forum
              [email protected]
              wrote last edited by
              #6

              aussie linux man

              How did I know exactly who you were talking about before clicking the link?

              sabata11792@ani.socialS 1 Reply Last reply
              2
              • R [email protected]

                Anubis is a simple anti-scraper defense that weighs a web client's soul by giving it a tiny proof-of-work workload (some calculation that doesn't have an efficient solution, like cryptography) before letting it pass through to the actual website. The workload is insignificant for human users, but very taxing for high-volume scrapers. The calculations are done on the client's side using Javascript code.

                (edit) For clarification: this works because the computation workload takes a relatively long time, not because it bogs down the CPU. Halting each request at the gate for only a few seconds adds up very quickly.

                Recently, the FSF published an article that likened Anubis to malware because it's basically arbitrary code that the user has no choice but to execute:

                [...] The problem is that Anubis makes the website send out a free JavaScript program that acts like malware. A website using Anubis will respond to a request for a webpage with a free JavaScript program and not the page that was requested. If you run the JavaScript program sent through Anubis, it will do some useless computations on random numbers and keep one CPU entirely busy. It could take less than a second or over a minute. When it is done, it sends the computation results back to the website. The website will verify that the useless computation was done by looking at the results and only then give access to the originally requested page.

                Here's the article, and here's aussie linux man talking about it.

                quill7513@slrpnk.netQ This user is from outside of this forum
                quill7513@slrpnk.netQ This user is from outside of this forum
                [email protected]
                wrote last edited by
                #7

                fwiw Anubis is working on a more respectful update, this was their first pass solution for what was basically a break glass emergency. i understand FSF's concern, but Anubis is the only thing that's making a free and open internet remotely possible right now, and far better it that nightmare fuel like cloudflare

                D 1 Reply Last reply
                45
                • quill7513@slrpnk.netQ [email protected]

                  fwiw Anubis is working on a more respectful update, this was their first pass solution for what was basically a break glass emergency. i understand FSF's concern, but Anubis is the only thing that's making a free and open internet remotely possible right now, and far better it that nightmare fuel like cloudflare

                  D This user is from outside of this forum
                  D This user is from outside of this forum
                  [email protected]
                  wrote last edited by
                  #8

                  How does it factor in the "free" and "open"?

                  It seems to be more about IP protection that any other thing.

                  R K 2 Replies Last reply
                  1
                  • J [email protected]

                    aussie linux man

                    How did I know exactly who you were talking about before clicking the link?

                    sabata11792@ani.socialS This user is from outside of this forum
                    sabata11792@ani.socialS This user is from outside of this forum
                    [email protected]
                    wrote last edited by
                    #9

                    The outro song played in my head...

                    1 Reply Last reply
                    1
                    • D [email protected]

                      How does it factor in the "free" and "open"?

                      It seems to be more about IP protection that any other thing.

                      R This user is from outside of this forum
                      R This user is from outside of this forum
                      [email protected]
                      wrote last edited by [email protected]
                      #10
                      • A web server that can't discriminate between a request made by a human and one made by a machine has to handle all requests. It may not be an issue for large companies like Amazon or Microsoft, but small websites will suffer timeouts and outages.
                      • Without a locally hosted solution like Anubis, small websites would have to move behind a large centralized service like Cloudflare.
                      • Otherwise they might not be able to continue operating and only large corporate-backed services like Twitter and Reddit would survive.

                      The alternative is having to choose between Reddit and Cloudflare. Does that look "free" and "open" to you?

                      D 1 Reply Last reply
                      32
                      • D [email protected]

                        How does it factor in the "free" and "open"?

                        It seems to be more about IP protection that any other thing.

                        K This user is from outside of this forum
                        K This user is from outside of this forum
                        [email protected]
                        wrote last edited by
                        #11

                        Free software

                        users have the freedom to run, copy, distribute, study, change and improve the software

                        https://www.gnu.org/philosophy/free-sw.en.html

                        Open source

                        https://en.wikipedia.org/wiki/The_Open_Source_Definition

                        1. No discrimination against fields of endeavor, like commercial use

                        You are removing the terms software and source. The code is freely available and to be open source should be usable for whatever purpose.

                        As an aside, it’s used by smaller sites frequently to prevent overwhelming scraping that could take down the site, which has become far more rampant recently due to AI bots

                        D 1 Reply Last reply
                        0
                        • R [email protected]
                          • A web server that can't discriminate between a request made by a human and one made by a machine has to handle all requests. It may not be an issue for large companies like Amazon or Microsoft, but small websites will suffer timeouts and outages.
                          • Without a locally hosted solution like Anubis, small websites would have to move behind a large centralized service like Cloudflare.
                          • Otherwise they might not be able to continue operating and only large corporate-backed services like Twitter and Reddit would survive.

                          The alternative is having to choose between Reddit and Cloudflare. Does that look "free" and "open" to you?

                          D This user is from outside of this forum
                          D This user is from outside of this forum
                          [email protected]
                          wrote last edited by [email protected]
                          #12

                          That whole thing is under two wrong suppositions.

                          It assumes that we sites are under constant ddos and that cannot exist if there is not ddos protection.

                          This is false.

                          It assumes that anubis is effective against ddos attacks. Which is not. Is a mitigation, but any ddos attack worth is name would not have any issue bringing down a site with anubis. As the sever still have to handle request even if they are smaller requests.

                          Anubis only use case is to make AI scrappers to consume more energy while scrapping, while also making many legitimate users also use more energy. It's just being promoted in the anti-AI wave, but I don't really see much usefulness into it.

                          R H 2 Replies Last reply
                          2
                          • R [email protected]

                            Anubis is a simple anti-scraper defense that weighs a web client's soul by giving it a tiny proof-of-work workload (some calculation that doesn't have an efficient solution, like cryptography) before letting it pass through to the actual website. The workload is insignificant for human users, but very taxing for high-volume scrapers. The calculations are done on the client's side using Javascript code.

                            (edit) For clarification: this works because the computation workload takes a relatively long time, not because it bogs down the CPU. Halting each request at the gate for only a few seconds adds up very quickly.

                            Recently, the FSF published an article that likened Anubis to malware because it's basically arbitrary code that the user has no choice but to execute:

                            [...] The problem is that Anubis makes the website send out a free JavaScript program that acts like malware. A website using Anubis will respond to a request for a webpage with a free JavaScript program and not the page that was requested. If you run the JavaScript program sent through Anubis, it will do some useless computations on random numbers and keep one CPU entirely busy. It could take less than a second or over a minute. When it is done, it sends the computation results back to the website. The website will verify that the useless computation was done by looking at the results and only then give access to the originally requested page.

                            Here's the article, and here's aussie linux man talking about it.

                            D This user is from outside of this forum
                            D This user is from outside of this forum
                            [email protected]
                            wrote last edited by
                            #13

                            Beautiful

                            1 Reply Last reply
                            0
                            • K [email protected]

                              Free software

                              users have the freedom to run, copy, distribute, study, change and improve the software

                              https://www.gnu.org/philosophy/free-sw.en.html

                              Open source

                              https://en.wikipedia.org/wiki/The_Open_Source_Definition

                              1. No discrimination against fields of endeavor, like commercial use

                              You are removing the terms software and source. The code is freely available and to be open source should be usable for whatever purpose.

                              As an aside, it’s used by smaller sites frequently to prevent overwhelming scraping that could take down the site, which has become far more rampant recently due to AI bots

                              D This user is from outside of this forum
                              D This user is from outside of this forum
                              [email protected]
                              wrote last edited by [email protected]
                              #14

                              I'm not saying it's not open source or free. I say that it does not contribute to make the web free and open. It really only contribute into making everyone waste more energy surfing the web.

                              The web is already too heavy we do NOT need PoW added to that.

                              I don't think even a raspberry 2 would go down over a web scrap. And Anubis cannot protect from proper ddos so...

                              K 1 Reply Last reply
                              0
                              • D [email protected]

                                That whole thing is under two wrong suppositions.

                                It assumes that we sites are under constant ddos and that cannot exist if there is not ddos protection.

                                This is false.

                                It assumes that anubis is effective against ddos attacks. Which is not. Is a mitigation, but any ddos attack worth is name would not have any issue bringing down a site with anubis. As the sever still have to handle request even if they are smaller requests.

                                Anubis only use case is to make AI scrappers to consume more energy while scrapping, while also making many legitimate users also use more energy. It's just being promoted in the anti-AI wave, but I don't really see much usefulness into it.

                                R This user is from outside of this forum
                                R This user is from outside of this forum
                                [email protected]
                                wrote last edited by [email protected]
                                #15

                                It assumes that we sites are under constant ddos

                                It is literally happening. https://www.youtube.com/watch?v=cQk2mPcAAWo https://thelibre.news/foss-infrastructure-is-under-attack-by-ai-companies/

                                It assumes that anubis is effective against ddos attacks

                                It's being used by some little-known entities like the LKML, FreeBSD, SourceHut, UNESCO, and the fucking UN, so I'm assuming it probably works well enough. https://policytoolbox.iiep.unesco.org/ https://xeiaso.net/notes/2025/anubis-works/

                                anti-AI wave

                                Oh, you're one of those people. Enough said. (edit) By the way, Anubis' author seems to be a big fan of machine learning and AI.

                                (edit 2 just because I'm extra cross that you don't seem to understand this part)

                                Do you know what a web crawler does when a process finishes grabbing the response from the web server? Do you think it takes a little break to conserve energy and let all the other remaining processes do their thing? No, it spawns another bloody process to scrape the next hyperlink.

                                D 1 Reply Last reply
                                25
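That compounding is easy to ballpark. The numbers below are purely illustrative assumptions (crawl size, gate delay), not measurements of Anubis or any real scraper; the point is only that a per-request gate that one human barely notices multiplies across every request a crawler spawns.

```javascript
// Back-of-envelope sketch with assumed numbers: the cost of a
// per-request proof-of-work gate for one human versus a bulk scraper.
const pagesScraped = 1_000_000; // assumed crawl size
const gateSeconds = 3;          // assumed PoW delay per request

const humanCostSeconds = gateSeconds; // one page view: a few seconds, barely noticed
const scraperCostHours = (pagesScraped * gateSeconds) / 3600;

console.log(humanCostSeconds); // 3
console.log(scraperCostHours); // ≈ 833 CPU-hours spent on the gate alone
```

Whether that deters a given scraper depends on its budget, but the asymmetry between one visitor and a million-page crawl is the mechanism being argued about here.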
                                • D [email protected]

                                  That whole thing is under two wrong suppositions.

                                  It assumes that we sites are under constant ddos and that cannot exist if there is not ddos protection.

                                  This is false.

                                  It assumes that anubis is effective against ddos attacks. Which is not. Is a mitigation, but any ddos attack worth is name would not have any issue bringing down a site with anubis. As the sever still have to handle request even if they are smaller requests.

                                  Anubis only use case is to make AI scrappers to consume more energy while scrapping, while also making many legitimate users also use more energy. It's just being promoted in the anti-AI wave, but I don't really see much usefulness into it.

                                  H This user is from outside of this forum
                                  H This user is from outside of this forum
                                  [email protected]
                                  wrote last edited by
                                  #16

                                  Websites were under a constant noise of malicious requests even before AI, but now AI scraping of Lemmy instances usually triples traffic. While some sites can cope with this, this means a three-fold increase in hosting costs in order to essentially fuel investment portfolios.

                                  AI scrapers will already use as much energy as available, so making them use more per site measn less sites being scraped, not more total energy used.

                                  And this is not DDoS, the objective of scrapers is to get the data, not bring the site down, so while the server must reply to all requests, the clients can't get the data out without doing more work than the server.

                                  D 1 Reply Last reply
                                  8
                                  • D [email protected]

                                    I'm not saying it's not open source or free. I say that it does not contribute to make the web free and open. It really only contribute into making everyone waste more energy surfing the web.

                                    The web is already too heavy we do NOT need PoW added to that.

                                    I don't think even a raspberry 2 would go down over a web scrap. And Anubis cannot protect from proper ddos so...

                                    K This user is from outside of this forum
                                    K This user is from outside of this forum
                                    [email protected]
                                    wrote last edited by
                                    #17

                                    I don’t think even a raspberry 2 would go down over a web scrap

                                    Absolutely depends on what software the server is running, if there’s proper caching involved. If running some PoW is involved to scrape 1 page it shouldn’t be too much of an issue, as opposed to just blindly following and ingesting every link.

                                    Additionally, you can choose “good bots” like the internet archive, and they’re currently working on a list of “good bots”

                                    https://github.com/TecharoHQ/anubis/blob/main/docs/docs/admin/policies.mdx

                                    AI companies ingesting data nonstop to train their models doesn’t make for a open and free internet, and will likely lead to the opposite, where users no longer even browse the web but trust in AI responses that maybe be hallucinated.

                                    D 1 Reply Last reply
                                    2
                                    • H [email protected]

                                      Websites were under a constant noise of malicious requests even before AI, but now AI scraping of Lemmy instances usually triples traffic. While some sites can cope with this, this means a three-fold increase in hosting costs in order to essentially fuel investment portfolios.

                                      AI scrapers will already use as much energy as available, so making them use more per site measn less sites being scraped, not more total energy used.

                                      And this is not DDoS, the objective of scrapers is to get the data, not bring the site down, so while the server must reply to all requests, the clients can't get the data out without doing more work than the server.

                                      D This user is from outside of this forum
                                      D This user is from outside of this forum
                                      [email protected]
                                      wrote last edited by
                                      #18

                                      AI does not triple traffic. It's a completely irrational statement to make.

                                      There's a very limited number of companies training big LLM models, and these companies do train a model a few times per year. I would bet that the number of requests per year of s resource by an AI scrapper is on the dozens at most.

                                      Using as much energy as a available per scrapping doesn't even make physical sense. What does that sentence even mean?

                                      H grysbok@lemmy.sdf.orgG 2 Replies Last reply
                                      0
                                      • R [email protected]

                                        It assumes that we sites are under constant ddos

                                        It is literally happening. https://www.youtube.com/watch?v=cQk2mPcAAWo https://thelibre.news/foss-infrastructure-is-under-attack-by-ai-companies/

                                        It assumes that anubis is effective against ddos attacks

                                        It's being used by some little-known entities like the LKML, FreeBSD, SourceHut, UNESCO, and the fucking UN, so I'm assuming it probably works well enough. https://policytoolbox.iiep.unesco.org/ https://xeiaso.net/notes/2025/anubis-works/

                                        anti-AI wave

                                        Oh, you're one of those people. Enough said. (edit) By the way, Anubis' author seems to be a big fan of machine learning and AI.

                                        (edit 2 just because I'm extra cross that you don't seem to understand this part)

                                        Do you know what a web crawler does when a process finishes grabbing the response from the web server? Do you think it takes a little break to conserve energy and let all the other remaining processes do their thing? No, it spawns another bloody process to scrape the next hyperlink.

                                        D This user is from outside of this forum
                                        D This user is from outside of this forum
                                        [email protected]
                                        wrote last edited by [email protected]
                                        #19

                                        Some websites being under ddos attack =/= all sites are under constant ddos attack, nor it cannot exist without it.

                                        First there's a logic fallacy in there. Being used by does not mean it's useful. Many companies use AI for some task, does that make AI useful? Not.

                                        The logic it's still there all anubis can do against ddos is raising a little the barrier before the site goes down. That's call mitigation not protection. If you are targeted for a ddos that mitigation is not going to do much, and your site is going down regardless.

                                        C 1 Reply Last reply
                                        0
                                        • K [email protected]

                                          I don’t think even a raspberry 2 would go down over a web scrap

                                          Absolutely depends on what software the server is running, if there’s proper caching involved. If running some PoW is involved to scrape 1 page it shouldn’t be too much of an issue, as opposed to just blindly following and ingesting every link.

                                          Additionally, you can choose “good bots” like the internet archive, and they’re currently working on a list of “good bots”

                                          https://github.com/TecharoHQ/anubis/blob/main/docs/docs/admin/policies.mdx

                                          AI companies ingesting data nonstop to train their models doesn’t make for a open and free internet, and will likely lead to the opposite, where users no longer even browse the web but trust in AI responses that maybe be hallucinated.

                                          D This user is from outside of this forum
                                          D This user is from outside of this forum
                                          [email protected]
                                          wrote last edited by [email protected]
                                          #20

                                          There a small number of AI companies training full LLM models. And they usually do a few trains per years. What most people see as "AI bots" are not actually that.

                                          The influence of AI over the net is another topic. But anubis is also not doing anything about that as it just makes so the AI bots waste more energy getting the data or at most that data under "anubis protection" does not enter the training dataset. The AI will still be there.

                                          Am I in the list of "good bots" ?sometimes I scrap websites for price tracking or change tracking. If I see a website running malware on my end I would most likely just block that site, one legitimate user less.

                                          S S 2 Replies Last reply
                                          0