AI haters build tarpits to trap and trick AI scrapers that ignore robots.txt
-
pedroapero@lemmy.ml replied to Guest on 31 Jan 2025, 16:21
manual and builds are here: https://zadzmo.org/code/nepenthes/
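Nepenthes itself has its own implementation (see the link above); a minimal Python sketch of the general tarpit idea, with invented names throughout, might look like this. Every request returns a page of links to more generated URLs, so a crawler that follows them never runs out of pages:

```python
# Minimal tarpit sketch (NOT the Nepenthes code; see the link above for
# the real thing). Each URL serves links to more generated URLs, so the
# "site" is effectively infinite. Real tarpits also drip the response
# out slowly to waste the scraper's time.
import hashlib
from http.server import BaseHTTPRequestHandler, HTTPServer

def fake_links(path, n=5):
    # Derive child URLs deterministically from the current path, so the
    # site looks stable to a crawler but never runs out of pages.
    return ["/" + hashlib.sha1(f"{path}/{i}".encode()).hexdigest()[:12]
            for i in range(n)]

class Tarpit(BaseHTTPRequestHandler):
    def do_GET(self):
        body = "".join(f'<a href="{u}">{u}</a> ' for u in fake_links(self.path))
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(f"<html><body>{body}</body></html>".encode())

# To try it locally: HTTPServer(("", 8080), Tarpit).serve_forever()
```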
-
landedgentry@lemmy.zip replied to Guest on 31 Jan 2025, 16:23
what is your deal?
-
rumba@lemmy.zip replied to Guest on 31 Jan 2025, 16:29
I liked it back when link aggregators were the go-to for discovery. You could have sites that were real gems that were just tucked away.
I think the indexing started out OK. Counting backlinks and using that as a ranking was pretty genius, right up until people realized they could game the system. Then Google realized that artificially screwing with their own system was worth money, and they used ads to modify ranking.
Ads to modify discoverability: the death of the free internet.
-
theparadox@lemmy.world replied to Guest on 31 Jan 2025, 16:29
The only AI company that responded to Ars' request to comment was OpenAI, whose spokesperson confirmed that OpenAI is already working on a way to fight tarpitting.
Ah yes. It's extremely common for one of the top companies in an industry to spitefully expend resources fighting the efforts of...
One or two people
Please, continue to grace us with your unbiased wisdom. Clearly you've read the article and aren't just trying to simp for AI or start flame wars like a petulant child.
-
appoxo@lemmy.dbzer0.com replied to Guest on 31 Jan 2025, 16:45
It's not like you can't load-balance requests to the malicious subdirectories onto non-prod hardware. It could even be decommissioned hardware.
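A hypothetical nginx sketch of that routing idea: send the tarpit path to an old spare box so crawlers stuck in it never touch production. The path, addresses, and upstream name are placeholders, not anything from the thread.

```nginx
# Placeholder config: route the tarpit entrance to decommissioned
# hardware, everything else to the real site.
upstream tarpit_box {
    server 10.0.0.99:8080;   # old spare machine on the LAN
}
server {
    listen 80;
    location /nepenthes/ {                 # the tarpit entrance
        proxy_pass http://tarpit_box;
    }
    location / {
        proxy_pass http://127.0.0.1:3000;  # real site
    }
}
```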
-
_cryptagion@lemmy.dbzer0.com replied to Guest on 31 Jan 2025, 16:50
Well, luckily for them, it's a pretty simple fix. Congrats on being a part of making them jot down a note to prevent tarpitting when they get around to it. You've saved the internet!
And stop pretending like you're unbiased either. We both have our preconceived notions, and you're no more likely to be open to changing yours than I am. In fact, given the hysterical hyperventilating that anti-AI "activists" get up to, we both know you're not ever going to change your mind on AI, and as such you'll glom onto any small action you think is gonna stick it to the man, no matter whether that action is going to have any practical effect on the push for AI or not.
-
_cryptagion@lemmy.dbzer0.com replied to Guest on 31 Jan 2025, 16:52
I'm just vibing, watching the hysterics you guys get up to.
-
landedgentry@lemmy.zip replied to Guest on 31 Jan 2025, 16:54
No you’re being a petulant, naysaying child. Leave us alone and go play with your duplos. Adults are talking.
-
_cryptagion@lemmy.dbzer0.com replied to Guest on 31 Jan 2025, 16:55
How many hobby website admins have load balancing for their small sites? How many have decommissioned hardware? Because if you find me a corporation willing to accept the liability doing something like this could open them up to, I'll pay you a million dollars.
-
_cryptagion@lemmy.dbzer0.com replied to Guest on 31 Jan 2025, 16:58
Bigotry? From a lemmy user? Never seen it before!
If you don't like what I'm saying, block me and move along. Or report my comments, if you think they're offensive enough. If I'm breaking a rule or the mods don't like what I have to say, maybe they'll remove them, or even ban me from the comm! That's the limit of your options for getting rid of me though.
-
echodot@feddit.uk replied to Guest on 31 Jan 2025, 17:11
Interesting. Mega supporters are now cold-blooded.
-
landedgentry@lemmy.zip replied to Guest on 31 Jan 2025, 17:44
Bigotry lmao talk about a Hail Mary.
-
lovablesidekick@lemmy.world replied to Guest on 31 Jan 2025, 22:14
I get that the Internet doesn't contain an infinite number of domains. Max visits to each one can be limited. Hel-lo, McFly?
-
vrighter@discuss.tchncs.de replied to Guest on 1 Feb 2025, 05:08
It's one domain, with infinite pages under it. Limiting max visits per domain is a very different thing from trying to detect loops which aren't there. You are now making a completely different argument. In fact, it sounds suspiciously like the only thing I said they could do: have some arbitrary threshold, beyond which they give up, because there's no way of detecting the tarpit otherwise.
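The "arbitrary threshold" defense described here can be sketched in a few lines of Python. The class and parameter names are made up for illustration; the point is that the crawler simply gives up on a domain after N pages, with no attempt to detect that the pages are generated:

```python
# Sketch of a crawl frontier with a per-domain page cap. The cap is
# arbitrary by design: there is no tarpit detection, just a budget.
from collections import Counter
from urllib.parse import urlparse

class CappedFrontier:
    def __init__(self, max_pages_per_domain=1000):
        self.cap = max_pages_per_domain
        self.seen = Counter()

    def should_fetch(self, url):
        domain = urlparse(url).netloc
        if self.seen[domain] >= self.cap:
            return False          # budget exhausted: give up on this domain
        self.seen[domain] += 1
        return True
```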
-
lovablesidekick@lemmy.world replied to Guest on 1 Feb 2025, 05:26
I'm a software developer responding to a coding problem. If it's all under one domain, then avoiding infinite visits is even simpler: I would create a list of known huge websites like Google and Wikipedia, and limit the visits to any domain that is not on that list. This would eliminate having to track where the honeypot is deployed.
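The whitelist idea in this comment can be sketched as follows. The site list, cap value, and function names are illustrative placeholders, not part of any real crawler:

```python
# Sketch of the whitelist proposal: domains on a known "huge site" list
# are crawled without limit; every other domain gets a small visit cap.
from collections import Counter
from urllib.parse import urlparse

BIG_SITES = {"www.google.com", "en.wikipedia.org"}   # placeholder list
DEFAULT_CAP = 500                                    # arbitrary budget

visits = Counter()

def allow(url, cap=DEFAULT_CAP):
    domain = urlparse(url).netloc
    if domain in BIG_SITES:
        return True                # trusted: no per-domain limit
    visits[domain] += 1
    return visits[domain] <= cap
```

As the reply below this comment points out, this only works if you already know which domains are safe, which sidesteps rather than solves the detection problem.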
-
vrighter@discuss.tchncs.de replied to Guest on 1 Feb 2025, 10:57
Yes, but now you've shifted the problem again. You went from detecting infinite sites by spotting loops (in a tree that has no loops, just infinitely many distinct URLs), to keeping a list of every distinct URL so you never visit one twice (which you wouldn't anyway, since the links are endless), to assuming you already have a list of which sites are tarpits so you can avoid them. Detecting them was the very problem you started with.
It's ok to admit that your initial idea was wrong. You did not solve a coding problem. You changed the requirements so it's not your problem anymore.
And storing a domain whitelist wouldn't work either, btw. A tarpit entrance is just one URL among lots of legitimate ones, on legitimate domains.
-
lovablesidekick@lemmy.world replied to Guest on 1 Feb 2025, 22:45
Okay fine, I 100% concede that you're right. Bye now.
-
demonsword@lemmy.world replied to Guest 30 days ago
corrected... I guess maybe my IQ isn't on the right side of the bell curve too
-
krompus@lemmy.world replied to Guest 30 days ago
Ignorance is bliss.