FediDB has stopped crawling until they get robots.txt support
-
It is not possible to reliably detect bots. Attempting to do so will invariably produce false positives, and those false positives usually end up denying access to the most at-risk and marginalized folks
Just implement a cache and forget about it. If read-only content is causing you too much load, you're doing something terribly wrong.
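For what it's worth, a minimal sketch of the kind of read cache meant here; `render_page` is a hypothetical stand-in for whatever actually produces a response:

```python
import time

class TTLCache:
    """Tiny in-memory cache with per-entry expiry (illustrative sketch)."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # path -> (expires_at, body)

    def get(self, path, render_page):
        now = time.time()
        hit = self.store.get(path)
        if hit and hit[0] > now:
            return hit[1]  # fresh cached copy, no re-render
        body = render_page(path)  # miss or stale: render once
        self.store[path] = (now + self.ttl, body)
        return body

# Repeated requests for the same path within the TTL are served from
# memory, so a bot hammering read-only pages costs one render per TTL.
calls = []
def render_page(path):
    calls.append(path)
    return f"<html>{path}</html>"

cache = TTLCache(ttl_seconds=60)
first = cache.get("/post/1", render_page)
second = cache.get("/post/1", render_page)
```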
-
Thank you for providing the link.
-
Then I’m not sure what point you were trying to make in the above conversation lol.
-
The point was "don't add another bot into the pile of millions of bots that hit people's servers every day unless you're gonna be polite about it"
-
I agree with you, but the number of robots has greatly increased of late. They're still not as numerous as users, but they hit every link and wreck your caches by not focusing on hotspots the way humans do.
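The cache-wrecking effect is easy to see in a toy simulation: an LRU cache holding 10% of a hypothetical site does fine under hotspot-heavy human traffic, but its hit rate collapses when a crawler walks every link uniformly.

```python
import random
from collections import OrderedDict

class LRU:
    """Minimal LRU cache that just counts hits and misses."""

    def __init__(self, capacity):
        self.cap = capacity
        self.entries = OrderedDict()
        self.hits = 0
        self.misses = 0

    def access(self, key):
        if key in self.entries:
            self.entries.move_to_end(key)  # mark as recently used
            self.hits += 1
        else:
            self.misses += 1
            self.entries[key] = True
            if len(self.entries) > self.cap:
                self.entries.popitem(last=False)  # evict least recent

random.seed(0)
PAGES, REQUESTS = 10_000, 50_000

# Humans: 90% of requests go to a hot set of 100 pages.
human = LRU(1000)
for _ in range(REQUESTS):
    if random.random() < 0.9:
        human.access(random.randint(0, 99))
    else:
        human.access(random.randint(0, PAGES - 1))

# Crawler: uniform scan over every page, no hot set.
bot = LRU(1000)
for _ in range(REQUESTS):
    bot.access(random.randint(0, PAGES - 1))

human_rate = human.hits / REQUESTS  # high: hot set fits in cache
bot_rate = bot.hits / REQUESTS      # roughly capacity / pages
```

Same cache size, same request count; the only difference is access pattern.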
-
You need a bigger cache. If you don't have enough RAM, host it on a CDN.
-
Sure thing! Help me pay for it?
-
that website feels like uncovering a piece of ancient alien weaponry
-
False positives? Meh, who cares... that's what appeals are for. Real people should figure it out before too long
-
Every time I tried to appeal, I either got no response or reached someone who had no idea what I was talking about.
Occasionally a bank fixes the problem for a week. Then it's back again.