Woof.group's been getting slower and slower lately, and I think it's cuz we're getting hammered by (maybe LLM?) scrapers. Hard to say, really, but I don't buy that there are *that* many Windows/Chrome users clicking around through every single tag page.
Gonna add a bunch of the LLM bots to robots.txt--I know many of the big players just ignore robots.txt and fudge their UAs, but maybe it'll make a little dent. Fully 2% of our requests are ByteDance, and 5% are ahrefs--both of those should be blockable.
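For what it's worth, the rough sketch of the robots.txt I have in mind looks like this -- the bot names are the tokens each vendor documents (Bytespider is ByteDance's crawler), so treat the exact list as a starting point, not gospel:

```
# Scraper bots to opt out of -- tokens taken from each vendor's published docs
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
User-agent: Bytespider
User-agent: AhrefsBot
Disallow: /
```

Grouping the user-agents over a single Disallow makes it easy to tack on new names as they show up; anything that ignores robots.txt entirely would have to be blocked at the proxy by UA or IP range instead.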
No idea what to do about what I suspect is residential proxy traffic, which makes up the vast majority of our load. I assume throwing Anubis in front of a Mastodon instance is going to break a ton of legitimate use cases.
@aphyr is there a way to add “nofollow” to every link? We’ve started doing that (on a professional service) and removing features like sorts, because they’re always followed by these fucking bots. A 70k-item list, previously sortable by 8 parameters, ascending and descending.
I hate it here. You have my sympathy. (I would zip bomb them professionally if I could.)