Gonna add a bunch of the LLM bots to robots.txt--I know many of the big players just ignore robots.txt and fudge their UAs, but maybe it'll make a little dent. Fully 2% of our requests are ByteDance, and 5% are Ahrefs--both of those should be blockable.
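Something like this is what I have in mind--the ByteDance crawler announces itself as Bytespider and Ahrefs as AhrefsBot; exactly which of the other LLM user agents to list is a judgment call, and any of them may ignore it anyway:

    # AI/LLM training crawlers
    User-agent: GPTBot
    User-agent: ClaudeBot
    User-agent: CCBot
    User-agent: Google-Extended
    User-agent: PerplexityBot
    User-agent: Bytespider
    Disallow: /

    # SEO crawlers
    User-agent: AhrefsBot
    Disallow: /

(Stacking the User-agent lines like that shares the single Disallow across all of them, so the list is easy to grow.)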
No idea what to do about what I suspect is residential proxy traffic, which makes up the vast majority of our load. I assume throwing Anubis in front of a Mastodon instance is going to break a ton of legitimate use cases.
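If we ever did try it, the only shape I can picture (purely a sketch--the ports and the path list are guesses, not a tested config) is challenging only the browser-facing HTML routes and letting federation, API, and feed traffic bypass Anubis entirely, since none of those clients can run a JS proof-of-work challenge. Which of course does nothing about scrapers hammering the public API.

    # Assumed for the sketch: Anubis listening on 127.0.0.1:8923, Mastodon's web service on 127.0.0.1:3000.

    # ActivityPub, API, webfinger, OAuth, RSS: no JS on the client side, so no challenge possible
    location ~ ^/(api|inbox|oauth|nodeinfo|\.well-known|users/[^/]+/inbox|@[^/]+\.rss) {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }

    # Everything else (HTML pages that real browsers load) goes through the Anubis challenge proxy first
    location / {
        proxy_pass http://127.0.0.1:8923;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }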