Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple images and I haven’t posted anything to it in ages… like how would that even be possible?
Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.


Phew, so I’m a dumbass and not reading it right. I wonder how they’ve managed to use 3MB per visit?
The robots are a problem, but luckily we’re not into the hepamegaquintogilarillions… Yet.
12,000 visits, with 181 of those to the robots.txt file, makes way, way more sense, albeit still an abusive number of visits.
I couldn’t wrap my head around how large that number was, or how many visits it would actually take to reach it in 25 days. Turns out that would be roughly 5.64 quinquinquagintillion visits per nanosecond. Call it a hunch, but I suspect my server might not handle that.
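For anyone who wants to check that maths, here’s a rough sketch of it in Python, assuming the misread figure really is 12,181 followed by 181 zeros and a 25-day stats window:

```python
# Back-of-the-envelope: if the stat really were 12,181 followed by 181 zeros
# of visits in 25 days, how many visits per nanosecond would that be?
misread_visits = 12_181 * 10**181        # "12,181 with 181 zeros at the end"
window_ns = 25 * 24 * 60 * 60 * 10**9    # 25 days expressed in nanoseconds

visits_per_ns = misread_visits / window_ns
print(f"{visits_per_ns:.3g} visits per nanosecond")  # ~5.64e+169
```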
The traffic is really suspicious. Do you by any chance have a health or heartbeat endpoint that provides continuous output? That would explain how so many hits could cause so much traffic.
It’s super weird for sure. I’m not sure how the bots have managed to use so much more bandwidth with only 30k more hits than regular traffic. I guess they probably don’t rely on any caching and fetch each page from scratch?
Still going through my stats, but it doesn’t look like I’ve gotten much traffic via any API endpoint (running WordPress). I had a few wallpapers available for download and it looks like for whatever reason the bots have latched onto those.
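If anyone else runs into this, a quick way to see exactly which files are eating the bandwidth is to total up bytes per URL straight from the server access log. A minimal sketch, assuming a combined-format Apache/Nginx log; the log path and regex below are just placeholders to adapt to your own setup:

```python
import re
from collections import Counter

# Sum bytes served per URL from a combined-format access log to see
# which files the crawlers are actually pulling down.
LOG_PATH = "/var/log/apache2/access.log"  # placeholder path, adjust as needed

# combined log format: IP - - [date] "METHOD /path HTTP/x.x" status bytes "referer" "agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

bytes_by_path = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.match(line)
        if not match or match.group("bytes") == "-":
            continue
        bytes_by_path[match.group("path")] += int(match.group("bytes"))

# Top ten bandwidth hogs, in megabytes
for path, total in bytes_by_path.most_common(10):
    print(f"{total / 1_048_576:10.1f} MB  {path}")
```

Tallying status codes per user agent the same way would also hint at whether the bots ever send conditional requests; you’d expect at least some 304s if they cached anything.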