Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple images and I haven’t posted anything to it in ages… like how would that even be possible?
Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.


The traffic is really suspicious. Do you by any chance have a health or heartbeat endpoint that provides continuous output? That would explain why those hits cause so much traffic.
It’s super weird for sure. I’m not sure how the bots managed to use so much more bandwidth with only 30k more hits than regular traffic. I guess they probably don’t rely on any caching and fetch each page from scratch?
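A quick back-of-envelope sketch of why that math works out (every number here is a made-up assumption for illustration, not from my actual stats): a returning browser that revalidates its cache gets a tiny 304 Not Modified response, while a bot that ignores caching re-downloads the full page every single time, so similar hit counts can mean wildly different bandwidth.

```python
# Back-of-envelope: why cacheless crawlers burn bandwidth.
# All numbers below are illustrative assumptions, not measured values.

PAGE_SIZE = 500 * 1024   # assume an average page weighs ~500 KB
REVALIDATION_SIZE = 300  # a 304 Not Modified response is a few hundred bytes

def bandwidth(hits: int, cached_ratio: float) -> int:
    """Total bytes served, given what fraction of hits are conditional
    requests answered with a 304 instead of a full download."""
    revalidations = int(hits * cached_ratio)
    full_fetches = hits - revalidations
    return full_fetches * PAGE_SIZE + revalidations * REVALIDATION_SIZE

# Regular visitors: browsers cache aggressively, say 90% of hits revalidate.
humans = bandwidth(100_000, cached_ratio=0.9)

# Crawlers: no cache at all, every hit is a full download.
bots = bandwidth(130_000, cached_ratio=0.0)

print(f"humans: {humans / 1024**3:.1f} GiB")  # ~4.8 GiB
print(f"bots:   {bots / 1024**3:.1f} GiB")    # ~62.0 GiB
print(f"ratio:  {bots / humans:.0f}x")
```

So under these toy numbers, 30% more hits turns into roughly 13x the bandwidth, purely from skipping the cache.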
Still going through my stats, but it doesn’t look like I’ve gotten much traffic via any API endpoint (running WordPress). I had a few wallpapers available for download, and it looks like the bots have, for whatever reason, latched onto those.
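If the stats UI isn’t granular enough, a small sketch like this can dig through the raw access log instead and total up bytes served per user agent, which makes heavy downloads like those wallpapers jump out. This assumes the common Apache/nginx combined log format; the sample lines, IPs, paths, and bot names are all invented for illustration.

```python
import re
from collections import Counter

# Combined log format ends with: "request" status bytes "referer" "user-agent"
LOG_LINE = re.compile(
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def bytes_by_agent(lines):
    """Sum response bytes per user agent across access-log lines."""
    totals = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if not m or m["bytes"] == "-":
            continue  # skip lines that don't parse or have no body size
        totals[m["agent"]] += int(m["bytes"])
    return totals

# Illustrative log lines (made-up IPs, paths, and agents):
sample = [
    '1.2.3.4 - - [01/May/2024:10:00:00 +0000] "GET /wallpaper-4k.png HTTP/1.1" 200 8388608 "-" "SomeBot/1.0"',
    '1.2.3.4 - - [01/May/2024:10:00:01 +0000] "GET /wallpaper-4k.png HTTP/1.1" 200 8388608 "-" "SomeBot/1.0"',
    '5.6.7.8 - - [01/May/2024:10:00:02 +0000] "GET /a-post/ HTTP/1.1" 200 20480 "-" "Mozilla/5.0"',
]

for agent, total in bytes_by_agent(sample).most_common():
    print(f"{total / 1024**2:8.1f} MiB  {agent}")
```

In real use you’d feed it the actual log file (`bytes_by_agent(open("access.log"))`) and probably group by request path too, to see which files the bots keep re-fetching.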