Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple of images, and I haven’t posted anything to it in ages… like how would that even be possible?
Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.


Have you ever tried writing a scraper? I have, for offline reference material. Do that a few times and you’ll know. I usually only want a relatively small site (say, a Khan Academy lesson, which doesn’t save text offline, just videos) and put in a large delay between requests, but I’ll still come back after thinking I have it down and find it’s thrashed something.
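The "large delay between requests" approach above can be sketched roughly like this. This is just an illustration, not anyone's actual scraper: `DELAY_SECONDS`, `throttle`, and `fetch` are made-up names, and the network call is stubbed out so the sketch runs on its own.

```python
import time

# Illustrative delay; a real scrape of someone else's server would
# use something much larger (seconds, not fractions of a second).
DELAY_SECONDS = 0.2

_last_request = [0.0]  # monotonic timestamp of the previous request

def throttle():
    """Sleep just long enough that successive calls are DELAY_SECONDS apart."""
    wait = DELAY_SECONDS - (time.monotonic() - _last_request[0])
    if wait > 0:
        time.sleep(wait)
    _last_request[0] = time.monotonic()

def fetch(url):
    """Fetch one page, politely spaced out from the previous fetch."""
    throttle()
    # In a real scraper this would be e.g.:
    #   with urllib.request.urlopen(url) as resp:
    #       return resp.read()
    return f"fetched {url}"  # stubbed so the sketch runs offline
```

Even with spacing like this, a scraper that follows every link can still hammer a site far harder than any human reader would, which is the point of the story above.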