Got a warning for my blog going over 100GB in bandwidth this month… which sounded incredibly unusual. My blog is text and a couple images and I haven’t posted anything to it in ages… like how would that even be possible?

Turns out it’s possible when you have crawlers going apeshit on your server. Am I even reading this right? 12,181 with 181 zeros at the end for ‘Unknown robot’? This is actually bonkers.

  • [object Object]@lemmy.world · 1 month ago

    This here is the implementation of sha256 in the slow language JavaScript:

    // `message` is the string to hash, e.g. the challenge payload handed to the browser
    const msgUint8 = new TextEncoder().encode(message);
    const hashBuffer = await window.crypto.subtle.digest("SHA-256", msgUint8);
    const hashHex = new Uint8Array(hashBuffer).toHex(); // toHex() is a recent addition; see fallback below
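
    If Uint8Array.prototype.toHex() isn't available in the target browsers (it's a fairly recent addition), the last line can be replaced with a manual byte-to-hex conversion that works everywhere:

    const hashHex = Array.from(new Uint8Array(hashBuffer))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");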

    You imagined that JS had to do that from scratch, with sticks and mud? Every OS has cryptographic facilities, and every major browser exposes an API to them.

    As for using it to filter out bots, Anubis does in fact get it a bit wrong: the cost has to be incurred on every page hit, not once a week. So you can't just put Anubis in front of the site; you need the JS on every page, and if the challenge hasn't been solved by the next hit, you pop up the full 'nuh-uh' page, probably make the browser do a harder challenge, and also check a bunch of heuristics, like go-away does.
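
    For a sense of what the browser is actually asked to do, here's a rough sketch of the kind of proof-of-work loop such challenges run. The challenge string, difficulty parameter, and response format here are illustrative assumptions, not Anubis's actual protocol:

    // Hypothetical challenge: find a nonce such that SHA-256(challenge + nonce)
    // starts with `difficulty` zero hex digits. Each extra digit multiplies the
    // expected number of hashes by 16.
    async function solveChallenge(challenge, difficulty) {
        const encoder = new TextEncoder();
        for (let nonce = 0; ; nonce++) {
            const digest = await crypto.subtle.digest("SHA-256", encoder.encode(challenge + nonce));
            const hex = Array.from(new Uint8Array(digest))
                .map((b) => b.toString(16).padStart(2, "0"))
                .join("");
            if (hex.startsWith("0".repeat(difficulty))) return { nonce, hex };
        }
    }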

    It’s still debatable whether it will stop bots, which would just have to crank SHA-256 24/7 in between page downloads, but it does add a cost that bot owners have to eat.