• Maroon@lemmy.world
    3 hours ago

    Anubis + BadBotBlocker + fail2ban was sufficient to stop almost ALL scraping and stupid bots. I added some rate limits as well that stopped some of the more niche bots.
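For reference, a minimal sketch of the kind of rate limit I mean, in nginx syntax (the zone name, thresholds, and 429 status are illustrative choices, not necessarily what any given site runs):

```nginx
# Shared zone keyed by client IP; 10 req/s is an illustrative threshold.
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    location / {
        # Allow short bursts; reject the rest with 429 so a
        # fail2ban filter can match those log lines and ban repeat offenders.
        limit_req zone=perip burst=20 nodelay;
        limit_req_status 429;
    }
}
```

The idea is that the rate limit slows well-behaved-but-greedy crawlers on its own, while the 429s it emits give fail2ban something concrete to trigger on for the persistent ones.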

Of course, I’d be surprised if notabug didn’t implement all this. I’m sure they’re far more savvy than noobs like me at protecting their site.

    I’m just afraid that they might be pushed into using Cloudflare or other such centralised services.

  • Flagstaff@programming.dev
    4 hours ago

It’s happening… just as I warned… We’re only going to see these attacks proliferate exponentially, everywhere. The smallest sites will be totally unprepared, and the next ones up the ladder barely better; etc.

  • HaraldvonBlauzahn@feddit.orgOP
    6 hours ago

What are the best ways to support such projects? We absolutely need diversity of projects and infrastructure. It should be a big lesson that putting everything onto one single repo-hosting site is Not A Good Idea.