• davel@lemmy.ml
    13 days ago

    Try feeding them nonhalting problems that send them into infinite loops of token consumption.

    • veroxii@aussie.zone
      13 days ago

      I like the idea but most chatbots have timeout limits. And even agentic workflows have number of step limits to stop infinite loops.

      That said, those limits exist precisely because it’s super easy for LLMs to get stuck in loops. You don’t even need a nonhalting problem; they’re stupid enough on their own.
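
      The step limit mentioned above can be sketched in a few lines. This is a minimal, hypothetical illustration (the names `run_agent`, `step_fn`, and `max_steps` are assumptions, not any real framework's API): the loop aborts after a fixed number of iterations even if the model never signals completion.

      ```python
      def run_agent(step_fn, max_steps=10):
          """Run step_fn repeatedly; abort once max_steps is reached."""
          history = []
          for _ in range(max_steps):
              action = step_fn(history)
              history.append(action)
              if action == "DONE":          # model signalled completion
                  return history, "finished"
          return history, "step limit hit"  # loop guard tripped

      # A "stuck" model that never finishes: the guard cuts it off.
      looping_model = lambda history: "retry"
      history, status = run_agent(looping_model, max_steps=5)
      ```

      Real agent frameworks typically combine a step limit like this with wall-clock timeouts and token budgets.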