Probably should’ve just asked Wolfram Alpha

  • just_an_average_joe@lemmy.dbzer0.com · 3 hours ago

    I think that's an issue with AI: it has been trained so much on complex questions that when you ask a simple one, it mistakes it for a complex one and answers it that way.

    • sping@lemmy.sdf.org · 2 hours ago

      The issue is it’s an LLM. It puts words in an order that’s statistically plausible but has no reasoning power.
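
      A toy sketch of that point, not how any real model is implemented (the prompts and probabilities here are made up): the generator just samples whichever continuation is statistically likely, with no arithmetic or reasoning behind the choice.

      ```python
      # Toy illustration only: an LLM-style generator samples the next token
      # from a learned probability distribution over continuations.
      # These probabilities are invented for the example, not from any model.
      import random

      next_token_probs = {
          "2 + 2 =": {"4": 0.90, "5": 0.05, "22": 0.05},
          "the capital of France is": {"Paris": 0.95, "Lyon": 0.05},
      }

      def generate(prompt: str) -> str:
          """Pick a continuation weighted by its probability: no calculation,
          just 'what word usually comes next after this text'."""
          probs = next_token_probs[prompt]
          tokens, weights = zip(*probs.items())
          return random.choices(tokens, weights=weights, k=1)[0]

      if __name__ == "__main__":
          print("2 + 2 =", generate("2 + 2 ="))  # usually "4", occasionally not
      ```

      The answer is usually right only because the right answer is also the statistically common one, which is the difference from something like Wolfram Alpha actually computing it.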