• jet@hackertalks.com · 83 points · 5 months ago

    Honestly, I’m really impressed… DDG works across VPN and Tor, and that includes these chatbots. That’s a great improvement for privacy.

  • sabreW4K3@lazysoci.al · 30 points · 5 months ago

    Is there a YouTube video under 10 minutes that compares the different AI models available from DuckDuckGo?

    • simple@lemm.ee · 21 points · 5 months ago

      Dunno, but Llama 3 is the best open source model and Claude 3 is the best overall model they offer.

    • Howdy@lemmy.zip · 18 points · 5 months ago

      I use Mixtral 8x7B locally and it’s been great. I’m genuinely excited to see DDG offering it, and the service in general. Now I can use the service when I’m not on my own network.

      • rutrum@lm.paradisus.day · 5 points · 5 months ago

        What GPU are you using to run it? And what UI are you using to interface with it? (I know of GPT4All and that generically named text-generation-webui program, or something like that.)

        • Howdy@lemmy.zip · 9 points · 5 months ago

          I am using this: https://github.com/oobabooga/text-generation-webui … It runs great with my AMD 7900 XT, and it also ran great with my 5700 XT. It sets itself up inside a conda virtual environment, so it takes all the mess out of getting the packages to work correctly. It can use NVIDIA cards too.

          Once you get it installed, you can get your models from huggingface.co (quick example below).

          I’m on arch, btw. ;)

          Edit: I just went and reinstalled it and saw it supports these GPUs.
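
          For anyone curious what that last step looks like, here is a minimal sketch of pulling model weights from huggingface.co with the huggingface_hub Python package. The repo id and target directory are just illustrative examples, not something text-generation-webui requires.

          ```python
          # Minimal sketch: download a model from huggingface.co into
          # text-generation-webui's models folder. The repo id and path are
          # illustrative; swap in whatever model you actually want.
          # (Some repos are gated and need `huggingface-cli login` first.)
          from huggingface_hub import snapshot_download

          snapshot_download(
              repo_id="mistralai/Mixtral-8x7B-Instruct-v0.1",
              local_dir="text-generation-webui/models/Mixtral-8x7B-Instruct-v0.1",
          )
          ```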

        • pflanzenregal@lemmy.world · 5 points · 5 months ago

          Open WebUI is the best self-hosted LLM chat interface, IMO. It works seamlessly with Ollama, but AFAIK it also supports other OpenAI-API-compatible backends (rough example below).

          I’m using the two together, and both downloading and using models is super easy. It also integrates well with the VS Code extension “Continue”, an open-source Copilot alternative (setup might require editing the extension’s config file).
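
          As a rough illustration of that OpenAI-API compatibility, a local Ollama instance can be queried with the standard openai Python client just by pointing it at Ollama’s local endpoint. The port, model name, and placeholder API key below are assumptions based on Ollama’s defaults, not anything specific to Open WebUI.

          ```python
          # Sketch: chat with a local Ollama server through its OpenAI-compatible API.
          # Assumes Ollama is running on its default port and `ollama pull llama3`
          # has already been done; the API key is a placeholder that Ollama ignores.
          from openai import OpenAI

          client = OpenAI(
              base_url="http://localhost:11434/v1",
              api_key="ollama",
          )

          reply = client.chat.completions.create(
              model="llama3",
              messages=[{"role": "user", "content": "In one sentence, what is Open WebUI?"}],
          )
          print(reply.choices[0].message.content)
          ```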

    • Otter@lemmy.ca (OP) · 11 points · 5 months ago

      A lot of it might come down to individual tasks or personal preference.

      Personally, I liked Claude better than GPT-3.5 for general queries, and I have yet to explore the other two.

  • Showroom7561@lemmy.ca · 15 points · 5 months ago

    It still amazes me how these models can spit out complex answers pretty much instantly.

    • Cyrus Draegur@lemm.ee · 21 points · 5 months ago

      …complex, almost entirely hallucinated answers that have about as much bearing on reality as ‘some dude who is very talkative and heard about a bunch of stuff second-hand, and who is also high as balls and experiencing a manic episode where they think they know everything’.

      • Showroom7561@lemmy.ca · 4 points · 5 months ago

        LOL. Yeah, sometimes the answers can be very much “I’m winging it today”, but certain prompts, especially for story ideas, can be very interesting and usable.

        I’ve always said that if you know a lot about a subject, you can easily spot how AI generally tries to fake it until it makes it.

        But if you have no idea about something, the answers you get are certainly better than what your buddy might tell you 😂

        But to my point, it comes up with long-form content so fast that you wonder how the hell it actually processed the question that quickly.

        • Cyrus Draegur@lemm.ee · 5 points · 5 months ago

          It has, in fact, been a delightful creative aid for brainstorming fiction!

          The things are fantastic at “yes-and” improvisation and extrapolating from a premise.

          If I want to build a world and populate it with loosely defined ‘impressionistic’ background info that doesn’t necessarily require fully fledged lore that interconnects, it can do a great job at showing where the lore could go if I decided to explore there. It’s great at suggesting character names, place names, and ways to fill in blanks that make it easier for me to pick or reject individual elements.

          In a story idea I’ve been marinating for a while, one character possesses advanced medical knowledge in a world where germ theory, medicine, and surgery never developed because people had access to ‘healing magic’. The problem is that healing magic works on all organisms, including parasites, bacteria, and cancer, which means trying to ‘heal’ someone with an infection makes the infection worse, because the pathogens benefit from the healing magic too.

          I asked AI to extrapolate more detail about this character’s background and it suggested that his father was the village healer and simply didn’t mention his mother at all.

          Those two little details exploded in my imagination as an entire history of emotional conflict:
          His mother fell ill with a bacterial infection that magic couldn’t fix while he was still too little to do anything about it, even though he knew what was wrong and how to help her, so he blamed himself.
          His ‘strange ideas’ about physiology, epidemiology, and concepts like hygiene and medicine put him at odds with the traditional teachings of his father, and made the other people in his village view him as a ‘problem child’.
          This led him to be quiet and withdrawn until he befriends the protagonist, and it is her falling ill with the same disease that killed his mother that motivates him to try again with the rudimentary resources he has been able to secretly scrape together since.

          (this is an ‘isekai inversion’ where all the reincarnators are disillusioned and discouraged, and the protagonist is a native of that world who travels around finding them, putting them in touch with one another, and motivating them to pursue their specializations again. A nuclear engineer, for instance, won’t be able to get much done in a world where the scientific method hasn’t been codified, manufacturing doesn’t exist let alone precision machining, and chemistry has not clawed its way to distinction out of the vague, secretive, formless depths of alchemy)

  • Howdy@lemmy.zip · 11 points · 5 months ago

    Do we think they are going to charge for this once it’s out of beta? Even though they’ve done great work on making it anonymous, I don’t see anything saying they won’t use the input/output as data to “better” their service, so perhaps it would remain free?

  • smileyhead@discuss.tchncs.de · 7 points · 5 months ago

    Who needs to spend 10s installing a dedicated AI chatbot assistant app, when every app you have can be an AI assistant app!