Might be helpful for those who
- don’t have access to hardware that can run things locally
- understand the benefits and limitations of generative AI
Link: https://duckduckgo.com/?q=DuckDuckGo&ia=chat
As a nice coincidence, one of the first results when I searched for a news update was this discussion:
https://discuss.privacyguides.net/t/adding-a-new-category-about-ai-chatbots/17860/2
Honestly, I’m really impressed… DDG works across VPN and Tor, and that includes these chatbots. That’s a great improvement for privacy.
Same. Genuinely impressed
Is there a YouTube video under 10 minutes that compares the different AI models available from DuckDuckGo?
Dunno, but Llama 3 is the best open source model and Claude 3 is the best overall model they offer.
You provided no reasoning, but I choose to just believe you. Thank you, wise person on the Internet.
I can reaffirm what they said with slightly more proof.
https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard
nice link, didn’t know that
Thanks for the info
I use Mixtral 8x7B locally and it’s been great. I am genuinely excited to see DDG offering it, and the service in general. Now I can use this service when I’m not on my network.
What GPU are you using to run it? And what UI are you using to interface with it? (I know of gpt4all and the generic sounding ui-text-generation program or something)
I am using this: https://github.com/oobabooga/text-generation-webui … It is running great with my AMD 7900 XT, and it also ran great with my 5700 XT. It sets itself up within a conda virtual environment, which takes all the mess out of getting the packages to work correctly. It can use NVIDIA cards too.
Once you get it installed you can then get your models from huggingface.co
I’m on arch, btw. ;)
Edit: I just went and reinstalled it and saw it supports these GPUs
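For anyone who wants to try the same setup, it boils down to a few commands. This is just a sketch based on the project’s one-click installer; the start script name, the GPU prompt, and the default port may differ between releases, so check the README first.

```shell
# Clone the web UI and run its one-click start script, which creates
# its own conda environment and installs packages for your hardware.
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui
./start_linux.sh   # the installer asks which GPU vendor you have

# Once it is running, open http://localhost:7860 in a browser and
# download models from huggingface.co via the web UI's Model tab.
```

The nice part of the conda-based installer is that it keeps the Python dependencies isolated from the rest of the system, which is exactly the “takes the mess out” benefit described above.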
That’s right, “text-generation-webui”. At least it’s unambiguous lol. Thanks for sharing.
Open WebUI is the best self-hosted LLM chat interface IMO. It works seamlessly with Ollama, but also supports other OpenAI-API-compatible APIs AFAIK.
I’m using the two together, and both downloading and using models is super easy. It also integrates well with the VSCode extension “Continue”, an open-source Copilot alternative (setup might require editing the extension’s config file).
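For reference, standing that pair up looks roughly like this. It’s a sketch that assumes Docker and Ollama are already installed, and the image tag, ports, and volume name are taken from Open WebUI’s README at time of writing, so double-check them before running.

```shell
# Fetch a model with Ollama (Llama 3 here purely as an example).
ollama pull llama3

# Run Open WebUI in Docker and point it at the local Ollama server.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

# Chat at http://localhost:3000. The Continue extension can talk to
# the same local Ollama endpoint (http://localhost:11434) once you
# add an Ollama provider entry to its config file.
```

Because Open WebUI and Continue both speak to Ollama’s local API, one downloaded model serves both the chat interface and the editor assistant.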
A lot of it might come down to individual tasks or personal preference.
Personally I liked Claude better than GPT-3.5 for general queries, and I have yet to explore the other two
Thank you.
It still amazes me just how quickly these models can spit out complex answers pretty much instantly.
…complex almost entirely wholly-hallucinated answers that only have as much bearing on reality as ‘some dude who is very talkative and heard about a bunch of stuff second-hand, and who is also high as balls and experiencing a manic episode where they think they know everything’
LOL. Yeah, sometimes, answers can be very much “I’m winging it today”, but certain prompts, especially for story ideas, can be very interesting and usable.
I’ve always said that if you know a lot about a subject, you can easily spot how AI generally tries to fake it until it makes it.
But if you have no idea about something, the answers you get are certainly better than what your buddy might tell you 😂
But to my point, it comes up with long form content so fast that you wonder how the hell it actually processed the question that quickly.
it has in fact been a delightful creative aid for brainstorming fiction, actually!
The things are fantastic at “yes-and” improvisation and extrapolating from a premise.
If I want to build a world and populate it with loosely defined ‘impressionistic’ background info that doesn’t necessarily require fully fledged lore that interconnects, it can do a great job at showing where the lore could go if i decided to explore there. It’s great at suggesting character names, place names, and ways to fill in blanks that make it easier for me to pick or reject individual elements.
In a story idea I’ve been marinating for a while, one character possesses advanced medical knowledge in a world where germ theory, medicine, and surgery never developed because people had access to ‘healing magic’. The problem is, healing magic works on all organisms - including parasites, bacteria, and cancer - which means trying to ‘heal’ someone with an infection makes the infection worse, because the pathogens benefit from the healing magic too.
I asked AI to extrapolate more detail about this character’s background and it suggested that his father was the village healer and simply didn’t mention his mother at all.
Those two little details exploded in my imagination as an entire history of emotional conflict:
His mother fell ill with a bacterial infection that magic couldn’t fix when he was too little to do anything about it even though he knew what was wrong and how to help her, and so he blamed himself.
His ‘strange ideas’ about physiology, epidemiology, and concepts like hygiene and medicine put him at odds with the traditional teachings of his father, and made the other people in his village view him as a ‘problem child’.
This led him to be quiet and withdrawn until he befriends the protagonist, and it is her falling ill with the same disease that killed his mother that motivates him to try again, using the rudimentary resources he has been able to secretly scrape together since.

(This is an ‘isekai inversion’ where all the reincarnators are disillusioned and discouraged, and the protagonist is a native of that world who travels around finding them, putting them in touch with one another, and motivating them to pursue their specializations again. A nuclear engineer, for instance, won’t be able to get much done in a world where the scientific method hasn’t been codified, manufacturing doesn’t exist - let alone precision machining - and chemistry has not clawed its way to distinction out of the vague, secretive, formless depths of alchemy.)
Cool
Lotsa processing power behind it all!
Could easily replace some people and significantly improve things.
Do we think they are going to charge for this once out of beta? Even though they have done great work on making it anonymous, I don’t see anything about them not using the input/output as data to “better” their service. So perhaps it would remain free?
Who needs to spend 10s installing a dedicated AI chatbot assistant app, when every app you have can be an AI assistant app!
It’s really fast.
Wonder where it’s hosted.