Might be helpful for those who
- don’t have access to hardware that can run models locally
- understand the benefits and limitations of generative AI
Link: https://duckduckgo.com/?q=DuckDuckGo&ia=chat
As a nice coincidence, one of the first results when I searched for a news update was this discussion:
https://discuss.privacyguides.net/t/adding-a-new-category-about-ai-chatbots/17860/2
Open WebUI is the best self-hosted LLM chat interface IMO. It works seamlessly with Ollama, but AFAIK it also supports any other OpenAI-compatible API.
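For anyone wondering what “OpenAI-compatible” means in practice: you can point the standard OpenAI client at a local endpoint instead of OpenAI’s servers. A minimal sketch, assuming Ollama is running on its default port 11434 with a model called "llama3" already pulled (those names are my assumptions, not from the post):

```python
# Sketch: talk to a local Ollama instance through its OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local endpoint instead of api.openai.com
    api_key="ollama",                      # the client requires a key; Ollama ignores it
)

response = client.chat.completions.create(
    model="llama3",  # whatever model you have pulled locally
    messages=[{"role": "user", "content": "Summarize the benefits of local LLMs."}],
)
print(response.choices[0].message.content)
```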
I’m using the two together, and both downloading and using models is super easy. It also integrates well with the VS Code extension “Continue”, an open-source Copilot alternative (setup might require editing the extension’s config file).
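To give a sense of how easy model management is, here’s a quick sketch against Ollama’s HTTP API (the endpoint paths and the "llama3" model name are assumptions based on Ollama’s documented defaults, not something from this setup specifically):

```python
# Sketch: pull a model and list what's available locally via Ollama's REST API.
import requests

OLLAMA = "http://localhost:11434"

# Roughly equivalent to `ollama pull llama3` on the command line.
requests.post(f"{OLLAMA}/api/pull", json={"name": "llama3", "stream": False})

# List the models that are now available locally.
tags = requests.get(f"{OLLAMA}/api/tags", timeout=10).json()
for model in tags.get("models", []):
    print(model["name"])
```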