I was using my SO’s laptop. I had been talking (not searching, or otherwise typing) about some VPN solutions for my homelab, and got curious enough to click the new big Copilot button and ask what it can do. This conversation actually started with me asking if it could turn off my computer for me (it cannot), and then I asked this.
Very unnerving. I hate being paranoid enough to think it actually picked up on the context of me talking, but again: it’s my SO’s laptop, so there’s none of my technical search history to pull from.
A privacy-minded person probably wouldn’t use these tools to begin with, tbh; they would likely run their own LLM instead.
I guess that’s why OP brought up that they were using someone else’s computer.
Also, a truly privacy-minded person wouldn’t necessarily refuse to use a hosted AI product at all. We generally just stay aware that we have no privacy when using it, and never type anything sensitive into it. Besides, have you seen what it costs to run a capable LLM?
Just don’t pull a Samsung
I’ve just started messing with GPT4all for CPU-based language models, which can run relatively well on older gaming hardware, and a Coral accelerator module for my NVR’s presence detection with Frigate only cost $30.
That’s what I’ve been playing with. Cool stuff, even though it’s limited by my 8GB Nvidia card.
It’ll be interesting to see how far the technology advances in even 2 or 3 years; thanks to optimizations and the like, even an 8GB card may go a long way.
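For a rough sense of why quantization is what makes an 8GB card workable, here’s a back-of-the-envelope sketch (my own illustration, not anything from GPT4all itself; real memory use is higher because of the KV cache and activations, so treat these numbers as a floor):

```python
# Rule of thumb: weight memory ~= parameter count x bytes per parameter.
# This ignores runtime overhead (KV cache, activations, buffers).

def approx_weight_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate GB needed just to hold the model weights."""
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

# A 7B model in fp16 (~14 GB) won't fit on an 8 GB card,
# but at 4-bit (~3.5 GB) the weights fit with room to spare.
for bits in (16, 8, 4):
    print(f"7B @ {bits}-bit ~ {approx_weight_gb(7, bits):.1f} GB")
```

That gap between fp16 and 4-bit is basically the whole story of why consumer cards can suddenly run these models at all.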
Meet GPT4all