I really hate when companies do that kind of crap. I just imagine a little toddler stomping around going “No! No! Nooo!”
Is there a way to host an LLM in a docker container on my home server but still leverage the GPU on my main PC?
Thank you for that answer! That makes sense.
Lurking beginner here, why is this bad?
I wonder how much schooling someone would need for this? What’s the salary and benefits like? I think I could do this.
Dragon Ball Z was and always shall be one of the most important franchises in my life. This news is incredibly sad to hear. What a legend and influence. RIP Mr. Toriyama.
Is that actually true or is that just their legal team playing it overly safe? Because if it is true that’s incredibly stupid.
“Suspect is hatless! Repeat, hatless!”
I have a similar setup except I use pfSense as my router and Pi-hole for DNS, but I'm sure you can get the same results with your setup. I'm running HAProxy as my reverse proxy, with a config entry for each of my docker containers, so any traffic on 443 or 80 gets sent to the container's IP on whatever unique port it uses. I then have DNS entries for each URL I want to access a container by, with all of those entries just pointing at HAProxy. Works like a charm.
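If it helps, the per-container routing looks roughly like this in haproxy.cfg. The hostnames, IPs, and ports are made-up placeholders for my setup, not anything you'd copy verbatim:

```haproxy
frontend http_in
    bind *:80
    mode http
    # Match on the Host header so each DNS name maps to its own container
    acl host_jellyfin hdr(host) -i jellyfin.home.lan
    acl host_pihole   hdr(host) -i pihole.home.lan
    use_backend jellyfin_be if host_jellyfin
    use_backend pihole_be   if host_pihole

backend jellyfin_be
    mode http
    # Container IP and whatever unique port that service uses
    server jellyfin 192.168.1.50:8096 check

backend pihole_be
    mode http
    server pihole 192.168.1.51:8080 check
```

Every DNS entry (jellyfin.home.lan, pihole.home.lan, etc.) just resolves to the HAProxy box; the Host header does the rest.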
I have HAProxy running on the Pi-hole itself, but there's no reason you couldn't run it in its own container. pfSense also lets you install an HAProxy package to handle it on the router itself. I don't know if OPNsense supports packages like that, though.
You can even get fancy and do SSL offloading to access everything over HTTPS.
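For the SSL offloading piece, it's basically one extra frontend where HAProxy terminates TLS and talks plain HTTP to the containers behind it. The cert path here is just a placeholder for wherever you keep your combined cert+key PEM:

```haproxy
frontend https_in
    # HAProxy terminates TLS here; backends never see HTTPS
    bind *:443 ssl crt /etc/haproxy/certs/home.lan.pem
    mode http
    acl host_jellyfin hdr(host) -i jellyfin.home.lan
    use_backend jellyfin_be if host_jellyfin

frontend http_in
    bind *:80
    mode http
    # Bounce plain HTTP over to HTTPS
    http-request redirect scheme https
```

Backends stay exactly the same as the plain-HTTP version; only the frontend changes.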