shark@lemmy.org to Selfhosted@lemmy.world · English · 11 days ago
What's your self-hosting success of the week?
Shimitar@downonthestreet.eu · English · 10 days ago
NVIDIA Corporation GA104GL [RTX A4000] (rev a1), from lspci.
It has 16 GB of VRAM; not a lot, but enough to run gpt-oss 20B and a few other models pretty nicely.
I noticed that it's better to stick to a single model; I imagine unloading and reloading a model in VRAM takes time.
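If this is an Ollama-style setup (a guess based on the `gpt:OSS 20b` tag; the comment doesn't say which runner is used), the single-model approach can be made explicit with the server's environment variables, so the loaded model stays resident and requests never pay the unload/reload cost:

```shell
# Sketch, assuming Ollama as the model server (not confirmed by the comment).
# Keep the model resident in VRAM indefinitely instead of the default
# idle timeout, and cap the server to one loaded model at a time.
export OLLAMA_KEEP_ALIVE=-1        # -1 = never unload after a request
export OLLAMA_MAX_LOADED_MODELS=1  # refuse to juggle multiple models in VRAM
ollama serve
```

With `OLLAMA_MAX_LOADED_MODELS=1`, requests for a second model would evict the first, so this setup only helps if you genuinely stick to one model, as the comment suggests.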