What's your self-hosting success of the week?
shark@lemmy.org to Selfhosted@lemmy.world · English · 14 hours ago
ShimitarA · 6 hours ago

NVIDIA Corporation GA104GL [RTX A4000] (rev a1)

From lspci.

It has 16 GB of VRAM; not a huge amount, but enough to run gpt-oss 20B and a few other models quite nicely.

I noticed that it's better to stick to a single model; I imagine that unloading and reloading the model in VRAM takes time.
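For what it's worth, if those models are being served through Ollama (a guess based on the `gpt-oss` tag style; the comment doesn't name the runtime), the swap cost comes from Ollama unloading an idle model after roughly five minutes by default. A config sketch for keeping one model resident in VRAM:

```shell
# Sketch under the assumption that Ollama is the model server.
# By default an idle model is unloaded after ~5 minutes, so the next
# request pays the VRAM reload cost again.

# Keep the last-used model loaded indefinitely for this server instance:
export OLLAMA_KEEP_ALIVE=-1   # -1 = never unload; durations like "1h" also work
ollama serve

# Or set it per request via the API's keep_alive field:
curl http://localhost:11434/api/generate \
  -d '{"model": "gpt-oss:20b", "prompt": "hello", "keep_alive": -1}'
```

With 16 GB of VRAM and a ~20B model quantized to fit, there isn't room for two models resident at once anyway, so pinning a single model matches the observation above.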