Estebiu@lemmy.dbzer0.com to Selfhosted@lemmy.world • Help with Home Server Architecture and Hardware Selection? (English)
1 · 11 hours ago
You can still run smaller models on cheaper GPUs; no need for the greatest GPU ever. Btw, I use it for other things too, not only LLMs.
Uhh, a lot of big words here. I mostly just play around with it… I've never used LLMs for anything more serious than a couple of tests, so I don't even know how many tokens my setup can generate…
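If you ever want a rough number, a minimal sketch like the one below works, assuming a local server with an OpenAI-compatible API (Ollama and llama.cpp's server both expose one); the URL and model name here are just placeholders for whatever you actually run. It times a single request and divides the reported completion tokens by the elapsed time.

```python
import time
import requests  # assumes the 'requests' package is installed

# Placeholder endpoint and model: adjust to your local setup.
# Ollama's OpenAI-compatible route lives at /v1/chat/completions on port 11434 by default.
URL = "http://localhost:11434/v1/chat/completions"
MODEL = "llama3"  # whichever small model you have pulled locally

payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Write a short paragraph about home servers."}],
    "max_tokens": 256,
    "stream": False,
}

start = time.time()
resp = requests.post(URL, json=payload, timeout=300)
elapsed = time.time() - start

# The OpenAI-compatible response includes a usage block with completion_tokens.
completion_tokens = resp.json()["usage"]["completion_tokens"]
print(f"{completion_tokens} tokens in {elapsed:.1f}s "
      f"({completion_tokens / elapsed:.1f} tokens/s)")
```

It's only a single-request estimate, but it's enough to compare a smaller model on a cheaper GPU against a bigger one.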