jogai_san@lemmy.world to Selfhosted@lemmy.world · English · 17 hours ago
Self-Host Weekly (30 January 2026) (selfh.st, external link) · 24 comments
SanPe_@lemmy.world · English · 13 hours ago
The “else-hosted” LLM AI is really not my thing, but self-hosted even less…
irmadlad@lemmy.world · English · 12 hours ago
If I had the proper equipment, I’d run AI, provided it were self-contained and not pinging out to another LLM.
David J. Atkinson@c.im · 11 hours ago
@irmadlad @selfhosted That is precisely the challenge. I’m not sure it is possible.
irmadlad@lemmy.world · English · 9 hours ago
I mean, I can run a few of the private AI stacks, but they are so excruciatingly slow as to not be worth the time. I would want something pretty responsive.
David J. Atkinson@c.im · 5 hours ago
@irmadlad This is what I want too. It’s farther down my to-do list, but I’ll be sure to let you know if I discover anything. Good luck!