• krooklochurm@lemmy.ca

    So… the models are all trained up and now they need to run them is what I’m reading.

    You need lots of VRAM to train a model.

    An LLM, once trained, can be run with much less VRAM plus a lot of ordinary RAM.
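    As a rough back-of-the-envelope sketch of why that is: the weight memory alone scales with parameter count times bytes per parameter, and inference can use lower precision than training. The 7B parameter count and the precisions below are illustrative assumptions, not figures from any particular model.

    ```python
    # Rough weight-memory estimate for an LLM at different precisions.
    # Assumes a hypothetical 7B-parameter model; ignores activations,
    # optimizer state, and KV cache, which add more on top.
    def weight_memory_gib(n_params: float, bytes_per_param: float) -> float:
        return n_params * bytes_per_param / 2**30

    n = 7e9  # e.g. a 7B-parameter model
    print(f"fp32 (training master weights): {weight_memory_gib(n, 4.0):.1f} GiB")
    print(f"fp16 (inference):               {weight_memory_gib(n, 2.0):.1f} GiB")
    print(f"4-bit quantized (inference):    {weight_memory_gib(n, 0.5):.1f} GiB")
    ```

    Training also needs gradients and optimizer state on top of the weights (often 3-4x the weight memory with Adam), which is why it demands so much more VRAM than running the finished model.
    
    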