• hoshikarakitaridia@lemmy.world
    20 hours ago

    AI or servers probably. I have 40gb and that’s what I would need more ram for.

    I’m still salty because I had the idea of going the CPU-and-RAM-sticks route for AI inference literally days before the big AI companies did. And my stupid ass didn’t buy them in time before the prices skyrocketed. Fuck me I guess.

    • NotMyOldRedditName@lemmy.world
      20 hours ago

      It does work, but it’s not really fast. I upgraded from 32gb to 96gb of ddr4 a year or so ago, and being able to play with the bigger models was fun, but it was so slow I couldn’t do anything productive with it.

      • Possibly linux@lemmy.zip
        20 hours ago

        You’re bottlenecked by memory bandwidth.

        You need ddr5 with lots of memory channels for it to be useful.
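
        As a back-of-envelope illustration of why bandwidth is the limiter (the numbers below are assumed typical values, not figures from this thread): during generation roughly every active weight is read once per token, so throughput is about memory bandwidth divided by model size.

        ```python
        # Rough upper bound on CPU decode speed: bandwidth / bytes of weights
        # read per generated token. All numbers are illustrative assumptions.
        def tokens_per_second(bandwidth_gb_s: float, model_size_gb: float) -> float:
            return bandwidth_gb_s / model_size_gb

        # Dual-channel ddr4-3200 (~50 GB/s) with a ~40 GB quantized model:
        print(tokens_per_second(50, 40))    # ~1.3 tokens/s

        # 8-channel ddr5-4800 server (~300 GB/s), same model:
        print(tokens_per_second(300, 40))   # ~7.5 tokens/s
        ```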

      • tal@lemmy.today
        20 hours ago

        You can have applications where wall-clock time is not all that critical but large model size is valuable, or where a model is very sparse and so does little computation relative to its size, but for the major applications, like today’s generative AI chatbots, I think that’s correct.
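
        To make the sparse-model point concrete (illustrative, assumed numbers): in a mixture-of-experts model only the active experts’ weights are streamed per token, so the effective working set is much smaller than the total model, and tokens per second scale with the active fraction rather than the full size.

        ```python
        # Sketch: same memory bandwidth, dense vs. sparse (MoE) model.
        # All figures are illustrative assumptions, not measurements.
        BANDWIDTH_GB_S = 50.0    # roughly dual-channel ddr4

        dense_model_gb = 40.0    # dense: every weight read per token
        moe_total_gb = 100.0     # MoE: big in RAM...
        moe_active_gb = 12.0     # ...but only a slice is read per token

        print(BANDWIDTH_GB_S / dense_model_gb)   # ~1.3 tokens/s
        print(BANDWIDTH_GB_S / moe_active_gb)    # ~4.2 tokens/s despite 2.5x the size
        ```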

        • NotMyOldRedditName@lemmy.world
          20 hours ago

          Ya, that’s fair. If I was doing something I didn’t care about time on, it did work. And we weren’t talking hours, but it could be many minutes.

    • panda_abyss@lemmy.ca
      19 hours ago

      I’m often using 100gb of ram for AI.

      Earlier this year I was going to buy a bunch of used servers with 1tb of ram, and I wish I had.