I have an RTX 2080 Ti.

I still game at 1080p/60 Hz, and the 2080 Ti is plenty for that. But I'm looking to train some ML models, and its 11 GB of VRAM is limiting there.
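
A minimal sketch of how to measure peak VRAM for a training step, if you want to check how close a workload comes to the 11 GB ceiling (the model and batch size below are just placeholders, not a real workload):

```python
# Minimal sketch: measure peak VRAM for one training step.
# The model and batch size are placeholders, not a real workload.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 4096)).cuda()
opt = torch.optim.AdamW(model.parameters())
x = torch.randn(64, 4096, device="cuda")

torch.cuda.reset_peak_memory_stats()
loss = model(x).square().mean()
loss.backward()
opt.step()

print(f"peak VRAM: {torch.cuda.max_memory_allocated() / 2**30:.2f} GiB")
```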

So I plan to buy a new card. I also don't want an ML-only GPU, since I don't want to maintain two GPUs.

Since I'm upgrading, I need to think about future-proofing. At some point I'll move to at least 2K, although I'm still not sold on 4K offering any perceivable benefit.

Given all this, I wanted to check with folks who have either card: should I consider the 4090?

  • simple@lemmy.mywire.xyz · 1 year ago

    I bought a 4090 just to run LLMs and Stable Diffusion, with some occasional gaming. But if you'll just use it for ML, get whatever is cheaper (ironically, I found the 4090 cheaper than the 3090 when shopping around).

    • TheTrueLinuxDev@beehaw.org · 1 year ago

      The 7900 XTX recently got support for Stable Diffusion and LLMs. On paper it's faster than the RTX 4090 for FP16 computation, and it does seem faster in my experience comparing a rented RTX 4090 on RunPod against my own 7900 XTX: 14 seconds (RTX 4090) vs. 6 seconds (7900 XTX).
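
      If you want to reproduce that kind of comparison yourself, here's a minimal FP16 matmul timing sketch (matrix size and iteration count are arbitrary). PyTorch's ROCm builds expose AMD GPUs through the same torch.cuda API, so the identical script runs on both cards:

      ```python
      # Minimal FP16 throughput sketch; matrix size and iteration count are arbitrary.
      # PyTorch ROCm builds reuse the torch.cuda API, so this runs on AMD and Nvidia alike.
      import time
      import torch

      assert torch.cuda.is_available(), "needs a CUDA or ROCm build of PyTorch"

      n, iters = 8192, 50
      a = torch.randn(n, n, dtype=torch.float16, device="cuda")
      b = torch.randn(n, n, dtype=torch.float16, device="cuda")

      # Warm up once so one-time kernel setup doesn't skew the timing.
      torch.matmul(a, b)
      torch.cuda.synchronize()

      start = time.perf_counter()
      for _ in range(iters):
          torch.matmul(a, b)
      torch.cuda.synchronize()
      elapsed = time.perf_counter() - start

      # Each matmul is roughly 2*n^3 FLOPs.
      print(f"{iters * 2 * n**3 / elapsed / 1e12:.1f} TFLOP/s (FP16)")
      ```

      Raw matmul throughput won't map one-to-one onto Stable Diffusion step times, but it's a quick sanity check on the FP16 claim.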

      The 7900 XTX is an option if you want a card roughly $1000 cheaper than the RTX 4090 with similarly sized VRAM (24 GB on both) and comparable performance.

    • MoonRocketeer@beehaw.org · 1 year ago

      I'm doing summer research with a focus on ML. I just built my computer and picked AMD because of the price, but didn't know that Nvidia was the one to pick at the moment if that's what I wanted it for. I don't know enough about hardware, and I could use the school labs anyway, but I should have done better research (ironic, heh).