So I’ve been trying to install the proprietary Nvidia drivers on my homelab so I can generate my fine-ass art with Automatic1111 and Stable Diffusion. I installed the Nvidia 510 server drivers and everything seemed fine; then, when I rebooted, nothing. WTF, Nvidia, why you gotta break X? Why is X even needed in a server driver? What’s your problem, Nvidia!
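
In case anyone else hits this: what I probably should have done is install the headless metapackage instead of the full server driver, since it skips the X bits entirely. Roughly this on Ubuntu (a sketch from memory, so double-check the package names for your release):

    sudo apt update
    # --no-install-recommends keeps apt from dragging in X/display extras:
    sudo apt install --no-install-recommends nvidia-headless-510 nvidia-utils-510
    sudo reboot
    # afterwards, confirm the kernel module loaded and the GPU is visible:
    nvidia-smi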

  • fx_@feddit.de · 11 months ago

    Nvidia doesn’t hate Linux; it just doesn’t care. It’s the Linux community that hates Nvidia.

    • Vilian@lemmy.ca · 11 months ago

      AMD didn’t care a few years ago either, but their drivers are open, so the community could fix things even when the company didn’t care. (AMD cares a lot more now, so it’s better.) Nvidia is closed-source crap, and it doesn’t give a fuck either.

  • GenderNeutralBro@lemmy.sdf.org · 11 months ago

    Linux is their bread and butter when it comes to servers and machine learning, but that’s a specialized environment and they don’t really care about general desktop use on arbitrary distros. They care about big businesses with big support contracts. Nobody’s running Wayland on their supercomputer clusters.

    I cannot wait until architecture-agnostic ML libraries are dominant and I can kiss CUDA goodbye for good. I swear, 90% of my tech problems over the past 5 years have boiled down to “Nvidia sucks”. I’ve changed distros three times hoping it would make things easier, and it never really does; it just creates exciting new problems to play whack-a-mole with. I currently have Ubuntu LTS working, and I’m hoping I never need to breathe on it again (sanity-check sketch at the end of this comment).

    That said, there’s honestly some grass-is-greener syndrome going on here, because you know what sucks almost as much as using Nvidia on Linux? Using Nvidia on Windows.
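
    When CUDA does break, a quick layer-by-layer check at least tells me whether it’s the driver or the ML stack that fell over. A rough sketch, assuming PyTorch is installed in whatever environment you’re using:

        # is the kernel module loaded and talking to the GPU?
        nvidia-smi
        # can the ML stack actually see CUDA? (assumes PyTorch is installed)
        python3 -c "import torch; print(torch.cuda.is_available())"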

  • WasPentalive@beehaw.org · 11 months ago

    Nvidia does not ‘hate’ Linux; Nvidia simply never thinks about Linux. They need to keep secrets so people can’t buy the cheap card and, with a little programming, turn it into the expensive card.

  • danielton@outpost.zeuslink.net · 11 months ago

    I call them “novideo” because the Nvidia GPU in a PC someone gave me was the bane of my existence on Linux. I got so tired of having no video after security updates that I ended up buying a Radeon for it. Nvidia seems to hate everybody except Windows for some reason; even Apple ditched them long before it ditched Intel.

    And yet, it seems like the majority of Linux users have Nvidia anyway.

    • Rassilonian Legate@mstdn.social · 11 months ago

      @danielton
      @Mr_Esoteric
      >And yet, it seems like the majority of Linux users have Nvidia anyway.

      Probably because it’s more popular among Windows users, so when most people switch to Linux from Windows, they use the hardware they already had, which more often than not includes an Nvidia GPU.