• hperrin@lemmy.world · 3 months ago

    I think that kid is mostly crying because he’s got so many extra fingers that he doesn’t have a middle finger to return the gesture.

  • XenGi@lemmy.chaos.berlin · 3 months ago

    Yes, Nvidia is still a shitty company like every other, but their open-source drivers run pretty well by now.

    • brenno@lemmy.world · 3 months ago

      Assuming you’re talking about Nouveau, it’s pretty hit or miss depending on what card you have. My previous laptop had an MX330, and it couldn’t do hardware acceleration or 120Hz output via HDMI, not to mention that screen sharing on Wayland was wonky.

      Oh, and it’s worth mentioning that “their” open source driver has nothing to do with Nvidia themselves; they absolutely do not care, as opposed to AMD.

      • Litanys@lem.cochrun.xyz · 3 months ago

        I think he means NVK. It’s a whole new world. I thought I just heard it’s ready, so it’s worth checking out, I guess.

        • brenno@lemmy.world · 3 months ago

          Judging by the docs at mesa3d, I don’t even have a card that supports it:

          NVK currently supports Turing (RTX 20XX and GTX 16XX) and later GPUs. Eventually, we plan to support as far back as Kepler (GeForce 600 and 700 series) GPUs but anything pre-Turing is currently disabled by default.

      • XenGi@lemmy.chaos.berlin · 3 months ago

        I’m not talking about Nouveau. I’m talking about the open-source drivers from Nvidia. They released them a while ago and they’ve gotten pretty good lately.

        • Ineocla@lemmy.ml · 3 months ago

          I heard back then that the open-source driver is pretty much just an abstraction layer, and that most of the actual code runs in the firmware, which is proprietary.

        • brenno@lemmy.world · 3 months ago

          The driver you’re likely referring to is NVK, which is also not developed by Nvidia. Check out the announcement post by Collabora; it says:

          As said above, NVK is a new open-source Vulkan driver for NVIDIA hardware in Mesa. It’s been written almost entirely from scratch using the new official headers from NVIDIA. We occasionally reference the existing nouveau OpenGL driver […]

          And also:

          a few months ago, NVIDIA released an open-source version of their kernel driver. While this isn’t quite documentation, it does give us something to reference to see how NVIDIA drives their hardware. The code drop from NVIDIA isn’t a good fit for upstream Linux, but it does give us the opportunity to rework the upstream driver situation and do it right.

          So they’re developing a driver based on headers made available by Nvidia and some of the reverse-engineered code from regular Nouveau. In fact, it seems to be a branch of Nouveau as it stands:

          Trying out NVK is no different than any other Mesa driver. Just pull the nvk/main branch from the nouveau/mesa project, build it, and give it a try

          So the “OSS drivers from Nvidia” aren’t what makes it work; it’s the whole community effort to build NVK from scratch.

          Regardless, it currently only supports Turing and newer cards (a quick way to check which Vulkan driver you’re actually running is sketched below). From the mesa3d docs:

          NVK currently supports Turing (RTX 20XX and GTX 16XX) and later GPUs. Eventually, we plan to support as far back as Kepler (GeForce 600 and 700 series) GPUs but anything pre-Turing is currently disabled by default.
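
          Not part of the comment above, but here is a minimal sketch of how you could check which Vulkan driver your system actually loads (the proprietary Nvidia one vs. Mesa's NVK). It shells out to vulkaninfo from vulkan-tools; treat the exact "NVK" driver-name string as an assumption about Mesa's output:

          ```python
          #!/usr/bin/env python3
          # Sketch: print the Vulkan device/driver names the loader picks up.
          # Assumes vulkaninfo (vulkan-tools) is installed; the "NVK" string for
          # Mesa's driver is an assumption, not guaranteed output.
          import shutil
          import subprocess

          if shutil.which("vulkaninfo") is None:
              raise SystemExit("vulkaninfo not found; install your distro's vulkan-tools package")

          summary = subprocess.run(
              ["vulkaninfo", "--summary"], capture_output=True, text=True, check=True
          ).stdout

          for line in summary.splitlines():
              if "deviceName" in line or "driverName" in line:
                  print(line.strip())  # e.g. a driverName line mentioning NVK on a Mesa/NVK setup
          ```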

        • null@slrpnk.net · 3 months ago

          They released them a while ago and they’ve gotten pretty good lately.

          Based on what?

    • Lulzagna@lemmy.world · 3 months ago

      The narrative around Nvidia seems to have done a complete 180 in the last year or two. I’m skeptical that it’s as good as many are stating.

      Are these OSS drivers maintained by Nvidia? Or is it Nouveau?

      • Ephera@lemmy.ml · 3 months ago

        I know practically nothing about Nvidia’s drivers, but I can see them doing a 180° flip, because they want to grab a chunk of that AI market.
        Even before LLMs, they were investing there, and now it’s just completely settled that tons of AI-related hardware needs to be either a beefy server machine or a low-profile edge PC. For both, Linux is very much preferable.

        • XenGi@lemmy.chaos.berlin · 3 months ago

          Nvidia has pretty much always dominated the AI GPU market with their closed-source driver and CUDA. Nothing has changed about that, except for more competition in AI-specific hardware, which you can now buy from several vendors. But no one has ever really used AMD cards with OpenCL for AI or ML. If you were serious about it, you always used Nvidia with CUDA or, nowadays, some dedicated AI accelerator card (DPUs).

    • Rustmilian@lemmy.world · 3 months ago

      Yes, and with the addition of NVK it’s gotten a lot better. There are still some issues on Wayland and with specific problem cards, but it’s not nearly as bad as it was even two years ago.

  • Yer Ma@lemm.ee · 3 months ago

    I’ve been using NVIDIA cards on Linux for 20 years… I don’t get this

    • TrickDacy@lemmy.world · 3 months ago

      I tried it for 2 years. After having a lot of weird issues, I finally upgraded to an AMD card and so many of those issues went away. For one, I can now install updates without worrying about breaking games or random graphical things. AMD has been way more solid.

          • Bondrewd@lemmy.world · 3 months ago

            Weeelll, not for long. The open kernel module works like a charm for me. Wayland support is now being actively worked on and is already functional for the most part. HDMI 2.1 has always been supported, while AMD hasn’t been able to offer it. HDR and 10-bit support also just dropped.
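
            Not from the comment above, but one rough way to tell whether the installed nvidia kernel module is the open variant rather than the classic proprietary one is its licence string. A minimal sketch; the "MIT"-containing vs. "NVIDIA" licence values are assumptions about what modinfo reports:

            ```python
            #!/usr/bin/env python3
            # Sketch: guess whether the installed nvidia kernel module is the open variant.
            # Assumption: the open kernel modules report a dual MIT/GPL licence via
            # modinfo, while the classic proprietary module reports "NVIDIA".
            import subprocess

            try:
                licence = subprocess.run(
                    ["modinfo", "-F", "license", "nvidia"],
                    capture_output=True, text=True, check=True,
                ).stdout.strip()
            except subprocess.CalledProcessError:
                raise SystemExit("no nvidia module installed (Nouveau/NVK setup, perhaps)")

            print(f"nvidia module licence: {licence}")
            print("looks like the open kernel module" if "MIT" in licence
                  else "looks like the proprietary kernel module")
            ```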

  • Willem@kutsuya.dev · 3 months ago

    Am I missing the joke here? We run a 3090 and a 3900X just fine on Arch Linux.

          • Rustmilian@lemmy.world · 3 months ago

            It depends, but yes. Screen recording has worked under Wayland for quite some time now.

            Spectacle, used in the example, isn’t very good at screen recording, but it’s the best I can manage with the size cap on Lemmy.
            You’ll likely have to do some configuration. I’d recommend using OBS Studio and PipeWire, as well as checking the AUR for patched versions if needed (if you happen to be on Arch, of course). A quick sanity check for those pieces is sketched below.
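
            A minimal sketch of that sanity check (not from the original comment); the binary and service names (obs, pipewire, wireplumber, xdg-desktop-portal) are assumptions based on a typical setup, so adjust for your distro:

            ```python
            #!/usr/bin/env python3
            # Sketch: check for the pieces OBS needs for Wayland screen capture.
            # Assumes typical binary/service names (obs, pipewire, wireplumber,
            # xdg-desktop-portal); adjust for your distro.
            import shutil
            import subprocess

            for binary in ("obs", "pipewire", "wireplumber"):
                print(f"{binary}: {'found' if shutil.which(binary) else 'MISSING'}")

            # OBS uses the desktop portal for Wayland capture; it usually runs as a user service.
            portal = subprocess.run(
                ["systemctl", "--user", "is-active", "xdg-desktop-portal"],
                capture_output=True, text=True, check=False,
            ).stdout.strip()
            print(f"xdg-desktop-portal: {portal or 'unknown'}")
            ```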

  • SagXD@lemm.ee · 3 months ago

    Question: Is buying a laptop with an Nvidia graphics card a bad idea for Linux (XFCE user)?

    • KISSmyOS@feddit.de (OP) · 3 months ago

      Yes, IMO. If you haven’t bought the hardware yet, there’s no reason to subject yourself to the headache of lacking Linux support; instead, support companies that value open source.
      AMD and Intel GPUs simply work out of the box with all features.

      And it’s not like you need the highest of high-end graphics acceleration on a laptop anyway.

    • nutbutter@discuss.tchncs.de · 3 months ago

      I own a 15-inch Omen with a 3060. It has some issues, but it works fine. However, my next one will definitely be AMD.

      One major issue is that I have to run my desktop manager (Mutter, for GNOME on Fedora) on the Nvidia GPU rather than the integrated AMD GPU, otherwise external monitors do not work at all. This is a problem because the dedicated GPU can never go to sleep and constantly uses at least 15 watts, reducing the battery life.

      Another issue is that, a lot of the time, my laptop won’t wake up after sleeping. I have checked the logs, and I am 90% sure it is because I log in to my desktop manager using the dedicated GPU. If you don’t need an external monitor, or if you have a dedicated MUX switch, you shouldn’t have to face any of these problems.

      A few minor problems: I cannot use the official builds with the Nvidia drivers if I want to use Secure Boot; for that, I have to rely on third-party developers. Another issue I saw some time ago: when I used Manjaro, the maximum TDP of the GPU never exceeded 79 watts, while on Fedora it goes up to 95 watts and on Windows it used to go up to 100 watts (a quick way to watch the live power readings is sketched below). Also, some software, like the keyboard lighting manager and the BIOS updater, works on Windows only, not even in a VM. The fans also never exceed 4099 RPM on Linux, whereas on Windows they could go up to 6500. But I have always seen Linux be 10-20% faster in my Blender render tests.

      I hope this helps. If you have any questions, feel free to DM.
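
      Not part of the reply above, but a quick way to watch the dedicated GPU’s live power draw and enforced limit (the 15 W idle figure and the 79 W vs. 95 W caps mentioned earlier) is to poll nvidia-smi. A minimal sketch, assuming the Nvidia driver stack is installed (nvidia-smi does not work with Nouveau/NVK):

      ```python
      #!/usr/bin/env python3
      # Sketch: poll the Nvidia GPU's power draw, power limit, and temperature.
      # Uses nvidia-smi's CSV query mode; requires the Nvidia driver stack.
      import subprocess
      import time

      QUERY = [
          "nvidia-smi",
          "--query-gpu=power.draw,power.limit,temperature.gpu",
          "--format=csv,noheader",
      ]

      for _ in range(5):  # five samples, one second apart
          sample = subprocess.run(QUERY, capture_output=True, text=True, check=True)
          print(sample.stdout.strip())
          time.sleep(1)
      ```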

    • Shareni@programming.dev · 3 months ago

      Most of the serious problems have to do with Wayland, so XFCE will be fine. I’m running it on a T480 with a GeForce MX150 without trouble.

      If it’s a good deal, take it. Even if you do decide to switch to Wayland at some point, those issues should be mostly fixed soon™.

  • Zewu@lemmy.world · 3 months ago

    The funny thing is that the vast majority of Nvidia GPUs are probably used in Linux-based systems because of the ML/AI hype.

    • Dudewitbow@lemmy.zip · 3 months ago

      That’s what happens when you have the money to write software for your hardware and not give back to open source, because $$$.

  • Chocrates@lemmy.world · 3 months ago

    I just bought an Nvidia GPU because my main use is VFIO, and AMD is hard to make work.
    I’m still salty.