• Eochaid@lemmy.world

    No time for distrowars

    …says the guy that makes a meme shitting on users of every other distro.

  • Lightning66@lemmy.world

    Honestly, what is wrong with ‘just works’? If the policies behind the project are sound and the security and privacy are all in place, there is nothing wrong with using this option.

    For Linux to grow it needs to be more ‘just works’. Let the complex stuff and the simple stuff both be there. It’s not one or the other.

    • AggressivelyPassive@feddit.de

      Exactly.

      When I was younger, tinkering around was a hobby in itself. But today I actually use my machine, and I want it to work without hassle. I don’t want to think about swap partition sizes, modeset kernel parameters and that kind of stuff. I want a reliable tool.

      That’s why so many devs use MacBooks. They’re essentially Unix machines with a proper GUI and mostly work absolutely flawlessly.

      I’ve been using MacBooks for over ten years now and had exactly one crash: when the drive was failing so hard, it couldn’t even spin up anymore.

      • FancyFeaster@lemmy.fail

        This is exactly me. For a server it’s Linux but for everyday use/work a MacBook Pro is great. It just works. It’s great as you can fire up the command line to manage Linux servers easily. That’s how I admin my Lemmy Ansible install.

        For gaming I use Windows. It’s all about the best tool for the job.

    • pewpew@feddit.it

      Why should I use Arch btw if Ubuntu does everything I need? It’s not some locked-down OS like Windows, and I can tweak it however I want.

      • Prager_U@lemmy.world

        You might want to configure it from scratch, with exactly the tools and utilities you want (e.g. networking utility, desktop environment). Or you might just find this process fun and interesting. Some people take issue with how Canonical is run, and decisions they make.

      • stewie3128@lemmy.world

        I think it’s funny that so many Linux users talk about how locked down Windows is, when 90% of them live in an effective walled garden defined by their package manager, or some other inborn restriction of their distro. I doubt that even 10% are compiling from source with any regularity.

        Why do you need to wait for someone to repackage FF for you before you can install it? Just go get it if you run Arch BTW, but you know the overwhelming majority of ArchBros really only know how to install it through Pacman.
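
        For what it’s worth, “just going and getting it” isn’t much typing either. A rough sketch of both routes on Arch (the URL is Mozilla’s generic “latest” download redirect, so double-check it against mozilla.org, and the paths are just one common choice):

          # the package-manager route
          sudo pacman -S firefox

          # the "go get it yourself" route: grab the upstream tarball and unpack it
          curl -L -o firefox-latest.tar \
            "https://download.mozilla.org/?product=firefox-latest&os=linux64&lang=en-US"
          sudo tar xf firefox-latest.tar -C /opt    # tar auto-detects the compression; unpacks to /opt/firefox
          sudo ln -s /opt/firefox/firefox /usr/local/bin/firefox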

        • velioc@discuss.tchncs.de

          What’s wrong with installing software from a package manager when the package I need is on there and has a decently up-to-date version? If it’s not on there I can still build from source.

          When I’m in a situation where I just need a specific lib or CLI tool or whatever and don’t have time to potentially debug a niche compile error, installing from a package manager is more convenient and saves time.

          Except snap, which can burn in hell.
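
          As for the “build from source” fallback, the classic dance is roughly this (the project name and URL are placeholders, and real projects vary between autotools, CMake, meson, and so on):

            # hypothetical project; substitute the real repository
            git clone https://example.org/some-tool.git
            cd some-tool
            ./configure --prefix="$HOME/.local"   # keep it out of the package manager's territory
            make -j"$(nproc)"
            make install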

    • ashtefere@lemmy.world

      Fedora definitely doesn’t “just works”. Try installing the proprietary NVIDIA drivers then updating your kernel.
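
      For context, the route most guides point to is RPM Fusion’s akmod packages, which are supposed to rebuild the module on each kernel update; roughly, per RPM Fusion’s own howto (double-check there before pasting):

        # enable the RPM Fusion free and nonfree repositories
        sudo dnf install \
          https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
          https://mirrors.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm

        # install the akmod-based NVIDIA driver, give it a few minutes to build the module, then reboot
        sudo dnf install akmod-nvidia

        # if a kernel update leaves you without the module, rebuilding it manually usually helps
        sudo akmods --force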

      • pascal@lemm.ee

        Ford definitely doesn’t “just works”. Try installing a jet engine on the roof then fueling it with unleaded.

        I don’t want to blame you, but I think sometimes Nvidia really enjoys messing with Linux users.

        • Ullallulloo@civilloquy.com

          Regardless of whose fault it is, it’s unacceptable that half the people with a discrete GPU have nigh-incompatible hardware. It’s more akin to snow tires breaking your car than to installing a jet engine.

      • Aganim@lemmy.world

        Not just that, but ever since F32 every single fricking update managed to either break something completely or make some part of the OS too unstable for daily use. Bluetooth issues, a crashing display server, the system hanging on suspend, a broken bootloader on some Secure Boot systems (handover from UEFI to the bootloader no longer happening), rendering the system completely unable to boot… Just some of the issues I ran into when using Fedora as my daily driver for well over a year.

        Fedora is great when it works, but always keep in mind that having a bleeding edge system comes at the cost of stability.

      • Professor_Piddles@sh.itjust.works

        I keep reading this, but I haven’t had any issues at all over the past year with Fedora KDE and proprietary Nvidia drivers installed via flatpak. Is it more of a problem when installed via dnf?

      • Hadriscus@lemm.ee

        That was my experience ten years ago: a mobile GeForce 660 with “Optimus” and two flavours of drivers, neither of which worked reliably. I remember fiddling with Nouveau & Bumblebee for hours. I should try another, more stable distro on my desktop, but I rely a lot on some Windows-only programs.

    • Commiunism@lemmy.wtf

      ‘Just works’ is definitely something Linux should strive for, but at least in my experience and in the experience of my friends, ‘just works’ has always been a poor experience.

      What I’m talking about is how you install a ‘just works’ distro like Mint or Garuda, and then some package, or maybe some hardware such as a sound card or a multi-monitor setup, refuses to work, so you gotta go troubleshooting, which isn’t very ‘just works’. What’s worse is that some of the issues aren’t talked about or documented, so you pretty much have to rely on making a post and waiting potentially hours for a response to get help. It’s also very hard to troubleshoot the system by yourself if you don’t have experience, as you don’t really know what’s running under the hood, i.e. what came prepackaged by the distro.

    • PlumberOfDeath@lemmy.world

      Complaining that something works, or that people prefer things that work, is a very backasswards critique, and it reinforces the stereotype that home Linux users are just nerds who only like to tinker (which is only partially true).

    • ddkman@lemm.ee

      To be fair, I’ve been using Mint, and whilst THE FUCKING MULTI-MONITOR DOESN’T FUCKING WORK (ugh, I wanna punch a drywall!), otherwise it has been surprisingly smooth. Especially since it is my main computer, and I use it to burn discs for older game systems (incl. X360!!!), Unity development, and a bunch of other stuff. So I have to say, it is VERY close to ‘it just works’.

        • ddkman@lemm.ee

          It works, but it kinda forgets the monitor layout, especially if you remove the computer from the dock while the OS is sleeping.

          It is a pain in the ass to set it up again, especially since it thinks it is a great idea to use the inbuilt monitor, even though the lid is shut.

          Also, when you undock while suspended, it sometimes forgets to check the outputs after waking up, and some programs, especially fullscreen video playback, have a tendency to continue on a “ghost display”.

          Overall it is livable but annoying, especially because it only ‘just works’ about 33% of the time.

          Also, this is Xfce. Cinnamon and MATE may be much better.
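
          One workaround on Xfce/X11 is to script the layout with xrandr and re-run it after docking or waking up (the output names below are made up; xrandr --query lists the real ones):

            # list the real output names first
            xrandr --query

            # hypothetical layout: laptop panel off, two external monitors side by side
            xrandr --output eDP-1 --off \
                   --output DP-1 --auto --primary \
                   --output DP-2 --auto --right-of DP-1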

          • Hadriscus@lemm.ee

            Ah, Xfce is the lightweight DE, right?

            And what is this dock exactly? I’m not sure what you’re referring to.

            • ddkman@lemm.ee

              Well yeah, “lightweight”. The only one that uses slightly fewer resources than Windows 10.

              A docking station for a laptop? Pretty common device. A specialised port replicator.

              • Hadriscus@lemm.ee

                Ok, I’m not up to speed on these things. I use several monitors on my desktop computer only. I suppose this would work better than with a laptop, in the event I chose xfce as my DE ? I don’t usually hotswap monitors, they’re always plugged in.

                Thanks for clarifying

  • hot_milky@lemmy.ml

    Isn’t the point of this meme for the low-IQ and high-IQ people to have the same preference? Anyway, I’m on Linux Mint usually -_-

  • SeaJ@lemm.ee

    Well I can tell you why Linux does not have a higher adoption rate: toxic shit like this.

    • OrnateLuna@lemmy.blahaj.zone

      Nah, the biggest and main reason why Linux doesn’t have a higher adoption rate (on desktop) is that it’s not preinstalled on the devices you buy.

      There are obviously other factors, but they are minuscule in comparison.

      • kmkz_ninja@lemmy.world

        Most people don’t want to have to use a cmd line to use their PC.

        Edit: Seriously, why is it such a confusing prospect to Linux users that Linux is difficult? Literally, every thread on here comparing distros is filled with:

        “I used debian, but I had to update it every day or my graphics drivers would fail.”

        “Oh to fix that regularly occurring issue, just type ‘cgreg320 -I1I0O xx /*poweruninstall the year your motherboard was manufactured’ into the command prompt.”

        “Oh yeah, Nvidia graphics cards, AMD motherboards, Steam, Chrome, Adobe products, left-handed mice, and the letter F are unsupported on this distro.”

        Windows is easy. Not great, but easy.

        • ClumZy@sh.itjust.works

          C’mon, this might have been true 15 years ago, but my grandma has been using Mint for 5+ years and TRUST ME she don’t know shit about Bash. Big distros work OOTB today; as long as you stick to regular use you’ll never see a shell in your life.

          • GamingChairModel@lemmy.world

            At a certain point, though, you have to wonder whether a traditional desktop Linux distro is better for regular users than just preinstalled ChromeOS on a Chromebook.

          • Jeanschyso@lemmy.world

            I revived an old computer using Mint, and it works great, but that’s for my brother, who just browses and does spreadsheets and writing. I’m a bit more involved with how I use a computer, and it was difficult enough to set up a wireless Xbox controller that I am considering automating it for future use and making that public. Note that I know fuck all about how to even begin, and I might give up halfway through, but the point stands that the motivation was triggered by a lack of user friendliness.
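
            If it helps, the Bluetooth pairing part can at least be done with bluetoothctl; a minimal sketch (the controller address is a placeholder, and some newer controllers also want the separate xpadneo driver):

              bluetoothctl              # opens an interactive prompt
              # inside the prompt:
              power on
              agent on
              scan on                   # note the controller's address, e.g. AA:BB:CC:DD:EE:FF
              pair AA:BB:CC:DD:EE:FF
              trust AA:BB:CC:DD:EE:FF   # so it reconnects on its own next time
              connect AA:BB:CC:DD:EE:FF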

        • zbecker@mastodon.zbecker.cc

          @kmkz_ninja @OrnateLuna

          I know people who use linux mint (or other distros that aim at user friendliness) who literally never have to touch the command line. This claim that you need to use the command line was true 5 years ago, but today it is largely false.

          I am in a Linux User Group and I am literally the only person who uses a tiling window manager (I use hyprland) instead of DEs like kde, gnome, cinnamon, etc.

          • Abnorc@lemm.ee

            I feel like it depends on which distro and hardware you choose. I remember having some weird PCIe errors on Xubuntu on an HP PC, and I couldn’t find a fix online. Windows is pretty hassle-free on almost all hardware, probably owing to the fact that all the hardware is made to work with Windows (or owing to Windows’ excellent compatibility, maybe both?)

            Using regular Ubuntu on a laptop now, it’s pretty seamless though. I haven’t had to do any command-line stuff for setup as of yet, so it’s getting better.

          • SeaJ@lemm.ee

            Tried out Pop OS for my laptop, which is generally seen as a simpler distro. I had to hit up the terminal to attempt fprintd. Getting a fingerprint registered was a pain in the ass. Then when I did get it registered, I could not log back in through the UI. I’ll still likely switch to it sometime soon and send the logs to fprintd to eventually get it fixed, but it was still frustrating as hell since fingerprint scanners are a pretty basic feature nowadays.
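
            For the record, the sequence that is supposed to work on Ubuntu-based distros like Pop is short, assuming the fprintd and libpam-fprintd packages; whether a given sensor is actually supported is a separate question:

              sudo apt install fprintd libpam-fprintd
              fprintd-enroll            # enroll a finger for the current user
              fprintd-verify            # sanity-check the enrollment
              sudo pam-auth-update      # tick "Fingerprint authentication" so login/sudo can use it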

            The only issues I’ve really had with my Linux Mint VM are upgrade issues and my SMB mount occasionally failing. Both of those basically required the terminal.

            Don’t get me wrong, Linux is a fuck ton easier than it was 15 years ago when I started testing it out. But there still is a ways to go.

        • MrBubbles96@lemmy.ml

          “Seriously, why is it such a confusing prospect to Linux users that Linux is difficult?”

          Because honestly? It really isn’t. A couple of years ago, maybe it was hard, but now Linux is easy to pick up and learn; so easy that even someone like me, who has zero programming/coding skills (not my profession) and still kinda thinks typing stuff into the terminal is basically black magic, was able to pick it up and adopt it with very few hiccups, and set it up for my ma on a separate computer with no problems on her end. Unlearning Windows? That’s the hard bit, especially if you go into Linux (or even Mac, as was the case with me a long time ago) thinking it’s Windows with another skin instead of a different beast altogether that has its own quirks one needs to get used to, just like with anything new (and just how the majority are used to Windows’s own quirks). That’s where you’ll start having a bad time.

        • dezmd@lemmy.world

          Most people don’t want to have to use a cmd line to use their PC.

          “I don’t want know that car needs oil changes, I just want turn key and go”

          “Uh oh, car no start.”

          /throws car in trash and buys a new one

    • H2207@lemmy.world

      Feuds between distros will always exist, like feuds between car manufacturers. It’s just banter, except some people take it wayyyy too far.

    • Avid Amoeba@lemmy.ca

      All the parrots doing Ubuntu bashing over the last few years are really hurting adoption, in my opinion. It is still the best Linux OS for new users for many reasons, even if there are many others that might be better suited for other uses or preferences.

      • WeirdGoesPro@lemmy.dbzer0.com

        Riddle me this: I’ve used Windows, macOS, Debian, Fedora, and Ubuntu to host a Plex server over the last 12ish years, and Ubuntu has been the most stable, hands down. Currently I’ve got a bunch of VMs on Proxmox, but Plex still hums away on an Ubuntu Server LTS VM without a hitch.

        I have plenty of reasons to choose other distros for specific needs, but when I want something to just work and be easy on me, Ubuntu is the right choice, and it is definitely a solid place for anyone to start getting into the Linux way of life.

        • Avid Amoeba@lemmy.ca

          No riddle really. The last time I checked, Ubuntu was the most used server Linux OS. Just like RHEL, it’s tested for and used in the enterprise, but unlike RHEL, everyone gets the same copy, including you and me. It follows that it should be solid. A big part of that comes from Debian of course, but there’s additional testing and patching in Ubuntu. It’s no wonder it just works.

    • TunaCowboy@lemmy.world

      This is parroted all the time, all the while linux is doing just fine.

      Why is adoption rate such an important metric?

      • GamingChairModel@lemmy.world

        Serious answer: we need a sizeable installed user base so that the cross-platform developers don’t leave us behind. I found this article to be a pretty compelling analysis of how dependent we are on “scraps” from MacOS/Windows versions of web browsers, and how the Mozilla foundation might not prioritize desktop Linux if it runs into financial difficulties. The recent Red Hat controversy is also a reminder of how much Linux as a whole depends on financial backing from deep-pocketed corporations.

    • Efwis@lemmy.zip

      A distro is the distribution of the OS you want to use when it comes to Linux, such as Fedora, Ubuntu, Mint, Arch, etc. The symbols are the icons for the individual OSes.

  • Transcriptionist@lemmy.world

    Image Transcription:

    A bell curve featuring numerous wojaks and Linux distribution icons by IQ score. From left to right they are:

    At the left 0.1% end of the bell curve, with no IQ score labels, is a boomlet wojak accompanied by the Ubuntu icon and the text: WHERE START BUTTON?

    Between 0.1% and 14% on the left side of the bell curve, encompassing the IQ scores 55 and 70, is an NPC wojak accompanied by the Arch icon and the text: I USE ARCH BTW

    Between 14% on the left side of the bell curve and approximately 34% on the right side, encompassing the IQ scores 85, 100, and 115, is a crying Zoomer wojak accompanied by the Fedora icon and the text: JUST WORKS

    Between 34% and 0.1% on the right side of the bell curve, encompassing the IQ scores 130 and 145, is a big brain wojak accompanied by the Gentoo icon and the text: K.I.S.S

    At the right 0.1% end of the bell curve is a light brown hood wojak accompanied by the Debian icon and the text: NO TIME FOR DISTROWARS

    [I am a human, if I’ve made a mistake please let me know. Please consider providing alt-text for ease of use. Thank you. 💜]

  • Meowoem@sh.itjust.works

    Quick, attack users of the most popular distro before normal people start using Linux! We can’t allow a good, stable and perfectly usable distro to get popular; we need to bully everyone back to Windows or terrible things might happen, like the year of the Linux desktop!!

  • Ilflish@lemm.ee

    I use Linux Mint because I like Mint Ice Cream

    I use Parrot in honour of my parrot Loba

    I use Ubuntu Mate because I’ve always wanted one

    I use Peppermint because it’s my favourite flavor of gum

    I use Rocky Linux because he’s my favourite American Hero

    I use fedora because I know it will come back in style

  • gravity@infosec.pub

    This was me, except I went straight from Ubuntu to Debian. At some point I wondered why I was doing all this manual maintenance. I realized that Ubuntu relies on Debian and so I switched. Haven’t looked back.

    • rambaroo@lemmy.world

      My last Ubuntu install would break my shit all the time. Debian is so much more reliable it’s incredible. I haven’t had to mess with anything in over a year, whereas Ubuntu required constant maintenance. It’s a shame it’s so popular, because Canonical seems to be absolutely awful at testing their package updates compared to any other common desktop distro. I’ve heard of far fewer issues with Debian, Fedora and SUSE.

      • Haui@discuss.tchncs.de

        Interesting. I’ve installed Ubuntu Server on my home server and added a couple of services like two years ago. I’m constantly improving stuff and so far, nothing has gone wrong. I also had a couple of Ubuntu servers at work, no issues like ever.

        I also installed Ubuntu desktop recently and it’s a little buggy (my fault, as I didn’t use the LTS version I suppose).

  • mailerdaemon@lemmy.world

    I’ve been a Linux user since installing Slackware from floppy discs. These days I run Mint on my desktop/laptop and Ubuntu on servers. Does this make me weak?

    • SuperDuper@lemmy.world

      These days I run Mint on my desktop/laptop and Ubuntu on servers. Does this make me weak?

      It makes you a king because you use what’s best for you.

    • Doubletwist@lemmy.world

      I started similarly with Yggdrasil, but quickly moved to Slackware, downloading floppy images on a 2400bps modem.

      These days I use Xubuntu on my desktops/laptops and Debian on my servers.

      While back in the day I (to quote Weird Al) “beta tested every operating system, gave props to some, and others, I dissed 'em”, I just haven’t got time to deal with all that any more.

      • mailerdaemon@lemmy.world

        I was actually a Xubuntu user for a long time, but tried Mint with Cinnamon, and found lots of things much easier and more polished, while maintaining the lightweight feel that XFCE provided.

  • GreenMario@lemm.ee

    Distro swapping is a rite of passage. The grass is always greener. Until you settle and stop caring about the OS at all. Which is why I went back to Windows (7 at the time), mainly for gaming compatibility.

    Proton got me hnnng tho. I’ll definitely be giving either EndeavourOS or openSUSE a go when I build my next computer. A rolling distro sounds like a “set it and forget it” thing and I like that.

    • Fernando-678@kbin.social

      Installing a new distro feels so good to me, makes me happy. I love messing with the settings and stuff, trying a new desktop environment, messing with the native apps. Man, I love it.

      I’m using OpenSuse’s Gecko with rolling release. It’s beautiful.

      • GreenMario@lemm.ee

        Been trying out EndeavourOS in a VM for a bit. Might be my next home if I don’t fall back to good Ol Reliable Debian.

        • Corroded@leminal.space

          I’d say EndeavourOS is the way to go. So far I haven’t had nearly as many issues as I had with Manjaro.

    • Carter@feddit.uk

      I started with Arch and loved it but just recently switched to openSUSE and it might be even better.

    • okfuskee@lemmy.world

      Garuda Linux. I’m running the dragonized gaming distro and have fallen in love with it. A buddy turned me on to it a few months back and it’s perfect. Runs all my Steam games through Proton like a champ.

    • Loudambiance@kbin.social

      I got everything working great in Fedora with Proton, even my NVIDIA drivers. Then a buddy had the idea that we all get and play MWII, which cannot run in Proton, so now I’m back to Windows.

      • GreenMario@lemm.ee

        This is where keeping a pocket Windows dual boot is handy. Probably kept just big enough for two games tops.

  • pascal@lemm.ee

    Debian is for people who have shit to get done and don’t care about a neon-colored wallpaper, and mostly don’t have a wallpaper at all.

    • Intralexical@lemmy.world

      Nix is great. But I don’t think I’d want to use it for a desktop OS base.

      (Disk space/cycle life potential, binary cache misses, broken packages, and complete incompatibility with everything else. User error, TBH, but also stuff that’s not really a problem with other systems. Well worth it as a package manager, though.)

        • Intralexical@lemmy.world

          I’m saying that’s a way I might personally consider going if I were to set up a new computer. Rock solid base that you can still get normal packages and binaries to run on without much hassle if needed, plus Nix with more up-to-date packages that you can customize however you find most useful.

          Personally I have a mix of rolling/regular repos, AUR, Nix, Flatpak, and static binaries. They all have their uses, TBH.
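
          As a sketch of that “stable base plus Nix on top” setup (installer command as documented on nixos.org, and the package attribute is just an example):

            # official installer, multi-user/daemon mode
            sh <(curl -L https://nixos.org/nix/install) --daemon

            # pull one newer package from nixpkgs without touching the host distro's packages
            nix-env -iA nixpkgs.ripgrep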

      • yiliu@informis.land

        I’ve been using it on various desktops, as a package manager but mostly as the full OS, for 6 years or so. I would hate to switch back.

        Disk space is an issue… I’ve seen the OS take as much as 100 GB. But in a world of 2TB SSDs for $100, is that a big deal?

        I don’t see why NixOS would be any worse for the lifetime of a disk than other distros.

        I’ve only hit binary cache misses for packages I created, or where I changed build options. IOW: a binary cache miss means Debian wasn’t gonna have it anyway. And on the flip side: you can change package build options! Neat!
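
        To illustrate the build-options point, overriding a flag and rebuilding locally is a one-liner (the package and flag names here are hypothetical; the real ones live in the package's definition in nixpkgs):

          # rebuilds the package from source with the overridden option, so expect a cache miss
          nix-build -E 'with import <nixpkgs> {}; somePackage.override { someFlag = true; }'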

        Broken packages are, if anything, less of a problem with Debian. Debian has lots of packages that are…not broken, but incomplete, requiring lots of manual config or whatever. NixOS is way better at that stuff.

        User error? Yeah, fair. I’m a programmer by trade, but I can definitely see how it’d be a bit much if I weren’t.

        But oh man…you should’ve seen how trivial it was to switch from PulseAudio to PipeWire (including Jack support etc), leaving no trace that Pulse was ever installed… Or switching from X to Wayland, on a system that I’ve been doing rolling updates on since 2017, all with a clear conscience… It’s beautiful.

        • Intralexical@lemmy.world

          Disk space is an issue… I’ve seen the OS take as much as 100 GB. But in a world of 2TB SSDs for $100, is that a big deal?

          Yes? Storage used for the OS is space not used for projects, entertainment, docs, redundancy, snapshots, avoiding fragmentation (EXT4), etc. Money spent on SSDs is money not spent on going out, food, meeting people, basic needs, other hardware, etc.

          I don’t see why NixOS would be any worse for the lifetime of a disk than other distros.

          Untested, but I’d assume high space use combined with high update frequency, plus occasional builds-from-source and multiple simultaneous package versions, means more disk writes.

          Biased, maybe, because manual GC means you see disk use tick up more than in other package managers, and also because I personally repeatedly rebuilt a custom gigabyte-sized Derivation dozens/hundreds of times. But I think it’s a reasonable point of caution.
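
          For what it’s worth, the usual space-reclaiming knobs are just:

            # drop old generations and anything no longer referenced
            nix-collect-garbage --delete-older-than 30d

            # hard-link identical files in the store to deduplicate them
            nix-store --optimise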

          I’ve only hit binary cache misses for packages I created, or where I changed build options.

          Broken packages are, if anything, less of a problem [than] with Debian. Debian has lots of packages that are…not broken, but incomplete, requiring lots of manual config or whatever.

          Maybe this is a NixPkgs vs NixOS thing. Also, using Nix mostly to supplement packages I hadn’t already installed through my distro probably meant I hit more fringe areas. But I’ve even encountered cache misses and failed builds for some pretty big Python libraries on certain commits.

          Debian-based out-of-the-box functionality for stuff is indeed also Not Great, IIRC: stable, but yeah, sometimes maybe a bit “incomplete”. Actually, Arch-based has worked well IME.

          And on the flip side: you can change package build options! Neat!

          But oh man…you should’ve seen how trivial it was to switch from PulseAudio to PipeWire (including Jack support etc), leaving no trace that Pulse was ever installed… Or switching from X to Wayland, on a system that I’ve been doing rolling updates on since 2017, all with a clear conscience… It’s beautiful.

          Yeah. I personally don’t care about that stuff unless it directly impacts something I’m working on.

          And that’s why I say Nix is a great tool for package management, but not something I’d personally want to use as an OS base. When you’re already elbow-deep in the plumbing anyway, Nix makes it way easier to swap components out. But when you just want to install and use an application, editing Nix configs feels like more work, and it’s so much easier to just pacman/yum/apt-get install firefox or whatever and get on with your day.


          Plus, some specific red flags surrounding stability and interoperability:

          1. ALSA is apparently hardcoded to just straight-up not work with a Nix root. Not sure how NixOS handles it, but in my specific use case, I had to symlinkJoin{paths=[alsa-lib alsa-plugins]} so they could find each other (a rough sketch of that invocation is below, after this list). Pretty sure it took a lot of strace -f -e trace=file and nix-locate for me to figure this one out, just to get sound working.

          2. QtWebEngine/Chromium has to be run through some kind of sed -e "whatever.so" to “Patch library paths in Chromium sources” in order to even run, because it’s also hardcoded to just not work with a Nix root. IIRC, this one I figured out by just straight-up grepping on the compiled binaries after seeing the errors in strace or whereever. Seems a bit ridiculous, using a RegEx to patch a web browser when installing it so it can even run.

          3. Binaries aren’t safe either, because they probably need patchelf to be able to run on Nix.

          4. Flakes are apparently hosted as user repositories on a Microsoft-owned website, and can just randomly disappear sometimes.

          5. Qt generally takes a ton of extra steps to be able to run on Nix. And have you actually ever opened the wrapper the Nix hooks generate to see what it’s actually doing? For one of my applications just now, you get a 43kb Bash script with apparently 581 assignments to just a handful of QT and XDG-related environment variables.

          6. OpenGL doesn’t look safe either. Nix handles the drivers its own way, so to get OpenGL for Nix packages to work on other systems, you have to jump through some hoops. I assume the same amount of work in the opposite direction would be needed to use EG proprietary or statically compiled graphics applications on NixOS too.

          7. Running precompiled binaries on Nix looks… Involved, as well. Sure, there’s tools to automate it. But that only hides the complexity, and adding an opaque dependency sorta defeats the entire purpose of configurability and composability IMO.
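
          A rough sketch of the symlinkJoin invocation from point 1, built ad hoc from the shell (how the resulting ./result path then gets wired into the ALSA config depends on the setup):

            # merges alsa-lib and alsa-plugins into a single tree under ./result
            nix-build -E 'with import <nixpkgs> {};
              symlinkJoin { name = "alsa-joined"; paths = [ alsa-lib alsa-plugins ]; }'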

          I’m sure most of these problems are “solved”, in the sense that NixOS implements workarounds that are the default when you install the affected derivations, and there are wrappers written for most other cases. But all of that adds maintenance, fragility, and complexity. It remarkably works well enough for userspace, but stuff like this still feels a bit house-of-cards-y for the basic OS and desktop. It’s not Nix’s fault, but so much of the work that goes into Nix seems to be just to force software that was never designed for it to run on it. Ultimately, the Linux FHS has momentum and adoption. Nix’s technical design might be compelling, but so are interoperability, stability, and simplicity.

          The NixOS enthusiasts are doing a lot of technically interesting work, but I personally find the results of that work most useful outside the NixOS ecosystem. And I do think Nix as a package manager is really great. Ever since I’ve installed it, I’ve basically incorporated it as a major component or tool in every sizable software project I’ve since started. But I just personally wouldn’t want to base an entire OS on it.