Which areas of Linux would benefit most from further standardization?
-
Nix can deal with this kind of problem. It does take disk space if you're going to have radically different deps for different apps, but you can 100% install Firefox from 4 years ago and new Firefox on the same system, and each gets the deps it needs.
Someone managed to install Firefox from 2008 on a modern system using Nix. Crazy cool: https://blinry.org/nix-time-travel/
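For the curious, a minimal sketch of the modern flakes-style way to do this (requires flakes enabled; the revision below is a placeholder, not a real pin, and the linked article does something more involved for 2008-era nixpkgs):

  # Run today's Firefox from the current nixpkgs flake:
  nix run nixpkgs#firefox

  # Run an old Firefox by pinning nixpkgs to an old commit
  # (substitute a real revision for the placeholder):
  nix run github:NixOS/nixpkgs/<old-commit-hash>#firefox

Both store paths coexist under /nix/store, each with its own closure of dependencies.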
-
It works under Windows because Windows binaries come with all their dependency DLLs (and/or they need some ancient Visual C++ runtime installed).
This is more or less the Flatpak way: bundle all dependencies into the package.
Just use Linux the Linux way: install your program via the package manager (including Flatpak) and let that handle the dependencies.
I've run Linux for over 25 years now and have had maybe a handful of cases where userland broke, and that was because I didn't follow what I was told during a package upgrade.
The amount of time I had to spend getting out of DLL hell on Windows, on the other hand...
The Linux way is better and way more stable.
-
The diversity of Linux distributions is one of its strengths, but it can also be challenging for app and game development. Where do we need more standards? For example, package management, graphics APIs, or other aspects of the ecosystem? Would such increased standards encourage broader adoption of the Linux ecosystem by developers?
Domain authentication and group policy analogs.
-
Generally speaking, Linux needs better binary compatibility.
Currently, if you compile something, it's usually dynamically linked against dozens of libraries that are present on your system, but if you give the executable to someone else with a different distro, they may not have those libraries or their version may be too old or incompatible.
Statically linking programs is often impossible and generally discouraged, making software distribution a nightmare. Flatpak and similar systems made things easier, but it's such a crap solution and basically involves having an entire separate OS installed in parallel, with its own problems, like having a version of Mesa that's too old for a new GPU and stuff like that. Applications must be able to be packaged with everything they need; there is no reason for dynamic linking to be so important on Linux these days.
I'm not in favor of proprietary software, but better binary compatibility is a necessity for Linux to succeed, and I'm saying this as someone who's been using Linux for over a decade and who refuses to install any proprietary software. Sometimes I find myself using apps and games in Wine even when a native version is available, just to avoid the hassle of having to find and probably compile libobsoletecrap-5.so.
Statically linking is absolutely a tool we should use far more often, and one we should get better at supporting.
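A minimal sketch of what that can look like today, assuming musl-gcc is installed (package names vary by distro, and glibc-based static linking has its own caveats around NSS and dlopen):

  # hello.c: any ordinary C program
  musl-gcc -static -O2 -o hello hello.c

  # Verify the binary carries no runtime library dependencies:
  file hello    # should report "statically linked"
  ldd hello     # should report "not a dynamic executable"

The resulting binary runs on pretty much any Linux of the same architecture, regardless of distro or library versions.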
-
Have you tried recently? We've been pretty much at parity for years now. Almost every game that doesn't run is because the devs are choosing to make it that way.
Still can't play any 3d games on Qubes OS
-
so, YaST?
I agree. openSUSE should set the standard in this.
Tbf, they really need a designer to polish it visually a bit; it exudes its "sysadmin only" vibes a bit too strongly, in my opinion.
-
I'm primarily talking about Win32 API when I talk about Windows, and for Mac primarily Foundation/AppKit (Cocoa) and other system frameworks. What third-party libraries do or don't do is their own thing.
There's also nothing wrong with bundling specialized dependencies in principle if you provide precompiled binaries. If it's shipped via the system package manager, that can manage the library versions and in fact it should do that as far as possible. Where this does become a problem is when you start shipping stuff like entire GUI toolkits (hello bundled Qt which breaks Plasma's style plugins every time because those are not ABI-compatible either).
The amount of time I had to spend getting out of DLL hell on Windows, on the other hand... The Linux way is better and way more stable.
Try running an old precompiled Linux game (say Unreal Tournament 2004 for example). They can be a pain to get working. This is not just some "ooooh gotcha" case, this is an important thing that's missing for software preservation and cross-compatibility, because not everything can be compiled from source by distro packagers, and not every unmaintained open-source software can be compiled on modern systems (and porting it might not be easy because of the same problem).
I suppose what Linux is severely lacking is a comprehensive upwards-compatible system API (such as Win32 or Cocoa) which reduces the churn between distros and between version releases. Something that is more than just libc.
We could maybe have had this with GNUstep, for example (and it would have solved a bunch of other stuff too). But it looks like nobody cares about GNUstep and instead it seems like people are more interested in sidestepping the problem with questionably designed systems like Flatpak.
Unreal Tournament 2004 depends on SDL 1.3, if I recall correctly, and SDL is not a system library on Linux or on any other OS.
Binary-only programs are foreign to Linux, so yes, you will get issues integrating them. Linux works best when everyone plays by the same rules, and for Linux that means source availability.
Linux at its core is highly modifiable; besides the kernel (and nowadays maybe systemd), there is no core system that an API could be defined against.
Linux on a home theater PC is a different system than Linux on a server, than Linux on a gaming PC, than Linux on a smartphone. Linux has its own set of rules and its own way of doing things, and trying to force it to be something else cannot and will not work.
-
The diversity of Linux distributions is one of its strengths, but it can also be challenging for app and game development. Where do we need more standards? For example, package management, graphics APIs, or other aspects of the ecosystem? Would such increased standards encourage broader adoption of the Linux ecosystem by developers?
Filesystem interactions. For example, in file open/save dialogs directories are sometimes grouped at the top and sometimes mixed in alphabetically with files. My preference is for them to be grouped, but being consistent either way would be nice.
-
One that Linux should've had 30 years ago is a standard, fully-featured dynamic library system. Its shared libraries are more akin to static libraries, just linked at runtime by ld.so instead of ld. That means that executables are tied to particular versions of shared libraries, and all of them must be present for the executable to load, leading to the dependency hell that package managers were developed, in part, to address. The dynamically-loaded libraries that exist are generally non-standard plug-in systems.
A proper dynamic library system (like in Darwin) would allow libraries to declare what API level they're backwards-compatible with, so new versions don't necessarily break old executables. (It would ensure ABI compatibility, of course.) It would also allow processes to start running even if libraries declared by the program as optional weren't present, allowing programs to drop certain features gracefully, so we wouldn't need different executable versions of the same programs with different library support compiled in. If it were standard, compilers could more easily provide integrated language support for the system, too.
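As a concrete contrast, here is roughly how the two formats record dependencies (the binary and library names below are placeholders):

  # An ELF executable records only exact sonames -- nothing about
  # API levels or whether a dependency is optional:
  readelf -d ./myprog | grep NEEDED
  #   (NEEDED)  Shared library: [libfoo.so.5]

  # A Mach-O binary on Darwin also records the compatibility version
  # the library promises to honor:
  otool -L ./myprog
  #   libfoo.5.dylib (compatibility version 5.0.0, current version 5.3.0)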
Dependency hell was one of the main obstacles to packaging Linux applications for years, until Flatpak, Snap, etc. came along to brute-force away the issue by just piling everything the application needs into a giant blob.
The term "dependency hell" reminds me of the "DLL hell" Windows devs used to talk about. Something must have changed around 2000, because I remember an article announcing "no more DLL hell," but I don't remember what the change was.
-
Generally speaking, Linux needs better binary compatibility.
Currently, if you compile something, it's usually dynamically linked against dozens of libraries that are present on your system, but if you give the executable to someone else with a different distro, they may not have those libraries or their version may be too old or incompatible.
Statically linking programs is often impossible and generally discouraged, making software distribution a nightmare. Flatpak and similar systems made things easier, but it's such a crap solution and basically involves having an entire separate OS installed in parallel, with its own problems, like having a version of Mesa that's too old for a new GPU and stuff like that. Applications must be able to be packaged with everything they need; there is no reason for dynamic linking to be so important on Linux these days.
I'm not in favor of proprietary software, but better binary compatibility is a necessity for Linux to succeed, and I'm saying this as someone who's been using Linux for over a decade and who refuses to install any proprietary software. Sometimes I find myself using apps and games in Wine even when a native version is available, just to avoid the hassle of having to find and probably compile libobsoletecrap-5.so.
Disagree - making it harder to ship proprietary blob crap "for Linux" is a feature, not a bug.
-
Domain authentication and group policy analogs.
I'm surprised more user-friendly distros don't have this, especially the more commercial ones.
-
The diversity of Linux distributions is one of its strengths, but it can also be challenging for app and game development. Where do we need more standards? For example, package management, graphics APIs, or other aspects of the ecosystem? Would such increased standards encourage broader adoption of the Linux ecosystem by developers?
interoperability > homogeneity
-
The diversity of Linux distributions is one of its strengths, but it can also be challenging for app and game development. Where do we need more standards? For example, package management, graphics APIs, or other aspects of the ecosystem? Would such increased standards encourage broader adoption of the Linux ecosystem by developers?
Manuals or notifications written with lay people in mind, not experts.
-
Domain authentication and group policy analogs.
I've never understood putting arbitrary limits on a worker's laptop. I was always looking for ways to bypass them. Once I ended up using a VM, without limit...
-
Stability and standardisation within the kernel for kernel modules. There are plenty of commercial products that use proprietary kernel modules that basically only work on a very specific kernel version, preventing upgrades.
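To make the version tie concrete: a module records the exact kernel it was built against, and loading it anywhere else trips the vermagic check (the module name below is a placeholder):

  # Show the kernel a prebuilt module was compiled for:
  modinfo -F vermagic ./vendor_driver.ko
  # e.g. "5.15.0-91-generic SMP mod_unload modversions"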
Or they could just open source and inline their garbage kernel modules…
I don't use any of these, but I'm curious. Could you please give some examples?
-
Flatpak is very useful for a lot of things, but I really don't think it should be the default. It still has some weird issues. For example, if you run separate home and root partitions, Flatpak by default will install things into your root partition, which quickly fills up. You have to go in and do a bunch of work to get it to use the home partition (see the sketch below).
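For reference, a sketch of the usual workarounds (the paths and app ID are just examples):

  # Per-user installs land under $HOME instead of /var/lib/flatpak:
  flatpak install --user flathub org.mozilla.firefox

  # Or define a named system installation on another mount, e.g. in
  # /etc/flatpak/installations.d/home.conf:
  #   [Installation "home"]
  #   Path=/home/flatpak
  flatpak install --installation=home flathub org.mozilla.firefox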
Or, another example: issues with theming and cursors. It's a pretty common issue for flatpaks to not properly detect your cursor theme and to just use the default until you mess around with permissions and settings to fix it.
They also generally get updates more slowly. I guess maybe if it's adopted more that would change, but Flatpak is already pretty widely used and that's still an issue, especially for smaller programs not used by as many people.
Keeping it as something that's good for people who like a GUI experience and want something simple and easy is great. But if we were to start doing what Ubuntu does with snaps, where they'll just replace things you install with the snap version, then I'm not in favor of that at all.
I agree that Flatpak is not there yet. The API is limited, and it is also hard to package an app. But I really want to see it succeed.
-
I've never understood putting arbitrary limits on a worker's laptop. I was always looking for ways to bypass them. Once I ended up using a VM, without limit...
I mean, it sucks, but the stupid shit people will do with company laptops...
-
The diversity of Linux distributions is one of its strengths, but it can also be challenging for app and game development. Where do we need more standards? For example, package management, graphics APIs, or other aspects of the ecosystem? Would such increased standards encourage broader adoption of the Linux ecosystem by developers?
Actual native package management and package distribution
-
I don't use any of these, but I'm curious. Could you please write some examples?
It mostly affects people working with "fun" enterprise hardware or special-purpose things.
But to take one example: proprietary drivers for high-performance network cards, most likely from Nvidia.
-
The diversity of Linux distributions is one of its strengths, but it can also be challenging for app and game development. Where do we need more standards? For example, package management, graphics APIs, or other aspects of the ecosystem? Would such increased standards encourage broader adoption of the Linux ecosystem by developers?
1. Find something Lennart built, e.g. systemd.
2. Remove it.
3. Go to 1.