There was a time when this debate was bigger. The world seems to have shifted toward architectures and tooling that make dynamic linking harder or disallow it entirely. That compromise makes life easier for the maintainers of the tools and languages, but it takes choice away from the user or developer. But maybe that’s not important? What are your thoughts?
Shared libraries save RAM.
Dynamic linking allows working around problematic libraries, or even adding functionality, if the app developer can’t or won’t.
Static linking makes sense sometimes, but not all the time.
Citation needed :) I was surprised, but I read (sorry, I can’t find the source again) that in most cases a dynamically linked library is loaded by only one process, and usually by very few. That makes the RAM savings much less obvious. In addition, static linking allows inlining, which in turn enables aggressive constant propagation and dead code elimination, on top of LTO. All of this can decrease binary size, sometimes in non-negligible ways.
That is easily disproved on my system by
cat /proc/*/maps
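To make the check concrete: on Linux you can count how many running processes currently map the C library. The library name and path vary by distro, so take the pattern as a sketch:

```shell
# Count processes whose address space maps some libc (pattern varies by distro)
grep -l 'libc' /proc/[0-9]*/maps 2>/dev/null | wc -l
```

On a typical desktop system this prints a large number, which is the sense in which one shared copy of the library serves many processes.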
Someone found the link to the article I was thinking about.
Ah, yes, I think I read Drew’s post a few years ago. The message I take away from it is not that dynamic linking is without benefits, but merely that static linking isn’t the end of the world (on systems like his).
Not exactly, shared libraries save cache.
Does this apply if the app is open source?
In practical terms, often yes. It can be easier to just
LD_PRELOAD
something than to maintain your own patched version of an RPM / APT package, for example.