Do I really need to explain this point? I think linking against the system
OpenSSL is *way better* than statically linking a random vendored copy of it.
There are maybe 100-120 libraries for which this is obviously the case: openssl,
glibc, glib2, zlib, libxml2, libcurl, the KDE libraries, etc. Core system libraries
that pretty much everything depends on. Dynamically linking such libraries has real
benefits.
For everything else though? No, not so much.
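To put a number on one of those benefits: on a typical Linux system you can count how many installed programs dynamically link the system libssl, every one of which is patched by a single OpenSSL update. A rough sketch, assuming ldd is available and using /usr/bin as an illustrative path:

```shell
# Count binaries in /usr/bin that dynamically link libssl.
# One update to the system OpenSSL package patches all of them at
# once; statically linked copies would each need their own rebuild.
count=0
for bin in /usr/bin/*; do
  # ldd fails on scripts and non-ELF files; ignore those quietly.
  if ldd "$bin" 2>/dev/null | grep -q 'libssl'; then
    count=$((count + 1))
  fi
done
echo "binaries linking system libssl: $count"
```

The exact count varies by system, but the point is that the fix fans out automatically to every consumer.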
I feel like there is insufficient recognition of the extent to which C libraries do
"bundling". Not "bundling" in the sense of vendoring a whole library, but in the
sense of including one-off implementations of basic data structures, configuration
parsers, hashing algorithms, and so on. I would love to hear anyone argue that
100 different variations of "sha256.c" across 100 different packages better
follow the spirit of the "no bundling" guidelines than a single vendored crate named
"sha256" with 100x as many eyes on it and a much higher likelihood of actually
being updated when a problem is found.
Many of the tiny, "sprawling" Rust dependencies are like this - not all of them
of course, but many.
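For illustration, the Rust side of that trade is a one-line manifest entry rather than a copied source file. A minimal Cargo.toml sketch (the sha2 crate is a real RustCrypto implementation used here as an example; the version number is illustrative):

```toml
# Cargo.toml: declare the shared, widely reviewed implementation
# as a dependency instead of carrying a private sha256.c.
[dependencies]
sha2 = "0.10"   # SHA-2 family; version pin is illustrative
```

Every crate that declares this dependency picks up fixes when the shared crate is updated, which is exactly the property the "no bundling" guidelines are after.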
Torvalds has similar feelings:
https://lore.kernel.org/lkml/CAHk-=whs8QZf3YnifdLv57+FhBi5_WeNTG1B-suOES=...