[Proposal] Ring-based Packaging Policies

Kevin Kofler kevin.kofler at chello.at
Mon Feb 16 16:03:51 UTC 2015

Stephen Gallagher wrote:
> tl;dr Shall we consider requiring a lesser package review for packages
> that are not present on Product or Spin install media?

TL;DR: No, at least not in the form you propose (allowing bundled 
libraries). See also my counterproposal below (voiced already in the oral 
discussion at DevConf, now written down for posterity).

> * The no-bundled-libraries policy means that when a library module
> requires an update, only one package needs to be modified in order to
> enhance all applications on the system that consume it. This is a
> significant time-saver when it comes to dealing with (increasingly
> common) security vulnerabilities.

Indeed, this helps both security fixes and regular bug fixes.

Another advantage of the policy is to save both disk space and RAM by 
avoiding duplication.

> * Legal review and the FPCA ensures that Fedora remains true to its
> "Freedom" Foundation (as well as protecting Fedora contributors from the
> perils of the international legal system).

And this part must not be negotiable.

> * Package reviews for less-interesting packages (such as those for less
> popular SIGs) often remain un-reviewed for weeks, months or even years.

See my counterproposal below.

> * The package policies are only ever reviewed during the initial
> creation of the package. Once that initial (high) hurdle is cleared, the
> packager essentially has free rein to do whatever they want with their
> package. This sometimes means that as time passes, the spec files
> "bit-rot" or otherwise start accumulating other inconsistencies. (A
> common example is the package whose upstream starts bundling a library
> without the packager noticing).

Surely the answer to that cannot be to simply allow poor quality from day 
one.

> * Many upstream projects do not concern themselves with being "in" any
> particular distribution (with the notable example being the
> Debian/Ubuntu flavors which have amassed a sufficient apparent userbase
> that they sometimes get special treatment). For a variety of reasons,
> this often leads directly to bundling the vast majority of their
> dependencies. This is done for many reasons, but the two most common are
> supportability and portability; it's impossible for many upstreams to
> actually QA their package with every possible distro library. Instead,
> if they ship everything they depend on, they can guarantee *that*
> specific combination. This moves the responsibility from the
> distribution to the upstream package to maintain their bundled
> libraries.

And that is not practical, also because it then becomes OUR responsibility 
to pick up each new application release that updates its bundled libraries. 
This just does not scale. Without bundled libraries, when we get notified of 
a security issue, we have to update the library and be done with it. With 
bundled libraries, we have to do one of the following processes:
a) working with upstream:
   for each package bundling the library {
     1. forward the security notification to its upstream
     2. wait for upstream to update the bundled library. We have no SLA
        whatsoever there, so this can theoretically take forever!
     3. pick up the new upstream release with the new bundled library
     4. push that as an update to our users, with the required testing for a
        new upstream release (typically longer than for a simple CVE fix)
   }
   Not only is this O(n) instead of O(1), but it also takes way too long for
   any security issue of sufficient importance.
b) bypassing upstream:
   for each package bundling the library {
     1. patch the bundled library ourselves
     2. push the updated application
   }
   Now this moves the delay back into our area of responsibility, so it can
   be done faster (eliminating the wait for upstream), but it is still O(n)
   instead of O(1) as in the unbundled case. In addition, this solution also
   has the usual problems of downstream patches. In particular, once the
   official upstream patch comes out, we need to drop our downstream patch
   and then run ANOTHER round of testing with the new upstream release.
So the only reasonable solution is to unbundle all the libraries, no matter 
what upstream thinks of that.
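The scaling argument above can be sketched numerically. This is a hedged illustration only: the function name `updates_to_ship_fix` and the package count of 40 are hypothetical, chosen purely to show the O(1)-vs-O(n) difference, not drawn from real Fedora data.

```python
# Hypothetical sketch of the O(1)-vs-O(n) argument above.
def updates_to_ship_fix(consumers: int, bundled: bool) -> int:
    """Number of package updates needed to deliver one library security fix."""
    # Unbundled: patch the single shared library package once, done.
    # Bundled: every package carrying its own copy must be updated.
    return consumers if bundled else 1

print(updates_to_ship_fix(consumers=40, bundled=False))  # 1  -> O(1)
print(updates_to_ship_fix(consumers=40, bundled=True))   # 40 -> O(n)
```

In practice, Fedora's packaging guidelines ask packages that do bundle a library to declare a virtual `Provides: bundled(<libname>)`, which at least makes the O(n) set of affected packages enumerable (e.g. with `dnf repoquery --whatprovides 'bundled(zlib)'`).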

> * With many languages, there is no possibility of installing multiple
> versions of the same library on the same system, so if an application
> requires a newer or older version of the library than is in Fedora to
> run, it fails.

In this case, it is the packager's job to fix the program to work with the 
version of the library in the distribution. That's what we distributors are 
there for.

I am also not convinced that this bundling issue is the main blocker for 
package reviews to begin with.

So, for my counterproposal:
I propose that packagers with a sufficient level of trust (packager 
sponsors, provenpackagers, or a new, yet-to-be-defined group (maybe 
packagers with at least N packages)) be allowed to import new packages with 
a self-review. We trust those people for so many things, and we know that 
they understand the packaging guidelines, so why can we not trust them to 
import their own packages without blocking on somebody else? Here are just 
two examples of packages that have been sitting in the review queue for 
months and would have gone in instantly under my proposed policy:
The submitter has been a packager sponsor and provenpackager for years (and 
even several of the people he sponsored are now also packager sponsors 
and/or provenpackagers), so why do we need to waste our time reviewing his 
packages when it's clear that he knows what he's doing?

        Kevin Kofler
