Axel Thimm wrote:
On Fri, Sep 19, 2008 at 11:49:15AM -0700, Toshio Kuratomi wrote:
> Axel Thimm wrote:
>> On Fri, Sep 19, 2008 at 09:59:42AM +0200, Ralf Corsepius wrote:
>>>> I'd ask the question differently: If upstream saw itself forced to
>>>> duplicate/fork a lib how can you help making the forked bits back into
>>>> the original upstream.
>>> Then let me answer with my "developer's hat" on:
>>>
>>> The reason why such "forks" exist is often disagreement between
>>> "upstream" and "fork" devs over APIs/usage and the like.
>>>
>>> Upstreams often accuse "fork devs" of abusing their libraries/APIs
>>> (e.g. using undocumented internals). Conversely, "fork devs" often
>>> accuse "upstreams" of not listening to "users' demands".
>>>
>>> I.e. in many cases such conflicts will hardly be resolvable.
>> Yes, and sometimes, as in the case of ffmpeg and minilzo, upstream
>> even wants you to take a snapshot, i.e. the "forking" has their
>> consent.
>>
> Upstream is wrong. The consequences of their actions don't come back to
> haunt them, though, they come back to haunt us. If there's a security
> hole in their library snapshot, their answer can be "we fixed that two
> months ago. Why didn't you update?" Our answer would have to be: "We
> have an unfixed vulnerability in gnome-foo-player, toms-foo-player, and
> foo-plus-player. We are not able to fix these until we perform a port
> to an incompatible ffmpeg snapshot for the maintainers of the affected
> software."
Or backport to the versions needed. That's what RHEL does. And you now
have the choice to
o either not package it
This one's not true. We've already released it. If we remove
gnome-foo-player, toms-foo-player, and foo-plus-player from the
repository, all we're doing is keeping future people from getting the
software. We aren't helping the people who have already received the
software.
o package it as a shared external lib in its own package and have the
same problem.
Using "system libraries" just means blessing one version of the
upstream project. Which if you want to follow security issues would
need to be the latest and greatest. Which means that projects that
have cut an older version of it will not run/build anymore.
You didn't address the benefits that I outlined below:
> 1) If the library is statically linked in the application it's harder
> to find in a quick audit.
>
> 2) If the flaw affects a range of library versions (for instance,
> 899.x.y is unaffected), having them be packaged separately makes finding
> the affected versions and fixing them easier than finding out which
> versions of the private library each app has.
So in the end we just don't ship them, as we do for x11vnc. Is that
the preferred outcome?
It might be. If we don't have the manpower to keep things secure then
we have no business inflicting them upon our users.
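As an aside, here is a rough sketch of what such a "quick audit" for
bundled copies can look like. The directory and version string are made
up for illustration; the point is that a statically linked copy leaves
no shared-object dependency for ldd to report, so scanning binaries for
embedded version banners is one of the few cheap detection methods:

```shell
# Hypothetical quick audit: scan binaries for an embedded library
# version banner. A statically linked ffmpeg shows no libavcodec.so
# in ldd output, so string-scanning is a cheap first-pass check.
# (The directory and the version string are illustrative only.)
mkdir -p /tmp/audit-demo
printf 'ELF...FFmpeg version 0.4.9...' > /tmp/audit-demo/fake-player
for bin in /tmp/audit-demo/*; do
    if grep -aq 'FFmpeg version' "$bin"; then
        echo "possible bundled ffmpeg: $bin"
    fi
done
# prints: possible bundled ffmpeg: /tmp/audit-demo/fake-player
```

Of course a real audit would also need to catch copies built without a
version banner, which is exactly why bundling makes this hard.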
>>>>> Just to quote one such example: ffmpeg is a fast-moving target, and
>>>>> any project depending on the lib API is cutting a checkout, patching
>>>>> it up and using it for its own purposes. Replacing these internal
>>>>> ffmpegs with a system ffmpeg is a nightmare or even impossible w/o
>>>>> rewriting the app interface to it. Given that ffmpeg and friends fall
>>>>> under the patent-forbidden class we don't see that directly in Fedora,
>>>>> but this issue is still out there.
>>>> Well, ffmpeg is a special case wrt. many issues. If they were doing a
>>>> proper job, they would release properly versioned packages with properly
>>>> versioned APIs, which could be installed in parallel.
>>> Even if so, would Fedora package 20 different versions of ffmpeg to
>>> satisfy the 20 different consumers? There wouldn't be any benefits to
>>> an internal lib either: If there is a security flaw fixed in ffmpeg
>>> 1004.0.1 the versions 900.x.y to 1003.x.y would be just as insecure as
>>> external packages.
>>>
>> I think there is benefit:
> 1) If the library is statically linked in the application it's harder
> to find in a quick audit.
>
> 2) If the flaw affects a range of library versions (for instance,
> 899.x.y is unaffected), having them be packaged separately makes finding
> the affected versions and fixing them easier than finding out which
> versions of the private library each app has.
>>
>> Also, one of the goals of a distribution is to make programs and
>> libraries work together. So the approach we'd likely want to take is
>> choosing a few versions of the library that we could support (probably
>> in conjunction with other distros) and then porting the apps to one of
>> those library versions and getting upstreams to update their private
>> libs to those versions since we were willing to do the port for them.
>
> Well, be my guest. Many have attempted to do so in some cases, but if
> upstream is *fast*, you will never catch up, and different
> distributions will choose different snapshots depending on the consumer
> apps they target, so coordinating will be more than difficult.
>
> But in principle I agree: If we could afford the manpower to keep all
> consumers compatible with the latest stable release of a library that
> would be the ideal world. But we don't have that manpower, which is
> not just packaging, but true upstream work.
>
> A reality check shows that we even lack the definition of "latest
> stable upstream release" in some cases like ffmpeg.
>
Re-read what I wrote. I'm saying "choosing a few versions of the
library that we could support (probably in conjunction with other
distros) and then porting the apps to one of those library versions".
If there's a set of packages that we care enough about that use a
library which can't commit to an API for more than a month then we and
other distros have to care about the situation and fix it between
ourselves. And yes, this is programming that I'm talking about, not
just packaging.
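For reference, the "properly versioned APIs installed in parallel" idea
Ralf raised earlier comes down to soname versioning: each consumer links
against a specific major version, so several versions can coexist on one
system. A minimal sketch, with entirely hypothetical filenames (real
ffmpeg ships libavcodec, libavformat, etc.):

```shell
# Sketch of parallel-installable versioned libraries via sonames.
# Each app links against the soname it was built for (libffmpeg.so.51
# vs libffmpeg.so.52), so both major versions live side by side and
# the runtime linker picks the right one per binary.
# (Filenames are hypothetical, for illustration only.)
mkdir -p /tmp/soname-demo && cd /tmp/soname-demo
touch libffmpeg.so.51.0.0 libffmpeg.so.52.0.0
ln -sf libffmpeg.so.51.0.0 libffmpeg.so.51   # for older consumers
ln -sf libffmpeg.so.52.0.0 libffmpeg.so.52   # for newer consumers
ls -1
```

If upstream bumped the soname on every incompatible change, a distro
could ship two or three such versions instead of twenty private copies.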
>>>> I'd recommend softening this guideline to something like "the packager
>>>> should try hard to use system libs, and try to communicate the
>>>> benefits to upstream, but if there are reasons not to use system libs,
>>>> then he should document this in the specfile".
>>> I am not sure this is a good idea. I'd rather be stricter on this,
>>> because this would force devs to think about what they are doing and
>>> packagers think about the quality of what they are packaging.
>>>
>>> The unpleasant truth is: If a package bundles "modified libs/apps" from
>>> elsewhere, which can't be installed in parallel to the original
>>> libs/apps, this package's devs are doing something wrong.
>> The truth is that sometimes upstream makes some decisions; we can try
>> to forward our position, but upstream may ignore us or may have
>> a good reason to counter our position (for ffmpeg and minilzo I believe
>> upstream is correct and we should be the ones revising the guidelines).
>>
>> That's why I suggest that the packager tries hard to do it in the
>> typical shared way, but, if there are sane reasons to use internal
>> libs, doesn't let the package dry out forever in some review queue
>> like x11vnc did.
> I am against this approach. But perhaps you'll have a useful counter to
> the points I raised.
The points you raised are nice in an ideal, over-manpowered world, but
in reality we see packages stalling in review queues forever instead.
Anyway, these were my $0.02. I'm not really seeing anyone from the FPC
favouring this, even less anyone willing to pick this up and drive it
through, so I'll rather stick to agreeing to disagree. :)
heh. Okay. :-)
-Toshio