Updates (was Fedora 23 Final RC10 status is GO!)

Michael Catanzaro mcatanzaro at gnome.org
Tue Nov 3 16:41:16 UTC 2015


On Tue, 2015-11-03 at 02:13 +0100, Kevin Kofler wrote:
> Michael Catanzaro wrote:
> > Yeah, that's the clear disadvantage. The service pack approach
> > sidesteps that problem: everything still goes out, just not so
> > soon, so
> > everything spends plenty of time in testing. All the bugs still get
> > fixed, just not as fast.
> 
> And that's "better" HOW?

In order of importance:

1) More time to catch regressions
2) A reasonable way to QA a snapshot of what gets released
3) Less-frequent updates
4) Smaller total update size

(1) doesn't require the service pack approach and could be implemented
by just adding a longer waiting period to Bodhi. I'm of the mind that
all updates should spend two weeks in testing before they go stable,
regardless of karma, since otherwise how will people have time to
notice regressions? Currently updates tend to stay in testing for a
week or less, which isn't even enough time for GNOME Software to prompt
the user about new updates.
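To make the rule concrete, here's a minimal sketch of the gating logic I have in mind. All names are mine, not Bodhi's actual implementation: the point is only that the time requirement is absolute and karma never shortcuts it.

```python
from datetime import date, timedelta

# Hypothetical gating rule (not Bodhi's real logic): an update may go
# stable only after a fixed minimum time in updates-testing, no matter
# how much positive karma it has accumulated.
MIN_DAYS_IN_TESTING = 14

def can_go_stable(pushed_to_testing: date, today: date, karma: int) -> bool:
    """Karma alone never shortcuts the waiting period."""
    waited = today - pushed_to_testing >= timedelta(days=MIN_DAYS_IN_TESTING)
    return waited  # karma is deliberately ignored for the time gate

# A well-karma'd update pushed a week ago still has to wait:
print(can_go_stable(date(2015, 10, 27), date(2015, 11, 3), karma=5))  # False
# An update that has sat in testing for two full weeks may go out:
print(can_go_stable(date(2015, 10, 20), date(2015, 11, 3), karma=0))  # True
```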

(2) allows us to solve the problem of having no effective QA for our
updates. Maybe the current team is not big enough to QA service packs
as well as the next release, but at least with service packs there is
something that can reasonably be QAed. It also allows for regular
install media respins, which I think you want? That will make Linus
happy, too.

(3) is quite important, since in Workstation we reboot to install
updates, which is quite annoying. (No point in arguing about that here,
it's a done deal....) Here you would only have one reboot per service
pack, plus security updates.

(4) would be true if a package would otherwise have been updated
multiple times between service packs.

> > (This also solves the problem of maintainers releasing individually
> > -good updates too frequently.)
> 
> What problem? "too frequently" according to whom? I don't see any
> problem 
> here, and I think updates (especially bugfixes) cannot be frequent
> enough.

I will pick on LibreOffice. Individually, the updates rarely break
anything and do fix bugs, so taken alone they're good updates. But
collectively, it's problematic because there's a new update every week,
and LibreOffice is big: it takes a lot of time to download and install.
Does the inconvenience of making the update take longer offset the
value provided by the update? It's a function of how often you use
LibreOffice, how much the relevant bugs affect you, your download
bandwidth, and your hard drive speed. I rarely use LO, so weekly
updates are definitely not worth it for me: I'd rather have the updates
once a month, at most, but it's just an inconvenience. For the
maintainer, weekly updates are surely worth it, or they wouldn't be
happening. For most users, probably not, but I guess it depends on the
bug. For bandwidth-limited users, it's not merely an inconvenience but
a massive disaster. We currently lack criteria to say how frequent is
too frequent. I think 3-4 weeks between updates to a single package is
probably as fast as is ever appropriate, except in special
circumstances (maintainer discretion there), but weekly updates as a
rule are problematic, especially for such a large package.
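The trade-off above can be written down as a toy model. Everything here is made up by me for illustration (the weights, thresholds, and function names are not any real Fedora policy): an update is "worth it" for a given user when the benefit of the fixes outweighs the time cost of downloading and installing it.

```python
# Toy cost/benefit model for update cadence (illustrative only; the
# numbers and weights are invented, not a real Fedora policy).

def update_worth_it(download_mb: float,
                    bandwidth_mbps: float,
                    install_minutes: float,
                    uses_per_week: float,
                    bug_impact: float) -> bool:
    """bug_impact: rough 0..1 score for how much the fixed bugs hit this user."""
    # Cost: minutes spent downloading plus minutes spent installing.
    download_minutes = (download_mb * 8) / (bandwidth_mbps * 60)
    cost = download_minutes + install_minutes
    # Benefit: scales with how often the user runs the app and how bad
    # the fixed bugs are for them (10 "minutes saved" per weighted use).
    benefit = uses_per_week * bug_impact * 10
    return benefit > cost

# A weekly ~200 MB LibreOffice update, rarely used, minor fixes, slow link:
print(update_worth_it(200, 2, 5, uses_per_week=0.5, bug_impact=0.1))   # False
# The same update for a heavy user badly hit by the bugs, on a fast link:
print(update_worth_it(200, 50, 5, uses_per_week=10, bug_impact=0.8))   # True
```

The exact numbers don't matter; the point is that "worth it" depends on per-user variables the maintainer can't see, which is why a frequency cap is a reasonable default.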

I'd like to get to the point where a reboot to install weekly updates
takes about a minute. Currently, it's about five minutes per weekly
update. I recently made the mistake of installing all of texlive, so
that I wouldn't have to bang my head against a wall when trying to
figure out what Fedora package to install to get a particular texlive
package; that caused my weekly updates to take 2-3 hours apiece,
leaving my computer unusable for the rest of the evening whenever I
applied them. (texlive is a good example of another set of packages
that should be updated only rarely, even if the content of the updates
itself is fine.)

> > The counterargument is that we keep seeing major version updates
> > that
> > violate our existing updates policy.
> 
> This means the policy is broken, not the updates. I am glad that we
> are not 
> following that policy to the letter, that's the only reason the
> system works 
> at all! We need to encourage pushing new versions unless there is a
> good 
> reason not to, including, but not limited to:
> * incompatibility with existing data (including documents, config
> files,
>   savegames, etc.),
> * feature regressions (including deliberately removed features and
> features
>   missing from a rewrite),
> * major UI changes (but a menu item moving to some other place is
> harmless),
> * new bugs (known in advance or found during testing) unless
> outweighed by
>   the bugs that are fixed,
> etc. If none of the above hold, then why would we not let the users
> benefit 
> from the new features in the new version of the package? Upstream
> clearly 
> considers it stable or they would not have released it as such.

For most applications (not core system applications), major version
updates are fine, and the policy could be changed. For lower-level
system components, updates need to be balanced against regression
potential. Updates that just add patches can break things too, but
they're less likely to break things than version updates. A big version
update is more likely to break things than a small version update. Some
packages (e.g. web engines) really need to be updated to the latest
major version no matter what.

> > Who if not a neutral party charged with upholding that policy
> > should have
> > the final say? Some maintainers who clearly haven't read it?
> 
> I have read it. I just don't interpret it as being in contradiction
> with my 
> updates. See e.g. Routino, which only went from 2.7.3 to 3.0 because
> it 
> added a shared library (which in turn allows building new versions of
> applications such as qmapshack). The changes to the routing software
> itself 
> are minor and almost entirely bugfixes. It is also compatible with
> databases 
> from 2.7.3. (I NEVER push an update of Routino that is NOT database-
> compatible!) I don't see any reason why the update would be a
> problem.

Well, I think maintainer judgment comes into play here. If the version
number bump overstates the actual regression potential, the update
should be fine. But taking the time to explain that when requesting the
update is not the end of the world.

In my view, the neutral party would be charged only with stopping the
really suspicious updates, or with allowing them at a slower pace.

> > If we have another party approving updates, then it's the
> > maintainer's
> > job to write an argument in favor of releasing the update: a quick
> > summary of what the fix is and the regression potential. If the
> > update
> > gets rejected, the maintainer might really be wrong! and if not
> > would
> > have to try again to explain better. I think this would be good
> > regardless of whether or not we do updates packs.
> 
> I think this would just be added bureaucracy and a royal PITA. Bodhi
> is 
> already painful enough as it stands!
> 
> If you keep making it harder for packagers to do their job, you will
> find 
> yourself losing packagers rapidly.
> 
>         Kevin Kofler

Frankly, I think it's most important to make Bodhi work. I haven't been
able to search for updates since the new Bodhi interface went live,
except by manually editing the URL, which is crazy. Why is there a
search box if it doesn't work? (Turns out it works, but only in
Firefox. ;) Also, I couldn't figure out how to add builds to an update.
(Admittedly, that was next to impossible in the old web interface too.)
If not for fedpkg, I don't think I would even be able to release
updates.

Michael

