[FZH] Fwd: [announce] yum: parallel downloading

Liang Suilong liangsuilong at gmail.com
Sat May 19 16:57:47 UTC 2012

Yum can finally download in parallel, too...

It also works on older versions of Fedora.

It seems there are still a few bugs, so try it if you're feeling lucky.

Sent From My Heart
My Page: http://www.liangsuilong.info

---------- Forwarded message ----------
From: Zdenek Pavlas <zpavlas at redhat.com>
Date: Thu, May 17, 2012 at 12:07 AM
Subject: [announce] yum: parallel downloading
To: devel at lists.fedoraproject.org


New yum and urlgrabber packages have just hit Rawhide.  These releases
include some new features, including parallel downloading of packages and
metadata, and new mirror selection code.  As we plan to include these
features in RHEL7, I welcome any feedback or bug reports!

python-urlgrabber-3.9.1-12.fc18 supports a new API to urlgrab() files in
parallel, and yum-3.4.3-26.fc18 can use it.  Both packages are compatible
with older versions.
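The general idea can be sketched in plain Python.  This is an illustration of
concurrency-limited parallel fetching, not urlgrabber's actual API; fetch() is
a hypothetical stand-in for a real HTTP download:

```python
# Sketch of the idea behind parallel downloading (NOT urlgrabber's real
# API): fetch several files concurrently, capped at a connection limit.
from concurrent.futures import ThreadPoolExecutor

MAX_CONNECTIONS = 5  # matches the default limit mentioned below

def fetch(url):
    # hypothetical stand-in for a real HTTP transfer
    return "data from %s" % url

def download_all(urls, limit=MAX_CONNECTIONS):
    # at most `limit` downloads run at the same time; results keep
    # the input order
    with ThreadPoolExecutor(max_workers=limit) as pool:
        return list(pool.map(fetch, urls))
```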

Feature list:

- parallel downloading of packages and metadata

If possible, multiple files are downloaded in parallel.  (see below for the
limitations that apply)

- configurable 'max_connections' limit in yum.conf

This is the maximum number of simultaneous connections Yum makes.  Its
purpose is to limit local resource usage (the number of processes forked).
The default is urlgrabber's default value of 5.
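Assuming the usual yum.conf layout, the option would be set in the [main]
section, e.g.:

```ini
[main]
# cap the number of simultaneous downloader connections
max_connections=5
```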

- mirror limits are honored, too.

Making many connections to the same mirror usually does not help much; it
just consumes more resources.  That's why Yum also honors the mirror limits
from metalink.xml.  If no such limit is available, at most 3 simultaneous
connections are made to any single mirror.

- new mirror selection algorithm

The real download speed is calculated after each download, and the
statistics are updated.  These are in turn used when selecting mirrors for
further downloads.  This should be more accurate than measuring latencies in
the fastestmirror plugin, but slow mirrors now have to be tried from time to
time, and the statistics need some time to build up.
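A minimal sketch of this kind of selection, under my own assumptions (a
smoothed per-mirror throughput estimate plus an occasional random retry so
slow mirrors get re-measured); this is illustrative, not yum's code:

```python
# Speed-based mirror selection: keep an exponentially weighted average of
# observed throughput per mirror, usually pick the fastest, and sometimes
# try a random mirror so its statistics can recover.
import random

speeds = {}  # mirror -> smoothed bytes/sec

def update_speed(mirror, measured, alpha=0.3):
    # exponential moving average; first sample seeds the estimate
    old = speeds.get(mirror, measured)
    speeds[mirror] = (1 - alpha) * old + alpha * measured

def pick_mirror(mirrors, explore=0.1):
    # with small probability, probe a random mirror to refresh its stats
    if random.random() < explore or not speeds:
        return random.choice(mirrors)
    return max(mirrors, key=lambda m: speeds.get(m, 0.0))
```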

- ctrl-c handling

This is a long-standing problem in Yum.  Due to various shortcomings in rpm
and curl, it is impossible to react immediately to SIGINT.  But now the
downloader runs in a separate process, so we can exit even if curl is still
stuck.  The "skip to next mirror" feature is gone (we don't want to restart
all currently running downloads).
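The separate-process idea can be sketched with the standard multiprocessing
module (illustrative only, not yum's implementation): the transfer runs in a
child, so the parent can catch KeyboardInterrupt and kill the child even
while it is blocked.

```python
# Run the (possibly stuck) transfer in a child process so the parent
# stays responsive to ctrl-c.
import multiprocessing
import time

def downloader():
    # stands in for a transfer that may be stuck inside curl
    time.sleep(60)

def interrupt_safe_download():
    child = multiprocessing.Process(target=downloader)
    child.start()
    try:
        child.join()
    except KeyboardInterrupt:
        child.terminate()  # kill the stuck transfer and exit cleanly
        child.join()
    return child.exitcode
```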

Known limitations:

- metalink.xml and repomd.xml downloads are not parallelized yet.

Zdeněk Pavlas