Yum and partial downloads.

Robin Laing MeSat at TelusPlanet.net
Fri May 8 04:03:41 UTC 2015


On 2015-04-23 08:27, stan wrote:
> On Wed, 22 Apr 2015 23:38:35 -0600
> Robin Laing <MeSat at TelusPlanet.net> wrote:
>
>> In one case, the file was at 99% complete when it stopped.  The
>> download then restarted on a different mirror at 0%.
>>
>> Due to firewall rules there is bandwidth management: downloads start
>> at high speed, then slow down after 25 MB.
>
> You could game the bandwidth management with wget or curl, using their
> continuation capability.  When the throttling starts, stop the program,
> wait a few seconds, and then restart with the continuation option.
> That way, you download the large file in several 25 MB chunks at high
> speed. Only useful for occasional large files, not in general, though
> you could script it.  Put the downloaded file in the local yum cache
> (under /var/cache/yum somewhere) where yum will find it when it tries
> to update, or install it directly using the -C (cache-only) option.
>
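
That is worth trying for the occasional large package.  A rough sketch of
what I have in mind is below; the URL is just a placeholder, and the
30-second cutoff and 10-second pause are guesses until I see exactly when
the throttling starts:

    #!/bin/bash
    # Download in bursts: cut wget off before the throttle bites, wait a
    # moment, then resume the partial file with -c.  Loop until wget
    # finishes cleanly (exit status 0).
    url="http://example-mirror/fedora/path/to/large-package.rpm"

    until timeout 30 wget -c "$url"; do
        sleep 10
    done

curl -C - -O "$url" in place of the wget line should behave much the same
if wget is not handy.
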
>> The Question.
>>
>> The network admin asked whether Fedora, like Ubuntu, will continue
>> downloading where it left off.  I said I doubted it, as the displayed
>> progress kept going back to zero.
>
> This has been my experience as well.  Internally, I think yum uses
> python-urlgrabber to fetch the packages, and it seems to have no
> memory of past attempts.
>
> There is probably a reason this behavior was adopted, but I don't know
> it.
>
> You might be able to use the throttle and bandwidth options to adjust
> for your bandwidth management.  See man yum.conf.
>

Thank you.

I have started to look at the throttle and bandwidth options in yum.conf.
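
Going by man yum.conf, something like this in the [main] section should
keep yum under the rate that triggers the firewall's bandwidth
management; the numbers are only a first guess until I measure what the
link actually sustains:

    [main]
    # Total bandwidth assumed to be available, and the share yum may use.
    # throttle can also be an absolute rate such as 200k.
    bandwidth=1M
    throttle=80%

If that is not enough, the wget loop above is the fallback for the
occasional very large package.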

Robin


