yum broken; headers directory missing

Cam camilo at mesias.co.uk
Sun Mar 28 20:17:28 UTC 2004


Michael

Thanks for your reply. One last comment...

>>* finally, generally inefficient behaviour (downloading headers one by 
>>one instead of having a compressed archive of headers)
> 
> 
> Yes and no.  Headers are already compressed.  The only advantage there
> would be doing a single download instead of many.  HTTP keepalive
> helps a lot because you only use one HTTP connection.  Also, what you
> propose would involve actually downloading MORE.  Yum doesn't keep
> headers for rpms you have installed, so you'd need to download a whole
> bunch of EXTRA headers.  Finally, what happens when only a few
> packages get downloaded?  Surely, you don't want to download the whole
> tarball again.  [perhaps you were only talking about the initial
> setup, though]

What happens is probably fair on the server, but from the client's point 
of view, if you have to fetch all the headers while the server is under 
load, you can see pauses between the downloads. If it's using HTTP 
keepalive then it's probably already doing a fair job under heavy load.
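To see why keepalive matters for lots of small header fetches, here's a
minimal sketch: a local http.server stands in for the repo mirror (purely
for illustration), and three requests go over a single TCP connection
instead of three.

```python
# Sketch: HTTP keep-alive lets many small header downloads share one TCP
# connection. The local server here is just a stand-in for a yum mirror.
import http.client
import http.server
import threading

class Handler(http.server.SimpleHTTPRequestHandler):
    protocol_version = "HTTP/1.1"   # HTTP/1.1 keeps the connection open
    def log_message(self, *args):   # keep the demo quiet
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
statuses, socket_ids = [], set()
for _ in range(3):                  # three "header" fetches
    conn.request("GET", "/")
    resp = conn.getresponse()
    resp.read()                     # drain the body so the socket is reusable
    statuses.append(resp.status)
    socket_ids.add(id(conn.sock))   # same socket object => same connection
conn.close()
server.shutdown()

print(statuses, "connections used:", len(socket_ids))
```

With `protocol_version` left at its HTTP/1.0 default the server would close
the connection after every response, which is the per-file overhead keepalive
avoids.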


> The next major version will almost certainly have better timeout
> control.
...
> Next version will also support REGET for that case.  You'll pick up
> where you left off.

Sounds good. I wonder, does the download.redhat.com server accept rsync 
connections from mere mortals? I'm tempted to sync the lot to a local 
server and update from that :)
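For what it's worth, the local-mirror idea can be sketched like this; the
rsync URL, module path, and destination are all hypothetical, so check the
mirror's actual module listing before running a real sync.

```python
# Sketch: assemble an rsync mirror command for a local yum source.
# The mirror host and module path below are assumptions, not real paths.
import pathlib
import shlex
import subprocess

mirror = "rsync://mirror.example.com/redhat-linux/9/en/os/i386/"  # hypothetical
dest = pathlib.Path.home() / "mirror" / "redhat-9"
dest.mkdir(parents=True, exist_ok=True)

# -a preserves permissions/timestamps, -v is verbose, --delete keeps the
# local copy an exact mirror (removes files the upstream has dropped).
cmd = ["rsync", "-av", "--delete", mirror, str(dest)]
print(shlex.join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually sync
```

Clients would then point their yum baseurl at the local copy instead of the
upstream server.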

-Cam




