On Wed, May 13, 2020 at 08:16:06AM -0400, Stephen John Smoogen wrote:
On Wed, 13 May 2020 at 03:19, Florian Weimer
> * Stephen John Smoogen:
> > No, because the way backups and rsync work here is slow. We can back
> > up the look-aside cache of tarballs in a couple of hours, and rsync
> > it in the same amount of time. It takes that long or longer to do the
> > same for a couple of git trees, which are much smaller in total size
> > but far larger in file count. Every file in a git tree is stat'd, and
> > while there is some deduplication, there are a lot of files.
> I think there's a logic bug somewhere. 8-)
I think the logic bug is assuming people will regularly repack and
garbage-collect their git repositories. I have found that this is a
rarity, and trying to enforce it ends in maintainer complaints that
you messed with THEIR way of doing things. Assume that no one will do
so until a crisis finally forces people to give in to it.
Then also assume that developers will come up with multiple ways to
branch/side-branch/fork (sometimes within the same project), which
ends up leaving tons of loose objects that are not deduplicated. That
is the reality of what I have seen in all our source repositories in
the past. Then also realize that for copyright and other legal reasons
we cannot delete code once it has been committed (or at least once it
has been built against, which a package build will do for you right
away) .. so this is always going to grow.
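A cheap way to see how many non-deduplicated loose objects a given
repo has piled up is a sketch like this (run inside any clone; none of
the numbers come from the Fedora servers):

```shell
# Show loose vs packed object counts for the current repository.
# "count" is the number of loose objects and "size" their disk usage
# in KiB; "in-pack" objects have already been deduplicated into packs.
git count-objects -v
```

A repo that has never been gc'd will show a large "count" relative to
"in-pack", and every one of those loose objects is a separate file the
backup has to stat.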
Some random facts:
There are currently 2668 glibc files in the lookaside.
There are currently 6733 files in the glibc.git git repo.
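Counts like the two above can be gathered with something along these
lines (the paths are purely illustrative, not the real server layout):

```shell
# Count the files each backup pass has to stat: a handful of large
# tarballs in the lookaside vs many small files under a git repo
# (mostly .git/objects). Paths below are assumptions for illustration.
find /srv/lookaside/glibc -type f | wc -l
find /srv/git/glibc.git -type f | wc -l
```

The byte totals favour git, but the per-file metadata work favours the
tarballs, which is why the smaller trees take longer to back up.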
Perhaps we should run regular 'git gc' over the repos there to move
objects into pack files, but we don't currently.
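A minimal sketch of what such a regular pass could look like (the
/srv/git path and bare *.git layout are assumptions, not the actual
server setup):

```shell
# For every bare repo under an assumed path, repack loose objects into
# pack files and prune unreachable ones past the default grace period.
for repo in /srv/git/*.git; do
    git -C "$repo" gc --quiet
done
```

Run from cron, this would keep the loose-object count (and hence the
backup stat cost) bounded without requiring maintainers to do anything.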
In any case, the usual problem here is scale. ~10k files isn't that
much, but multiplied by 30,000 packages it's a lot.