On Sat, Dec 01, 2007 at 02:03:06PM -0500, Mark Nielsen wrote:
I think you'll get the same sort of thing if you gzip the file. I've
gzip'd some 20G Xen images down to around 2G.
AFAIR, this is just because sparse files appear as long sequences of
zeros & thus compress well - gzip isn't actually optimizing for sparseness.
The trouble is that when you extract the file, gunzip will fully allocate
it, filling the holes with zeros.
My original 1M file takes only 4K on disk:
# ls -lhs foo
4.0K -rw-r--r-- 1 root root 1.0M 2007-12-01 14:18 foo
# gzip foo
# ls -lhs foo.gz
8.0K -rw-r--r-- 1 root root 1.1K 2007-12-01 14:18 foo.gz
Now when I uncompress it, it takes up the full 1 MB :-(
# gunzip foo.gz
# ls -lhs foo
1.1M -rw-r--r-- 1 root root 1.0M 2007-12-01 14:18 foo
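(As an aside, not in the original mail: if you've already ended up with a
fully allocated file like this, GNU cp can re-sparsify it after the fact by
scanning for runs of zeros - a sketch, assuming GNU coreutils:)

```shell
# Rewrite the file sparsely: cp scans for zero runs and punches holes
cp --sparse=always foo foo.sparse
mv foo.sparse foo
ls -lhs foo    # block usage drops back down; apparent size stays 1.0M
```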
You *always* want to use tar with its --sparse (-S) option to preserve
sparseness, and then gzip the tar file, so you get optimal resource usage
both on archiving & extracting.
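A sketch of that workflow (assuming GNU tar and GNU coreutils; `truncate`
just stands in for any sparse file):

```shell
# Create a 1M sparse file: 1.0M apparent size, ~0 blocks allocated
truncate -s 1M foo
ls -lhs foo

# -S/--sparse makes tar record the holes instead of storing literal zeros
tar -czSf foo.tar.gz foo

# On extraction tar recreates the holes, so the file stays sparse on disk
rm foo
tar -xzf foo.tar.gz
ls -lhs foo
```

Without -S at archive-creation time, tar stores the zeros verbatim and
extraction fully allocates the file, just like gunzip does.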
Dan.
--
|=- Red Hat, Engineering, Emerging Technologies, Boston. +1 978 392 2496 -=|
|=- Perl modules: http://search.cpan.org/~danberr/ -=|
|=- Projects: http://freshmeat.net/~danielpb/ -=|
|=- GnuPG: 7D3B9505 F3C9 553F A1DA 4AC2 5648 23C1 B3DF F742 7D3B 9505 -=|