fighting with xargs :-(
Les Mikesell
lesmikesell at gmail.com
Sun Jun 25 22:52:39 UTC 2006
On Sun, 2006-06-25 at 15:57, Don Russell wrote:
> Not FC5 specific...
>
> I am trying to use a (simple) command to gzip a bunch of files in a
> given directory... BUT I want to keep the original files too. gzip does
> not seem to have an option to keep the original file. In some cases, the
> file names contain blanks and/or $ characters...
>
> I want to gzip each file individually, not combine several files into
> one .gz file.
>
> So, I thought some form of ls and xargs would do the trick and I would
> be done in 5 minutes. :-)
>
> ls -1 *[^.gz] | xargs -r -I {fn} gzip -c {fn} > {fn}.gz
>
> (hours pass, reams of reading later...)
>
> I added the -p option to xargs so I could see what it is actually doing
> (vs what I think it should do) and see that the command actually stops at
> the >. The > {fn}.gz isn't part of the command created by xargs...
>
> Try again, escaping the > ...
> ls -1 *[^.gz] | xargs -rp -I {fn} gzip -c {fn} \> {fn}.gz
>
> The command looks good... but it does not work... it seems I get the
> compressed stuff displayed on my screen instead of being written to the
> file. Isn't the > the correct redirection character so stdout goes to
> the specified file?
>
> I usually end up with a file called {fn}.gz somewhere along the line.
>
> This has got to be simple... what am I missing? :-)
>
> Any suggestions?
Let the shell do the heavy lifting:
for file in *[^.gz]
do
    gzip -c "$file" > "$file.gz"
done
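If you really want to stick with xargs, the catch is that xargs never
interprets the > itself; the redirection has to be done by a shell that
runs once per file. A rough sketch (using find -print0 and xargs -0 so
blanks and $ in the names survive) would be something like:

find . -maxdepth 1 -type f ! -name '*.gz' -print0 |
    xargs -0 -r -I{} sh -c 'gzip -c "$1" > "$1.gz"' _ {}

Here it is sh, not xargs, that sees the >, so each file gets its own
.gz file.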
But, it is generally easier to deal with things when you
separate them into subdirectories instead of having to
wild-card the filenames. Is there some reason to keep
the compressed and uncompressed versions in the same
directory?
--
Les Mikesell
lesmikesell at gmail.com