Finding Duplicate Files

Patrick O'Callaghan pocallaghan at gmail.com
Thu Mar 13 23:43:34 UTC 2008


On Thu, 2008-03-13 at 21:25 +0000, Jonathan Roberts wrote:
> Hey all,
> 
> I've got into a bit of a muddle with my backups...more than a little in fact!
> 
> I have several folders, each approx 10-20 GB in size. Each has some
> unique material and some duplicate material, and it's even possible
> there's duplicate material in sub-folders too. How can I consolidate
> all of this into a single folder so that I can easily move the backup
> onto different media, and get back some disk space!?

find top-directory -type f -print0 | xargs -0 sha1sum | sort

This will take a while (!), but you'll get a list of all the files
sorted by SHA1 checksum, so duplicates end up on adjacent lines (the
-print0/-0 pair keeps file names with spaces intact). You can assume
that two files with the same checksum have the same content (or use cmp
if you're paranoid). If you want to get fancy, use the checksum as an
associative array index to group the filenames in buckets (left as an
exercise for the reader :-)
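
One way to do that exercise is with awk, whose arrays are exactly the
associative arrays meant above. A minimal sketch, assuming GNU find,
xargs, sha1sum and awk, with top-directory again standing in for your
backup root; it prints only checksums that occur more than once,
followed by the files that share them:

find top-directory -type f -print0 | xargs -0 sha1sum |
  awk '{ sum = $1; file = substr($0, 43)   # 40 hex chars + two spaces
         files[sum] = files[sum] "\n    " file
         count[sum]++ }
   END { for (s in count) if (count[s] > 1) print s files[s] }'

Everything after column 42 of each sha1sum line is taken as the file
name, so names with spaces are fine; names containing newlines would
still need special handling, as with any line-oriented approach.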

poc



