I'm not a fan of trash directories. They give you a second chance, but you may end up not really deleting things you wanted to delete. What about a simple script, something like this?
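Just as a sketch of the idea (the ~/.trash location and the timestamp suffix are arbitrary placeholder choices):

    #!/bin/sh
    # toy "safe rm": move the named files into ~/.trash instead of deleting them
    trash="$HOME/.trash"
    mkdir -p "$trash"
    for f in "$@"; do
        # append a timestamp so repeated deletions of the same name don't collide
        mv -- "$f" "$trash/$(basename -- "$f").$(date +%Y%m%d%H%M%S)"
    done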
If you want something a bit more polished, you might check out "nrm", available at http://www.omnisterra.com/walker/linux/nrm.html
It's a C program written to have the same arguments, return codes and side effects as /bin/rm to the extent possible.
It moves files into the hidden sub-directory ".gone" instead of removing them. You can remove files with "-s" to get sequenced backups: all files are saved with a time-stamp suffix for fine-grained file restores. There is a program, "urm", to unremove a file, but most users just do "ls .gone" to see what they've recently removed and then "mv" it back to their working directory.
There is an associated cron job that runs every day to permanently remove all deleted files that are older than a configurable age. The default is 3 days so you can get things back on Monday that you nuked on Friday.
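The purge is roughly equivalent to the following (illustrative only; the real job takes the age and the starting directory from its configuration):

    # run once a day from cron: permanently delete anything parked in a
    # .gone directory that is more than 3 days old
    find "$HOME" -type d -name .gone -exec find {} -type f -mtime +3 -delete \;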
It's been running on HP-UX since about 1987, and on Linux since 1997. I'd consider it pretty solid at this point.
kind regards,
-- Rick Walker
On Thu, Mar 28, 2013 at 5:58 PM, Rick Walker <walker@omnisterra.com> wrote:
If you want something a bit more polished, you might check out "nrm", available at http://www.omnisterra.com/walker/linux/nrm.html
Thank you :)
Cool. I'd never heard of it.
billo
On Wed, 2013-03-27 at 23:58 -0700, Rick Walker wrote:
There is an associated cron job that runs every day to permanently remove all deleted files that are older than a configurable age. The default is 3 days so you can get things back on Monday that you nuked on Friday.
I prefer to run a daily cron job to back up my home directory, which also allows me to look back over previous versions. I happen to use rsnapshot for this, but plenty of other solutions exist.

At some point a pseudo-rm is going to fail, either because the implementer didn't handle some corner case correctly, or because you typed rm instead of nrm, or because a file was removed by a program without anyone typing anything, and the only solution is to have a backup. So if you're going to have a backup anyway, what's the point?
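To give an idea of the shape of such a setup, a minimal rsnapshot configuration plus a cron entry is roughly this (the paths, retention and times are placeholders, not my actual configuration):

    # /etc/rsnapshot.conf (excerpt); note that in the real file the
    # fields must be separated by tabs, not spaces
    snapshot_root   /backup/snapshots/
    retain          daily   7
    backup          /home/someuser/     localhost/

    # crontab entry: take a "daily" snapshot every night
    30 3 * * *      /usr/bin/rsnapshot daily

Each snapshot looks like a full copy of the tree, but unchanged files are hard-linked to the previous snapshot, so the incremental cost in space and time is small.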
poc
At some point a pseudo-rm is going to fail, either because the implementer didn't handle some corner case correctly, or because you typed rm instead of nrm, or because a file was removed by a program without anyone typing anything, and the only solution is to have a backup. So if you're going to have a backup anyway, what's the point?
Let's say we did a daily backup at 9:00 am and deleted some files by mistake at 5:00 pm. Then what?
A daily backup plus some kind of control mechanism prior to rm (ls and/or nrm) seems to solve the above-mentioned scenario. Anyway, that's what I've learned from my experience.
On Fri, 2013-03-29 at 17:40 +1100, Celik wrote:
Let's say we did a daily backup at 9:00 am and deleted some files by mistake at 5:00 pm. Then what?
If you're worried about that, then back up on an hourly basis, or even more frequently. Modern backup solutions such as rsnapshot or obnam do this incrementally and at very low cost. Then you can lose at most an hour's work. Given that much of that work will be program-generated, and another substantial amount will come from user apps which have their own internal backups (e.g. word processors etc.), the potential for catastrophic loss is reduced considerably. YMMV of course.
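With rsnapshot, for example, that's just an extra retention level and a more frequent cron entry, roughly like this (again a sketch, not a drop-in configuration):

    # rsnapshot.conf (excerpt, tab-separated): keep 24 hourly and 7 daily snapshots
    retain  hourly  24
    retain  daily   7

    # crontab: hourly snapshots on the hour, daily rotation late at night
    0 * * * *       /usr/bin/rsnapshot hourly
    50 23 * * *     /usr/bin/rsnapshot daily

Only the lowest level (hourly here) does an actual rsync; the higher levels just rotate existing snapshots, so the extra runs stay cheap.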
A daily backup plus some kind of control mechanism prior to rm (ls and/or nrm) seems to solve the above-mentioned scenario. Anyway, that's what I've learned from my experience.
Do you also have regular backups? If not, I'm afraid you're going to be disappointed sooner or later.
poc
On Sat, Mar 30, 2013 at 4:22 AM, Patrick O'Callaghan <pocallaghan@gmail.com> wrote:
Do you also have regular backups? If not, I'm afraid you're going to be disappointed sooner or later.
I carry out three types of backups.
Backup type-1: complete home directory, done once a month or so, using tar to external hard-drive #1.
Backup type-2: for projects completed or adjusted, using tar to external hard-drive #2. Backup intervals vary.
Backup type-3: during code development, using "cp -rf" to a backup folder under my home directory. I use this to make backups of my code.
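In command terms these are roughly the following (the mount points and the project name are just placeholders for how I have things laid out):

    # type-1: whole home directory to external hard-drive #1, once a month or so
    tar czf /mnt/ext1/home-$(date +%Y%m%d).tar.gz -C "$HOME" .

    # type-2: a completed or adjusted project to external hard-drive #2
    tar czf /mnt/ext2/myproject-$(date +%Y%m%d).tar.gz -C "$HOME/projects" myproject

    # type-3: working copy of the code into a backup folder under my home directory
    cp -rf "$HOME/projects/myproject" "$HOME/backup/myproject-$(date +%Y%m%d)"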
In this thread it has been suggested that we do daily backups, but it wasn't clear whether the daily backups should be for the whole system or only for the current project we are working on.
An issue that has been bothering me, and has made me reluctant to make regular backups, is that a complete backup is costly: it takes 4-5 hours minimum, and it causes storage problems because the compressed home directory is so large.
On Mon, 2013-04-01 at 15:40 +1100, Celik wrote:
In this thread it has been suggested that we do daily backups, but it wasn't clear whether the daily backups should be for the whole system or only for the current project we are working on.
Only you can answer that. Ask yourself what you can afford to lose.
An issue that has been bothering me, and has made me reluctant to make regular backups, is that a complete backup is costly: it takes 4-5 hours minimum, and it causes storage problems because the compressed home directory is so large.
Hardly surprising if you're doing it with tar, which makes a complete copy every time. That's the wrong approach. As I mentioned earlier, rsnapshot, obnam and others can do incremental backups which take up little space. They are also better run as cron jobs, i.e. automatically. I run mine when my system is quiescent (4am every day), but again that's your call.
And it goes without saying that the backup should be to another machine, e.g. using rsync or a net-mounted filesystem.
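For instance, pushing the snapshot tree to another box over ssh can be as simple as this (the hostname and paths are placeholders):

    # copy the local snapshots to another machine over ssh;
    # -H preserves the hard links rsnapshot uses, so the remote copy stays compact
    rsync -aH --delete /backup/snapshots/ backuphost:/srv/backups/myhost/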
Note that these are recommendations for a personal system that you can't afford to lose. For larger systems you might want to look at solutions such as Amanda or Bacula (not that these won't work on a personal system as well, they just need a bit more setting up).
poc
On Tue, Apr 2, 2013 at 1:03 AM, Patrick O'Callaghan <pocallaghan@gmail.com> wrote:
Note that these are recommendations for a personal system that you can't afford to lose. For larger systems you might want to look at solutions such as Amanda or Bacula (not that these won't work on a personal system as well, they just need a bit more setting up).
Thank you