NB: This is a repost on this blog of a post made on nixers.net
No, this isn’t a post trashing shell scripting.
Handling files on the command line is most of the time a non-reversible process, and a dangerous one in some cases (see the Unix Horror Stories). There are tricks to avoid unnecessary loss and to help recover files if need be.
Users do not expect that anything they delete is permanently gone. Instead, they are used to a “Trash can” metaphor. A deleted document ends up in a “Trash can”, and stays there at least for some time — until the can is manually or automatically cleaned.
In this thread we’ll list what ideas we have on this topic, novel or not so novel.
There’s the usual aliasing of rm to mv into a specific directory: a trash can for the command line. This can be combined with a cron job or timer that cleans files in this directory that are older than a certain age.
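A minimal sketch of that idea — the paths follow the XDG layout, but the function name `trash` and the 30-day limit are arbitrary choices, not any standard:

```shell
# Trash directory following the XDG layout (location is conventional):
TRASH="${XDG_DATA_HOME:-$HOME/.local/share}/Trash/files"
mkdir -p "$TRASH"

trash() {
    # Move instead of deleting; -i prompts before overwriting a file
    # already sitting in the trash under the same name.
    mv -i -- "$@" "$TRASH/"
}

# In a daily cron job or systemd timer, purge entries older than 30 days:
# find "$TRASH" -mindepth 1 -mtime +30 -exec rm -rf -- {} +
```

Shadowing rm itself with an alias works too, but a separate name avoids surprises on machines where the alias isn’t set.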
You can check the actual XDG trash documentation, which goes into great detail about what needs to be taken into consideration. The trash directory lives under $XDG_DATA_HOME/Trash and is usually at least split into two directories:
- files, for the exact copies of the deleted files and directories (including their permissions; two entries with the same name should not override each other)
- info, which records where the deleted file came from and what name it had, in case it needs to be restored, as well as the date it was deleted
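Concretely, per the freedesktop.org Trash specification, deleting a file produces a matching pair of entries under the trash directory. The filename and date below are made up for illustration:

```
files/notes.txt            <- the file itself, permissions preserved
info/notes.txt.trashinfo   <- its metadata, a small key-value file:

[Trash Info]
Path=/home/user/notes.txt
DeletionDate=2024-05-01T14:03:22
```

The Path and DeletionDate keys are what make restoring the file to its original location possible later.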
Another way to avoid losing files is to keep backups of the file system. This can be done via snapshots, whether built into the file system itself (ZFS, btrfs, etc.) or provided by a logical volume manager.
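Real snapshots come from the filesystem or volume manager (e.g. `zfs snapshot pool/fs@today` or `btrfs subvolume snapshot src dst`). As a rough illustration of the principle, here is a poor man’s equivalent using only coreutils: later snapshots are hard-link copies, so unchanged files cost no extra space. Paths here are temporary and purely for demonstration:

```shell
# Demonstration directories (throwaway):
src=$(mktemp -d)
snaps=$(mktemp -d)
echo "important" > "$src/doc.txt"

# First snapshot: a full copy.
cp -a "$src" "$snaps/snap-1"

# Later snapshot: hard links into the previous one, near-zero extra space.
cp -al "$snaps/snap-1" "$snaps/snap-2"
```

Filesystem-native snapshots do this at the block level and atomically, which is why ZFS or btrfs are the better tool when available.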
So, what’s your command-line trash? How do you deal with avoidable losses?
If you want to have a more in-depth discussion, I’m always available by email or IRC. We can discuss and argue about what you like and dislike, new ideas to consider, opinions, etc.
If you don’t feel like "having a discussion" or are intimidated by email, you can simply say something small in the comment section below and/or share this with your friends.