Having set up a backup server for work and home, I was looking into how to remove archives that were a week or older. Initially, I wrote a simple script to search a path for files that matched a naming convention and whose creation dates exceeded the week limit. This was fine and dandy, but it wasn't the most elegant approach and didn't really increase my Linux-fu. So I decided to delve deeper and find a good one-liner that embraces the Linux way of life. My result:

rm `find /path/to/backups -mtime +7 -name '*.bz2'`

The breakdown:

  1. rm - remove files, any n00b knows that…
  2. the backticks (`) surrounding the argument tell the shell to run the command inside them and substitute its output as the arguments to rm (a more robust variant is sketched after this list). The backtick key can be found to the left of the number one on most keyboards.
  3. find does just that: it finds files and prints their paths to stdout, generally the screen.
  4. /path/to/backups is the path find begins its search from…
  5. -mtime +7 says to find files whose modification time is more than 7 days ago. find measures age in whole 24-hour periods, so this effectively matches files more than a week in the past.
  6. -name '*.bz2' matches any file ending with the .bz2 extension. The single quotes keep the shell from expanding the glob, so the pattern is passed to find intact.

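One caveat with the backtick form: the shell splits find's output on whitespace, so file names containing spaces will confuse rm, and an empty result leaves rm with no arguments at all. If the criteria or filenames ever get fancier, it may be safer to let find do the removal itself. A rough sketch, assuming your find supports the (very common, but non-POSIX) -delete action:

# preview what would be removed
find /path/to/backups -mtime +7 -name '*.bz2' -print

# let find delete the matches itself
find /path/to/backups -mtime +7 -name '*.bz2' -delete

# or, the portable fallback: hand the matches to rm in batches
find /path/to/backups -mtime +7 -name '*.bz2' -exec rm {} +

The -exec form is plain POSIX find, so it works even where -delete isn't available.
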
Slap that sucker into a cron job and you have yourself an automated way to make sure nothing in the backup directory exceeds a certain age. find is enormously useful and has so many wonderful options to assist you in your file searching. So there you have it, a simple one-liner to remove files based on their timestamp.
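
For what it's worth, a crontab entry along these lines would run the cleanup nightly (the 3 a.m. schedule and the path are placeholders, adjust to taste):

# minute hour day-of-month month day-of-week  command
0 3 * * * find /path/to/backups -mtime +7 -name '*.bz2' -delete

Since cron runs with a minimal environment, it's worth running the command by hand once before trusting it with your backups.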