Need command/script to archive files older than a given date

I need to archive all files older than a given date, but there are some conditions that make it difficult to find the correct command:

Linux-based system (RHEL 5)

  • there are multiple layers of directory depth I need to search
  • each file should be tar'd in its original location in the directory tree
  • the original file removed, of course (this is a space issue)
  • tar files should achieve maximum compression
  • I cannot simply delete any files; records have to be maintained

Now personally, I would prefer to pull all files from, say, July 2015 into a single tar, max-compress it, and simply write a script that the web app can call to search the tar files and extract if needed.
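Something along these lines is what I have in mind (just a rough sketch: the /var/www/data tree, the archive name, the temporary files, and 'somefile' are made up, and it assumes file names contain no whitespace):

# boundary timestamp files (portable alternative to GNU find's -newermt)
touch -t 201507010000 /tmp/stamp_start
touch -t 201508010000 /tmp/stamp_end

# list files last modified during July 2015 under the web tree
find /var/www/data -type f -newer /tmp/stamp_start ! -newer /tmp/stamp_end > /tmp/july.list

# one archive with maximum gzip compression, then remove the originals
tar cf - -T /tmp/july.list | gzip -9 > archive_2015-07.tar.gz && xargs rm -f < /tmp/july.list

# later, the web app's helper script can search the archive and extract on demand
gzip -dc archive_2015-07.tar.gz | tar tf - | grep 'somefile'
gzip -dc archive_2015-07.tar.gz | tar xf - var/www/data/path/to/somefile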


I should have said that while I would prefer to do one big tar file, I'm being told by management and the customer that I can't.

If you do not compress the tar file, it makes no sense to tar at all; a tar archive can be slightly larger than the original file. One single file per tar is pointless as well. Just compress the file in place; it becomes unreadable until it is decompressed, but nothing is deleted.

What you want to do, minus some details:

# find files in selected directories:
# you need a file with directory names, call it dirnames; create one
while read dir
do
      # -mtime +180 finds files older than 180 days; change it to the age you want in days
      # ! -name '*.gz' excludes already compressed files
      # gzip -9 gives maximum compression
      find "$dir" -type f -mtime +180 ! -name '*.gz' -exec gzip -9 {} \;
done < dirnames
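For example, a dirnames file could look like this (the directories are only placeholders), and the loop above can be saved and run as a plain shell script:

# example dirnames file; the directories are placeholders
cat > dirnames <<'EOF'
/var/www/app/uploads
/var/www/app/reports
EOF
sh archive_old.sh     # assuming the while loop above was saved as archive_old.sh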

If you absolutely have to have each file in its own tar-format archive, change the -exec command above accordingly.
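For instance, something like this could replace the gzip in the find above; it is only a sketch, and the inline shell, the .tar.gz suffix, and the exclusion pattern are one possible choice:

# tar each old file where it sits, compress at maximum level, then remove the original
find "$dir" -type f -mtime +180 ! -name '*.tar.gz' -exec sh -c '
      f=$1
      tar cf - -C "$(dirname "$f")" "$(basename "$f")" | gzip -9 > "$f.tar.gz" &&
            rm -f "$f"
' sh {} \;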