I'm writing a shell script to clean up older files from various sub-directories inside a main directory.
The directory structure is as follows:
Logs
  daily
    online
    archive
  weekly
    online
    archive
  monthly
    online
    archive
Files inside the online directories follow this naming format (as in the examples below, the timestamp includes seconds):
<name>_<variable>_<YYYY-MM-DD-HH-mm-SS>.dat
The goal is to gzip the files older than 15 days in each online directory and move the results into the corresponding archive directory. Then, when the gzipped files in an archive directory are older than 60 days, I need to delete them.
I can't rely on the file's actual timestamp, because in a few cases the files are generated on unexpected dates even though the file name carries the correct timestamp. So I need to use the date from the file name to apply the logic above. Is there a quick solution to this problem?
Some example files from the input directory:
A1_XXXX_2013-02-08-04-15-20.dat
A2_XXYYZZ_2013-02-08-04-16-40.dat
A3_ZZXXYYY_2013-02-08-04-18-30.dat
Expected gzip files in the archive directory:
A1_XXXX_2013-02-08-04-15-20.gz
A2_XXYYZZ_2013-02-08-04-16-40.gz
A3_ZZXXYYY_2013-02-08-04-18-30.gz
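Here is a sketch of the approach I have in mind, based purely on the date embedded in the file name. It assumes GNU `date` (for `date -d '15 days ago'`) and that the `<variable>` part of the name never contains an underscore, so `${name##*_}` isolates the timestamp. Note that, per the expected output above, the `.gz` name replaces the `.dat` suffix rather than using gzip's default `name.dat.gz`:

```shell
#!/bin/sh
# Archive and purge files by the date embedded in the file name.
# Assumes GNU date and underscore-free <variable> parts (see lead-in).

archive_and_purge() {
    base=$1                                       # e.g. Logs
    cutoff_archive=$(date -d '15 days ago' +%Y%m%d)
    cutoff_delete=$(date -d '60 days ago' +%Y%m%d)

    for period in daily weekly monthly; do
        online=$base/$period/online
        archive=$base/$period/archive

        # gzip .dat files whose filename date is older than 15 days,
        # writing the result into the archive directory
        for f in "$online"/*.dat; do
            [ -e "$f" ] || continue               # glob matched nothing
            name=$(basename "$f" .dat)
            stamp=${name##*_}                     # 2013-02-08-04-15-20
            ymd=$(printf %s "$stamp" | cut -d- -f1-3 | tr -d -)   # 20130208
            if [ "$ymd" -lt "$cutoff_archive" ]; then
                gzip -c "$f" > "$archive/$name.gz" && rm -- "$f"
            fi
        done

        # delete archived .gz files whose filename date is older than 60 days
        for f in "$archive"/*.gz; do
            [ -e "$f" ] || continue
            name=$(basename "$f" .gz)
            stamp=${name##*_}
            ymd=$(printf %s "$stamp" | cut -d- -f1-3 | tr -d -)
            if [ "$ymd" -lt "$cutoff_delete" ]; then
                rm -- "$f"
            fi
        done
    done
}
```

It could be run as `archive_and_purge Logs` from a cron job. One caveat of using the filename date for both thresholds: a file whose embedded date is already more than 60 days old will be archived and then deleted on the very next purge pass.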