I want to create a script that archives files out of a log folder into separate folders below it, but it needs to keep the previous 2 days' worth of information. I know I can easily get a script to put the files in there, but I don't know how to select only files older than 2 days. My problem is that the file names don't contain a date, so I'll need to use the modification date shown below to decide which ones to archive:
-rw-r--r-- 1 ugpea eupuk 1824 15 Feb 14:14 submit_rep_15141419_4042930.log
-rw-rw-r-- 1 eupuk eupuk 400483 15 Feb 14:14 med_main.log
-rw-r--r-- 1 root system 4001576 15 Feb 14:15 jobwatch_eupuk.log
-rw-r--r-- 1 root system 50384302 15 Feb 14:15 socketserver_eupuk.log
-rw-rw-r-- 1 eupuk eupuk 374478 15 Feb 14:15 provision_script.log
I think I could use an awk command, but I have never used it before. Does anyone know what command I should use?
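You don't need awk for this; find's `-mtime` test selects on modification time directly. A sketch, run here in a scratch directory with a backdated demo file standing in for your logs:

```shell
# Demo in a scratch directory: one file backdated past 2 days, one fresh.
cd "$(mktemp -d)"
touch -t 202001010000 old.log   # backdated demo file
touch new.log                   # fresh demo file

# List regular files modified more than 2 days ago; the trailing grep
# drops anything in subdirectories (paths containing a second "/").
find . -type f -mtime +2 | grep -v "/.*/"
# prints ./old.log
```

In your case you would run the `find` in the log directory itself. Some older finds (e.g. on AIX) lack `-maxdepth`, which is why the grep is used to stay in the top-level directory.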
The command above will give you a list of files older than 2 days. Now use cp or mv to move those files to the archive directory and compress them as required.
The grep in the second step is there to exclude subdirectories, if there are any. With nested directories, the first step alone would return results from all of them, which is not what we want; the additional grep keeps only files from the current directory.
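Putting the steps together, an archive run might look like the sketch below. A scratch directory and demo files stand in for the real log folder; substitute your own paths and file patterns:

```shell
#!/bin/sh
# Sketch: move *.log files older than 2 days into an archive
# subdirectory and gzip them. LOGDIR is a scratch stand-in here.
LOGDIR=$(mktemp -d)
ARCHDIR=$LOGDIR/archive
mkdir -p "$ARCHDIR"
touch -t 202001010000 "$LOGDIR/old.log"   # demo file, older than 2 days
touch "$LOGDIR/new.log"                   # demo file, fresh

find "$LOGDIR" -type f -name '*.log' -mtime +2 |
grep -v "/archive/" |          # skip anything already archived
while read f; do
    mv "$f" "$ARCHDIR/"
    gzip "$ARCHDIR/`basename "$f"`"
done

ls "$ARCHDIR"   # old.log.gz
```

After the run, new.log is still in place and old.log has been moved and compressed. If your system's find supports `-maxdepth 1`, you can use that instead of the grep.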
I'm getting to the game late here, but I'm interested in performing a similar function, modified slightly to remove the files that are more than 2 days old: less of an archive, more of a clean-up/forced log control.
What do you suggest as the cleanest way to add an 'rm' action to this command?
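For the clean-up variant, find can hand each match to rm itself via `-exec`, so no pipeline is needed. A sketch, again demonstrated in a scratch directory; do a dry run by swapping rm for ls before trusting it with real logs:

```shell
# Demo in a scratch directory.
cd "$(mktemp -d)"
touch -t 202001010000 old.log   # backdated past the 2-day cutoff
touch new.log

# Delete regular files older than 2 days; -exec runs rm on each match.
# Note this descends into subdirectories too -- add the grep filter
# (or -maxdepth 1 where supported) if you need to stay at the top level.
find . -type f -mtime +2 -exec rm -f {} \;

ls   # new.log
```

Only the backdated file is removed; anything modified within the last 2 days survives.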