Shell script to check a file and delete old files

Hello,
I need help with a shell script that checks whether a file exists under a directory, checks the age of the file, and deletes it if it is older than 3 weeks.

thanks

What have you tried so far? Have you used the search function of the forum? There are also some related topics at the bottom of the page here.

I have tried the below one line command/script.

find /nim/dr/HMCBackup/  -type f -mtime +31 -exec ls -ltr {} \;

The thing is, the files that I am specifically looking for end in .tgz format. Is there a way to pick out just the .tgz files under the /nim/dr/HMCBackup directory?

Also, I need to set up an email notification if there are no *.tgz files.

Thanks

Did you consider find's -name test? Tests and actions can also be grouped to fulfill complex tasks with find.

BTW, 31 days is way more than 3 weeks.

Actually, I need to check for files older than 30 days, not 3 weeks. I did not get what you mean by grouping tests and actions.

Example from man find:
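
For instance, tests and actions can be grouped with escaped parentheses. A rough sketch (the path is a placeholder, not the manual's verbatim example) that removes .tgz files older than 30 days and prints the rest:

find /path/to/directory -name "*.tgz" \( -mtime +30 -exec rm {} \; -o -print \)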

ls -al /nim/dr/HMCBackup/* | grep -e .tgz

The above command gives me the .tgz files. Any idea how to incorporate the age of the files into it?

Use the -name option in addition to your command:

-name "*.tgz"
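
Quote the pattern so the shell does not expand it against the current directory before find sees it. Combined with your earlier command, something along these lines (a sketch):

find /nim/dr/HMCBackup/ -type f -name "*.tgz" -mtime +30 -print          # list .tgz files older than 30 days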

Can you try the below to find .tgz files and, if none are found, send a mail?

[ -z "$(find /nim/dr/HMCBackup/ -name *.tgz)" ] || mail -s .... 
ls -al /nim/dr/HMCBackup/* | grep -e .tgz | cut -c45-90

This gives me the files with their timestamps. Sample output below:

 Oct 11 13:36 HMCBackup_20161011.123704.tgz
 May  5 2015  HMCBackup_20150505.103242.tgz

Thanks

Using cut on the output of ls -l is always fraught with danger because the column widths can vary. It seems that the file names have the creation date embedded in them in the format YYYYMMDD.HHMMSS, so you don't really need the detail from a full ls -l; it is already there in the name to work with if needed.
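
If you do need that timestamp, you can pull it from the file name itself with parameter expansion rather than cutting columns from ls -l, for example with the sample name above (a quick sketch):

f=HMCBackup_20161011.123704.tgz
stamp=${f#HMCBackup_}      # strip the prefix  -> 20161011.123704.tgz
stamp=${stamp%.tgz}        # strip the suffix  -> 20161011.123704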

Using ls -ltr within a find command has no meaning, because the files are found (and therefore listed) one by one. If your OS supports ending an -exec clause with a +, there is still a limit to how many files go into one execution of ls, so you might get several chunks of sorted output, but it is meaningless overall.

Using find you can get the file names you need and display them or do other things, e.g:-

find /path/to/directory -name "*.tgz" -mtime +31 -print                  # Display .tgz files older than 31 days
find /path/to/directory -name "*.tgz" -mtime +31 -exec rm {} \;          # Remove .tgz files older than 31 days

The order of the files depends on how they are stored in the directory rather than following any particular sort order.

What is the overall plan? It seems to me that you are overcomplicating things with lots of separate tools where find can probably do everything you need. If you can describe it well, with all the conditions, then I'm sure we can help.

I'm assuming you are using ksh on the wonderful AIX, but what version?

Kind regards,
Robin

Thanks Robin, for the response and offer to help. Let me try to describe in detail what I am trying to accomplish.
I am trying to automate the HMC (Hardware Management Console) backup on AIX. We have 6 HMCs in each Data Center, and I need to back up the HMC console data to the NIM server every month.
I am able to automate this part using the HMC itself, which SFTPs the console data, a *.tgz file, to the NIM server every month. I am trying to write a script that checks whether I have a successful HMC backup for all 6 HMCs in each Data Center and sends a notification email if an HMC backup file is missing. It also needs to rotate/delete old HMC backups (tgz files) older than 30 days.
Hopefully I am clear about what I am trying to do; any help/advice on how to do this in a better/more efficient way would be appreciated.

Thanks
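
A minimal sketch of the whole job, assuming ksh on the NIM server. The mail address and the expected count are placeholders to adjust, and the check simply counts recent .tgz files because the sample names do not say which HMC they came from:

#!/bin/ksh
# Check that the monthly HMC backups have arrived on the NIM server and
# prune the old ones.  Sketch only: adjust the placeholders below.

BACKUPDIR=/nim/dr/HMCBackup
MAILTO=someone@example.com        # placeholder address
EXPECTED=6                        # 6 HMCs per Data Center

# Count .tgz backups written in the last 31 days
count=$(find "$BACKUPDIR" -type f -name "*.tgz" -mtime -31 | wc -l)

if (( count < EXPECTED ))
then
    echo "Only $count of $EXPECTED HMC backup files found in $BACKUPDIR" |
        mail -s "HMC backup check failed on $(hostname)" "$MAILTO"
fi

# Delete backups older than 30 days
find "$BACKUPDIR" -type f -name "*.tgz" -mtime +30 -exec rm -f {} \;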