Script problem when running on crontab

Hi guys!

I created a backup script that works fine when I run it manually, but when I schedule it as a crontab job the result is not what I expect. (It is not a timing problem.)

Here is my script:

bash-3.00# cat /bk_tool/backup2.sh
#!/usr/bin/csh

clear
             set DIR_HOST='SCP08'
             set DIR_ATS='/ATS'                                         
             set DIR_DEST='/backupAP'                                    
             set DIR_FILE='/etc/vfstab'                                  
             set DIR_ROUTER='/etc/defaultrouter'                         
             set DIR_IPV4='/etc/hosts'                                   
             set DIR_INT='/etc/hostname.*'                              
             set DIR_HOME='/export/home'                               
             set DIR_BANNER='/etc/motd'                                
             set DIR_PASS='/etc/passwd'                                 
             set DIR_MASK='/etc/netmasks'                             
             set DIR_GROUP='/etc/group'                                 
             set DIR_INIT='/etc/inittab'                              
             set DIR_MANIFEST='/var/svc/manifest/application/ats'      
             set DIR_PROFILE='/etc/profile'                              
             set DIR_EIS='/.profile-EIS'                                
             set DIR_BKP='/bk_tool'                                      
             set DIR_LOG='/bk_tool/log'                                  
             set DIR_CRON='/var/spool/cron/crontabs'                    


                        echo "`date +%Y%m%d_%H%M%S` Start Backup Aplicacao" >> $DIR_LOG/backup.log
                        find $DIR_ATS \( -name "*.log" -o -name "*.rtdb" \)     > $DIR_BKP/exclude_list
                        tar cXfv exclude_list - $DIR_ATS/* | gzip --stdout >$DIR_DEST/"$DIR_HOST"_aplicacao_`date +%Y%m%d`.tar.gz
                        echo "`date +%Y%m%d_%H%M%S` Fim Backup Aplicacao" >> $DIR_LOG/backup.log

                        rm -f `find $DIR_DEST -type f -ctime +3 -print`

My cron job at /var/spool/cron/crontabs/root:

5 0 * * * /bk_tool/backup2.sh

The status of backups:

bash-3.00# ls -lh /backupAP/
total 357478
-rw-r--r--   1 root     root          20 Feb 22 00:20 SCP08_aplicacao_20140222.tar.gz
-rw-r--r--   1 root     root          20 Feb 23 00:20 SCP08_aplicacao_20140223.tar.gz
-rw-r--r--   1 root     root        174M Feb 24 18:18 SCP08_aplicacao_20140224.tar.gz

Does anyone have an idea how to solve this problem?
Is there any way to log what is going wrong?

This is asked very often, and it is usually a problem because when the cron job runs, it does not log in with the environment that is set when you run the script manually. This also means you will have no $PATH set, or only a very short list of paths.
It is often recommended to use full paths in cron jobs, or to set/source the environment you need at the beginning of your script.
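A minimal sketch of the fix: set PATH explicitly at the top of the script. In csh syntax that would be `setenv PATH /usr/bin:/usr/sbin:/usr/local/bin` (directories assumed from a typical Solaris box; adjust to taste). The Bourne-shell demonstration below shows the effect under a cron-like empty environment:

```shell
#!/bin/sh
# Simulate cron's near-empty environment with env -i, then set PATH
# explicitly inside -- this is what the backup script should do at the top.
# In csh syntax the equivalent first line of the script would be:
#   setenv PATH /usr/bin:/usr/sbin:/usr/local/bin
env -i /bin/sh -c '
    PATH=/usr/bin:/bin:/usr/sbin; export PATH
    # With PATH set, plain command names resolve again:
    command -v tar
    command -v gzip
'
```

The same logic applies to files the script reads and writes: cron starts the job in a different working directory, so a bare filename like `exclude_list` should be written as a full path too.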

Have a read here:
cron and crontab | Unix Linux Forums | Answers to Frequently Asked Questions
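To see what is actually failing under cron, you can also redirect the job's output to a file (a sketch; the log path here is just an example):

```shell
# /var/spool/cron/crontabs/root -- capture stdout and stderr so that
# "command not found" errors from cron's minimal environment show up:
5 0 * * * /bk_tool/backup2.sh >> /bk_tool/log/cron_backup.log 2>&1
```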


It worked, zaxxon! I'm going to correct the script on my 80 machines.

I didn't understand why that happens, since I used full path names everywhere in my script.

But... that's OK!! I'm glad for your help!!

=D

One more piece of advice, unrelated to your original issue and not mandatory since Unix doesn't care much about file extensions: if you have a csh script (and insist on using the controversial C shell for scripting), you'd better call your script backup2.csh rather than backup2.sh.