Back up files to a tape drive on Solaris

Hi,
I want to back up files older than 20 days from a directory to a tape drive on a remote Solaris machine.
The files on my local machine are named in the format abc-20100301000000.gz.
I know the commands below for finding files older than x days and for the backup itself.

solar1 # ufsdump 0uf Hostname OR IP:backup_drive_name  files_path
solar1 # find backup_files_path -mtime +10

I want to combine these commands in my shell script to back up the files older than 20 days.
Can someone suggest how to do this?

Regards,
Jyothi

for FILE in `find /path/to/directory -mtime +20`
do
    echo $FILE
    /usr/sbin/ufsdump 0uf Hostname OR IP:backup_drive_name  files_path
done

That won't work.

It might be something like this:

ufsdump 0u $(find /path/to/directory -type f -name "abc-*.gz" -mtime +20)

but it's probably not the right way to use ufsdump unless you already made a full level 0 dump.
Tar, cpio, pax or zip would be better and more portable choices.
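
For example, with cpio a find pipeline can stream the old files straight to a tape on the remote machine. This is only a sketch: the host name, the no-rewind tape device /dev/rmt/0n, the block size and the use of ssh are assumptions you would have to adapt to your setup.

#!/bin/sh
# Sketch: archive abc-*.gz files older than 20 days and stream them to a
# tape drive on a remote host (host, device and paths are placeholders).
SRC_DIR=/path/to/directory      # directory holding the abc-*.gz files
REMOTE=remotehost               # remote machine with the tape drive
TAPE=/dev/rmt/0n                # no-rewind tape device on that machine

cd "$SRC_DIR" || exit 1

# find feeds the file list to cpio on stdin, so nothing is expanded by the
# shell and there is no argument-length limit to run into; cpio writes the
# archive to stdout and dd on the remote side puts it on the tape.
find . -type f -name "abc-*.gz" -mtime +20 \
    | cpio -oc \
    | ssh "$REMOTE" "dd of=$TAPE bs=32k"

Restoring would be the same pipeline in reverse: dd reads from the tape on the remote host and the stream is piped into cpio -icd locally.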

jlliagre, thanks for the useful information :wink:

Hi,
Thanks solaris_user and jlliagre for your replies.
But I am still stuck on the problem of backing up the files.
I have a directory in which I want to find the files older than x days and move them to a temp directory.
But the problem is that when I run the command I get a "/usr/bin/find: Arg list too long" error, so I cannot move on to the ufsdump step.

I am using the find command as below:

find /dir_path/*.gz -mtime +148 -exec mv {} temp \;

Am I missing something, or is there another way to find the files and move them?

Please guide me.

Regards,
Jyothi

This will work much better (the error comes from the shell expanding /dir_path/*.gz into too many arguments before find even runs; passing just the directory and selecting with -name avoids that):

find /dir_path -name "*.gz" -mtime +148 -exec mv {} temp +

Thanks jlliagre for the command, but it gave a "find: incomplete statement" error.
So I replaced the + with \; and ran the command as

find /dir_path -name "*.gz" -mtime +148 -exec mv {} temp \;

And guess what... it is working! :slight_smile:

Regards,
Jyothi

The "+" variant was indeed unsuitable in that specific case: with -exec ... +, the {} must be the last argument before the +, so it cannot be followed by the temp destination the way mv needs it. The \; form runs mv once per file, which is slower but works.
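
If the batching behaviour of + is ever wanted together with mv, one portable workaround is to let a small inline sh script put the arguments where mv expects them. This is just a sketch: /dir_path/temp stands in for your staging directory, and it assumes the find on your box accepts the POSIX {} + syntax at all.

# Batch the matching files and append the destination inside the inline script;
# the extra "sh" argument becomes $0 and the file names become "$@".
find /dir_path -name "*.gz" -mtime +148 -exec sh -c 'mv "$@" /dir_path/temp' sh {} +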