Hi,
I want to back up files older than 20 days from a directory to a tape drive on a remote Solaris machine.
The files on my local machine are named in the format abc-20100301000000.gz.
I know the commands below for searching for files older than x days and for the backup procedure:
solar1 # ufsdump 0uf Hostname OR IP:backup_drive_name files_path
solar1 # find backup_files_path -mtime +10
I want to combine these commands in my shell script to back up the files older than 20 days.
Can someone suggest a way to do this?
ufsdump 0u $(find /path/to/directory -type f -name "abc-*.gz")
but that's probably not the right way to use ufsdump unless you have already made a full level 0 dump.
Tar, cpio, pax or zip would be better and more portable choices.
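For example, a tar-based approach could look something like the sketch below. All the paths here are demo placeholders (not from your setup), and the demo builds a local archive so you can see it work; the remote-tape variant is sketched in a comment, with the hostname and tape device being assumptions you would replace with your own.

```shell
# Demo: archive files older than 20 days with tar instead of ufsdump.
# All paths below are placeholders for illustration.
SRC=/tmp/backup_demo          # directory holding the abc-*.gz files
mkdir -p "$SRC"

# Create two sample files and age one of them past 20 days (demo only).
touch "$SRC/abc-20100301000000.gz" "$SRC/abc-20100325000000.gz"
touch -t 201002010000 "$SRC/abc-20100301000000.gz"   # fake an old mtime

# Let find select the old files and hand them to tar via -exec ... +
# (this avoids expanding a huge file list on the command line).
find "$SRC" -type f -name 'abc-*.gz' -mtime +20 \
    -exec tar -cf /tmp/old_files.tar {} +

# To stream the archive to a remote tape drive instead, something like:
#   tar -cf - <files> | ssh remotehost dd of=/dev/rmt/0n
# (remotehost and /dev/rmt/0n are assumptions; use your own host/device).
```

After this runs, /tmp/old_files.tar contains only the file whose mtime is older than 20 days; the freshly touched file is left out.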
Hi,
Thanks solaris_user and jlliagre for your replies.
But I am still stuck on the problem of taking the backup of the files.
I have a directory from which I want to find the files older than x days and move them to a temp directory.
The problem is that when I run the command I get a "/usr/bin/find: Arg list too long" error, so I am unable to move on to the ufsdump command.
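That error usually means the shell expanded an unquoted wildcard (e.g. abc-*.gz) into thousands of arguments before find even ran. If you quote the pattern so find itself does the matching, and let -exec perform the move, the argument list never blows up. A minimal sketch, with all directory names being demo placeholders:

```shell
# "Arg list too long" fix: quote the -name pattern and use -exec for the move.
# All paths below are placeholders for illustration.
SRC=/tmp/find_demo/src
DEST=/tmp/find_demo/tempdir
mkdir -p "$SRC" "$DEST"

# Create one sample file and age it past 20 days (demo only).
touch "$SRC/abc-20100301000000.gz"
touch -t 201002010000 "$SRC/abc-20100301000000.gz"

# find matches 'abc-*.gz' internally (pattern is quoted, so the shell
# never expands it) and moves each old file to the temp directory.
find "$SRC" -type f -name 'abc-*.gz' -mtime +20 -exec mv {} "$DEST" \;
```

The `-exec ... \;` form runs mv once per file, which is slower than a batched move but portable to Solaris find, which lacks GNU extensions like `mv -t`.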