What extra parameters can I use for archiving log files?

Hello All,

I have developed a script which takes the following parameters from an input file to archive log files:

1) Input path
2) File pattern (*.csv)
3) Number of days (+1)

Following is the algorithm of my script:

  1. Read the input file.
  2. Go to that path and search for files older than n days.
  3. Print the file names, dump all the files into a new tar file, and remove them from the original directory.
     The tar file I am creating is named using the system time.
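The steps above can be sketched roughly as follows. This is only an illustration, not the OP's actual script: the input-file layout (one value per line), the variable names, and the demo files are all assumptions, and `-T -` / `--remove-files` are GNU tar options.

```shell
#!/bin/sh
# Demo setup: a scratch directory with one old file and one new file.
SRC_DIR=$(mktemp -d)
touch -d '3 days ago' "$SRC_DIR/old.csv"   # GNU touch; simulates an n-day-old log
touch "$SRC_DIR/new.csv"                   # recent file, should NOT be archived

PATTERN='*.csv'   # 2) file pattern
DAYS=+1           # 3) number of days, passed straight to find -mtime

# Tar file named from the system time, as in the OP's script
TARFILE="archive_$(date +%Y%m%d%H%M%S).tar"

# Find files older than n days, print them, tar them, remove the originals.
find "$SRC_DIR" -maxdepth 1 -type f -name "$PATTERN" -mtime "$DAYS" -print \
  | tar -cf "$TARFILE" -T - --remove-files

tar -tf "$TARFILE"   # list what ended up in the archive
```

On non-GNU systems the `-T -` / `--remove-files` combination would have to be replaced with an explicit loop over the `find` output.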

Now my question is: what other parameters should I consider to make my script more efficient and powerful?

please help.

Thanks

My thoughts, which I hope will be helpful!

  1. Use compression! For example, gzip is a very popular choice.
  2. Get your script to delete old backups, e.g. use the find command to search for and delete files older than 7 days or so.
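Both suggestions together might look like the sketch below. The directory, file names, and the 7-day retention are assumptions for illustration; `-delete` is a GNU find option.

```shell
#!/bin/sh
# Scratch directory standing in for the real backup location (assumption).
BACKUP_DIR=$(mktemp -d)

# Suggestion 1: let tar gzip the archive as it writes it (-z flag).
echo "sample log entry" > "$BACKUP_DIR/app.log"
tar -czf "$BACKUP_DIR/archive_$(date +%Y%m%d).tar.gz" -C "$BACKUP_DIR" app.log

# Suggestion 2: rotate old backups - delete compressed archives older than 7 days.
# (On systems without GNU find, use -exec rm {} \; instead of -delete.)
find "$BACKUP_DIR" -maxdepth 1 -type f -name '*.tar.gz' -mtime +7 -delete

ls "$BACKUP_DIR"   # the freshly created archive survives rotation
```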

Thanks amlife, I'm just browsing the forum to get more ideas for my problem.

Does anyone know whether a tar file will still be created if a file is 200 MB or larger?

Is there any restriction in UNIX/Linux that limits the size of a tar file?


Can anyone please provide their inputs?

On older systems there is often a 2 GB limit, but it comes from 32-bit file offsets in the filesystem or utilities rather than from tar itself; the original ustar format also limits each individual member file to 8 GB. Modern GNU tar built with large-file support has no such practical limit. Check your own operating system.
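To answer the 200 MB question directly: that size is far below any of these limits, which a quick test confirms. The paths here are assumptions; `truncate` is a GNU coreutils tool that creates a sparse file of the given size.

```shell
#!/bin/sh
# Create a 200 MB sparse file and verify tar archives it without complaint.
BIG=$(mktemp)
truncate -s 200M "$BIG"            # 200 MB test file (sparse, so no real disk cost)
tar -cf /tmp/big_test.tar -C "$(dirname "$BIG")" "$(basename "$BIG")"
tar -tf /tmp/big_test.tar          # the file is listed, so the archive was created
```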