Compress every file

Dear Experts,

I am new to this forum. Thank you to the moderator for allowing me to join.

I have a question about automatic compression using sh and crontab
for an application which runs on SUSE Linux Enterprise Server 11 SP4.

My question is: how do I compress every file in a directory into its own tar archive whilst preserving each file's name?
Here is my sh code:

#!/bin/sh

DB_DATA_PATH="/db/shr/bcksrv/bck/IPAddress/DBX/Foldera"
DB_TAR_PATH="/db/shr/folderb"
DB_DATA

LINES=$(find "$DB_DATA_PATH" -maxdepth 1 -mtime -1 -name "bck_F1*" | wc -l)
if [ $LINES -gt 0 ]; then
	find "$DB_DATA_PATH" -type f -mtime +2 -iname "bck_F1*" -exec gzip -f '{}' ';'
fi

The above sh code is not working. Any input/ideas would be greatly appreciated. Thank you.

---------- Post updated at 03:35 PM ---------- Previous update was at 09:52 AM ----------

Is anything wrong with my question? Please let me know.

Welcome, Steven_2975,

I have a few observations:-

  • I don't know what command DB_DATA does. Is that important?
  • I don't see why you count the items to test if you need to do the work. If the second find command does not select anything, the -exec section does not run.
  • If the depth of the directory/filename gets very long, your commands may fail. It might be better to cd to $DB_DATA_PATH, check that you have done so, and then run find with a directory of . instead.
  • I'm not sure you need the single quotes around the '{}', though the ';' does need to be quoted or escaped so the shell doesn't interpret it.
  • When you say it's 'not working', in what way is it not? Do you have any output/errors?
  • For a dummy run, use ... -exec echo gzip ... so you make sure you don't break anything whilst working on it (see the sketch after this list).
  • In your first find, you have options -maxdepth 1 -mtime -1 set when you are counting, but not when you execute the second find. Which is the correct configuration?
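
Putting a few of those points together, a minimal, untested sketch (assuming the -maxdepth 1 from your first find and the -mtime +2 age test from your second find are what you actually intend) might look like this:

#!/bin/sh

DB_DATA_PATH="/db/shr/bcksrv/bck/IPAddress/DBX/Foldera"

# Work from the directory itself so find only deals in short, relative paths
cd "$DB_DATA_PATH" || exit 1

# No need to count matches first; -exec simply runs nothing if nothing matches.
# Leave the echo in for the dummy run, then remove it to gzip for real.
find . -maxdepth 1 -type f -mtime +2 -iname "bck_F1*" -exec echo gzip -f {} \;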

I'm sure we can help you work this through to achieve what you want.

Kind regards,
Robin

On top of what rbatte1 said (esp. list item 5!), you're talking of "compress ... into its own tar" but are not using the tar command anywhere in your code?

Hi Robin,

Many thanks for your reply.

DB_DATA_PATH is the variable for the sub-directory that is created to store the backup files of an application database backup.
So the actual path of the database backup is /db/shr/bcksrv/bck/IPAddress/DBX/Foldera.

One of these sub-directories is
/db/shr/bcksrv/bck/IPAddress/DBX/Foldera/bck_F120170527.

The only entries in the main directory are backup folders like this one. Each backup folder consists of two sub-directories, i.e. ept & ine, and both of them contain backup files of the application database.

I compress manually with the following command:
zip -r bck_F120170527.zip /db/shr/bcksrv/bck/IPAddress/DBX/Foldera/bck_F120170527/ept /db/shr/bcksrv/bck/IPAddress/DBX/Foldera/bck_F120170527/ine

But now, I want to compress these backup folders automatically every 2 days.
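
For the schedule, I am thinking of a crontab entry like this (the script path is only a placeholder for wherever I put the sh file):

0 2 */2 * * /path/to/compress_backup.sh

As far as I understand, that runs at 02:00 on every second day of the month, although the count restarts at the 1st of each month.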

I am using this new modified code, but without success:

#!/bin/sh

DB_DATA_PATH="/db/shr/bcksrv/bck/IPAddress/DBX/Foldera"
DB_TAR_PATH="/db/shr/folderb"
DB_DATA

LINES=$(find "$DB_DATA_PATH" -maxdepth 1 -mtime -1 -name "bck_F1*" | wc -l)
if [ $LINES -gt 0 ]; then
find "$DB_DATA_PATH" -type f -mtime +2 -iname "bck_F1*" -exec tar czvf {}.tar.gz -f "$DB_TAR_PATH" ';'
fi

The new sh script is still not working. Any help to repair the code is really appreciated. Thank you.

---------- Post updated at 12:36 PM ---------- Previous update was at 12:18 PM ----------

Thanks for your good info. I have repaired the code now. Please help to check my code.

Please become accustomed to providing decent context info for your problem.

You've been asked to explain WHAT and HOW is not working, and to show any system (error) messages verbatim, to avoid ambiguities and keep people from guessing.

EDIT: You seem to supply two -f options to tar, and no file(s) to archive.
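
For what it's worth, a minimal, untested sketch of one archive per backup folder (assuming the bck_F1* entries are directories, as your zip example suggests; the -mtime age test is left out for brevity) could look like this:

#!/bin/sh

DB_DATA_PATH="/db/shr/bcksrv/bck/IPAddress/DBX/Foldera"
DB_TAR_PATH="/db/shr/folderb"

# One archive per backup folder, named after the folder itself.
# The trailing slash in the glob matches directories only.
for dir in "$DB_DATA_PATH"/bck_F1*/; do
    [ -d "$dir" ] || continue          # skip if the glob matched nothing
    name=$(basename "$dir")
    # -C changes into the parent first, so the archive holds relative paths
    tar czf "$DB_TAR_PATH/$name.tar.gz" -C "$DB_DATA_PATH" "$name"
done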

Yes, sure. I actually want to compress every file in a directory into its own tar whilst preserving each file's name, automatically, using a cron job.
To do this, I am using the sh code that I have pasted here, but the code is not working. I am sure the code is wrong.

So, to answer your question: WHAT is not working: it is the sh code that I have pasted here.
HOW it is not working: the code is wrong.

Since my code is not working, I would welcome new sh code from any of you here. If it is different from mine, that is okay, as long as it works.
Otherwise, please help to repair my sh code. I really appreciate it. Thank you very much.

Rgds,
Steven