tar + gzip + split together

Hi All,
I need guidance on the following requirement.
We have a directory structure containing approximately 100 GB of data.
We need to tar the structure, then compress it, and create files of no more than 10 GB each.
A separate tar file and then a .gz should not be created; instead, a script is needed that tars and gzips on the fly, producing split, compressed files of no more than 10 GB each.
Please advise.
I would really appreciate it.

Thanks
Aamir
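For reference, the on-the-fly tar-then-gzip-then-split pipeline the question describes could be sketched like this. GNU tar and GNU split are assumed; the scratch directory and the 100K piece size are demo placeholders (a real run would point tar at the real tree and use split -b 10G):

```shell
# Demo of the on-the-fly pipeline using a small scratch directory.
# For the real 100 GB job, point tar at the real tree and use
# 'split -b 10G'.  GNU tar and GNU split are assumed.
set -e
work=$(mktemp -d)
mkdir -p "$work/data"
head -c 300000 /dev/urandom > "$work/data/big.bin"

cd "$work"
# tar streams the tree, gzip compresses the stream, and split cuts
# the compressed stream into fixed-size pieces; no intermediate
# .tar or .gz file is ever written to disk.
tar cf - data | gzip | split -b 100K - archive.tar.gz.part-

# Restore: concatenate the pieces in order and untar.
mkdir restore
cat archive.tar.gz.part-* | gunzip | tar xf - -C restore
cmp data/big.bin restore/data/big.bin && echo "round trip OK"
```

Note the trade-off: the pieces are slices of one compressed stream, so all of them are needed (in order) to restore anything.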

Are you getting any error messages when attempting this?

What you can do is go into the directory holding the 100 GB of data. Once in the directory, issue a du -sk * | sort -rn

That will list all the subdirectories within the directory and their sizes, from biggest to smallest.

Using that information, add up the directory sizes. Pick the first 2, 3, 4, or however many it takes to add up to about 10 GB, then create a tar archive for that first 10 GB like this:

Create the tar archive: tar cf /name-to-give-archive /directories-to-archive

Example: tar cf /var/tmp/jamesbond1.tar /usr/impt/dir1 /usr/impt/dir2 /usr/impt/dir3 ...etc

For the second 10 GB of data, do the same: add up the remaining directory sizes until they reach 10 GB, then archive those as well.

Example: tar cf /var/tmp/jamesbond2.tar /usr/impt/dir4 /usr/impt/dir5 /usr/impt/dir6 ...etc
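A sketch of this add-up-and-archive procedure as a script, using a greedy grouping. The scratch tree, archive names, and 60 KB limit are demo placeholders; for the real job you would cd into the 100 GB tree and set LIMIT_KB=10485760 (10 GB):

```shell
# Greedy version of the manual procedure above: list subdirectories
# from biggest to smallest with 'du -sk | sort -rn', keep adding them
# to the current group until the next one would exceed the limit,
# then tar the group.  Demo tree and 60 KB limit are placeholders;
# a real run would use the 100 GB tree and LIMIT_KB=10485760.
set -e
work=$(mktemp -d)
for d in dir1 dir2 dir3 dir4; do
    mkdir -p "$work/$d"
    head -c 20000 /dev/urandom > "$work/$d/data.bin"
done

cd "$work"
LIMIT_KB=60
du -sk -- */ | sort -rn > sizes.txt

n=1; total=0; group=""
while read -r kb dir; do
    if [ -n "$group" ] && [ $((total + kb)) -gt "$LIMIT_KB" ]; then
        tar cf "archive$n.tar" $group    # $group unquoted on purpose:
        n=$((n + 1)); total=0; group=""  # it holds several dir names
    fi
    total=$((total + kb))
    group="$group $dir"
done < sizes.txt
[ -n "$group" ] && tar cf "archive$n.tar" $group
ls archive*.tar
```

The sizes are read from a file rather than piped into the loop so the final leftover group is still visible after the loop ends.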

Check the tar man page for the '-M' and '-F' flags.
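For what it's worth, a sketch of GNU tar's multi-volume mode using those flags. The 100 KB volume size and the volume-name script are demo placeholders (a real run would use -L 10485760 for 10 GB volumes), and note that -M cannot be combined with -z, so each volume has to be compressed afterwards:

```shell
# GNU tar multi-volume sketch: -M splits at -L kilobytes per volume,
# and -F runs a script that hands tar the next volume's file name.
# The 100 KB volume size is for the demo; a real run would use
# -L 10485760 (10 GB).  Multi-volume archives cannot be gzipped
# with -z, so the volumes are compressed after the fact.
set -e
work=$(mktemp -d)
cd "$work"
mkdir data
head -c 200000 /dev/urandom > data/big.bin

# Volume-change script: GNU tar exports TAR_VOLUME and TAR_FD and
# reads the next archive name from file descriptor TAR_FD.
cat > next-volume.sh <<'EOF'
#!/bin/sh
echo "vol$TAR_VOLUME.tar" >&$TAR_FD
EOF
chmod +x next-volume.sh

tar -c -M -L 100 -F ./next-volume.sh -f vol1.tar data
gzip vol*.tar
ls vol*.tar.gz
```

Restoring would mean gunzipping the volumes and extracting with the same -M and -F flags.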

Hi,
The problem is there isn't one 100 GB file. We are doing a datacentre move, so there are lots of servers and huge directories to be moved, and we need a script that would make that easier.

Perhaps you could provide a little more information on the scenario, mainly what the directory structure looks like that you are trying to archive.

Agree with deepak; we need more information. If you could provide more details about the scenario, we might be able to help you out.

So the media you will be using for the move doesn't allow more than 10 GB to be moved at a time, and you would like the files not to depend on each other, in case one of them gets corrupted, right?

Of course, before you compress something, you can't know whether it will be more or less than 10 GB after compression (unless it's already less than 10 GB, in which case it's a pretty safe bet). Could you create smaller archives and fill each 10 GB of media with as many as will fit?
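That fill-as-many-as-fit idea could be sketched as one gzipped tar per top-level directory, so each piece stands alone and a corrupt piece costs only its own directory. The scratch tree here stands in for the real data:

```shell
# One independent .tar.gz per top-level directory; the pieces can
# then be grouped onto the 10 GB media by size, and losing one
# piece only loses that directory.  Scratch tree is a placeholder.
set -e
work=$(mktemp -d)
mkdir -p "$work/src/alpha" "$work/src/beta"
head -c 50000 /dev/urandom > "$work/src/alpha/a.bin"
head -c 50000 /dev/urandom > "$work/src/beta/b.bin"

cd "$work/src"
for d in */; do
    name=${d%/}            # strip trailing slash for the file name
    tar czf "../$name.tar.gz" "$d"
done
cd ..
ls ./*.tar.gz
```

Each archive restores independently with a plain tar xzf, which is the point: no piece depends on any other.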

Hi.

Note - the original question is well over a year old ... cheers, drl