Environment: HP-UX Operating System B.11.31 U ia64
I have a folder with around 2000 files, and some of the files are larger than 8 GB. I need to compress all the files in the folder into a single file so that I can transfer it.
I tried the tar command, but it fails whenever a file is larger than 8 GB.
Please advise an efficient way to compress / zip all 2000 files in this folder.
You will need to gzip all the files first, then use pax to create one big archive, since on HP-UX tar does not support reading archives greater than 8 GB (though I think it can create one).
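A minimal sketch of that approach, assuming the files live in /path/to/folder (adjust the path to your environment):
# compress each regular file in place first
gzip /path/to/folder/*
# then bundle the (now much smaller) files into a single pax archive
pax -wf folder.pax /path/to/folder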
An alternative to the above is to install GNU tar on the HP-UX machine and use it directly with compression (the -z switch).
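For example, assuming GNU tar is installed under its usual HP-UX name gtar (it may also be installed simply as tar, depending on the package):
# create a gzip-compressed archive of the whole folder in one step
gtar -czf folder.tar.gz /path/to/folder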
We can't use the cpio command, because it is limited to 2 GB, as its man page says:
Because of industry standards and interoperability goals, cpio does
not support the archival of files larger than 2 GB or files that have
user/group IDs greater than 60 K. Files with user/group IDs greater
than 60 K are archived and restored under the user/group ID of the
current process.
So tar is limited (8 GB), and cpio is even more limited (2 GB).
It's a pity that the latest HP-UX does not do anything about it.
Then the remaining options are
pax -w /path/to/folder | gzip -c > folder.tar.gz
View the contents with
gunzip -c folder.tar.gz | tar tf -
(Extract with gunzip -c folder.tar.gz | tar xf - or pipe into pax -r instead)
Or use pax's own format (extractable only with pax).
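A sketch of that variant; the -x pax interchange format is an assumption here and may not be available on every HP-UX pax build, so check pax(1) first:
# write the archive in the pax interchange format and compress it
pax -w -x pax /path/to/folder | gzip -c > folder.pax.gz
# extraction then requires pax as well
gunzip -c folder.pax.gz | pax -r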
My concern is what method you will use to transfer such a file afterwards... If I understand you correctly, we might end up with a single file of (X)XX GB; do you intend to send this over the network?
The last option not mentioned yet is using fbackup... my preference would be to follow Peasant's solution, as I no longer have an HP-UX system to test on...
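If you do want to try fbackup, here is a rough sketch (flags from memory of fbackup(1M)/frecover(1M); verify them on your system, and the archive path is a placeholder):
# back up the folder into a single archive file
fbackup -f /path/to/folder.fb -i /path/to/folder
# restore it with frecover
frecover -x -f /path/to/folder.fb -i /path/to/folder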
Remember also that you can use pipes... between servers... so you can archive on one side and unarchive on the remote host.
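A sketch of that, assuming ssh access to the remote host (user, host name, and destination path are placeholders):
# stream the archive across the network and unpack it remotely, never storing the huge file locally
pax -w /path/to/folder | gzip -c | ssh user@remotehost 'cd /destination && gunzip -c | pax -r'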