HP-UX: How to zip/compress files in a folder?

Environment: HP-UX operating system B.11.31 U ia64

I have a folder with around 2000 files, some of which are larger than 8 GB. I need to compress all the files in the folder into a single file so that I can transfer it.

I tried the tar command, but it fails when a file is larger than 8 GB.

Please advise an efficient way of compressing/zipping all 2000 files in this folder.

TAR does not compress. It creates archives.

You will need to run gzip on all the files first, then use pax to create one big archive, since HP-UX tar will not support reading archives greater than 8 GB (but I think it will create one, though :) ).
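
A minimal sketch of that two-step approach, assuming the folder is /path/to/folder (note that gzip replaces each file with a .gz copy in place):

gzip /path/to/folder/*
pax -wf folder.pax /path/to/folder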

An alternative to the above is to install GNU tar on the HP-UX machine and use it directly with compression (the -z switch).
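
For example, assuming GNU tar is installed as gtar (the name and path may differ on your system):

gtar -czf folder.tar.gz /path/to/folder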

Hope that helps.

Regards
Peasant.

It is usually better to tar first and then gzip:
Example:

tar cf - /path/to/folder | gzip -c > folder.tar.gz

Or create one on a remote system

tar cf - /path/to/folder | gzip -c | ssh remote_system "cat > /destination/folder.tar.gz"

It is also possible to store a relative path (easier to relocate during extraction):

cd /path/to && tar cf - folder | ...

or

cd /path/to/folder && tar cf - . | ...

Thanks for the reply.

But I cannot use tar, as it skips files that are greater than 8 GB.

Please suggest something other than tar.

Maybe cpio does not have this limitation?

cd /path/to/folder && find . -print | cpio -o | gzip -c > folder.cpio.gz

Or to a remote system

cd /path/to/folder && find . -print | cpio -o | gzip -c | ssh remote_system "cat > /destination/folder.cpio.gz"

To unpack a .cpio.gz file in the current directory, run

gunzip -c /destination/folder.cpio.gz | cpio -idm


Another alternative is pax, whose default output format is compatible with tar:

cd /path/to/folder && pax -w . | gzip -c > folder.tar.gz

To a remote system

cd /path/to/folder && pax -w . | gzip -c | ssh remote_system "cat > /destination/folder.tar.gz"

We can't use the cpio command, because it is limited to 2 GB, as the man page says:

Because of industry standards and interoperability goals, cpio does
not support the archival of files larger than 2 GB or files that have
user/group IDs greater than 60 K. Files with user/group IDs greater
than 60 K are archived and restored under the user/group ID of the
current process.


If you have multiple files of the same format, say CSV, you can change into the directory and issue the command below:

zip -r zip_file_name *.csv

This will zip all the CSV files into an archive named zip_file_name.zip.

So tar is limited (8 GB), and cpio is even more limited (2 GB).
It's a pity that even the latest HP-UX does nothing about it.
Then the remaining options are

pax -w /path/to/folder | gzip -c > folder.tar.gz 

View the contents with

gunzip -c folder.tar.gz | tar tf -

(Extract with tar xf - or pax -r)
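
Spelled out, the full extraction pipeline would be:

gunzip -c folder.tar.gz | tar xf -
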
Or use pax's own format (extractable only with pax):

pax -w -x pax /path/to/folder | gzip -c > folder.pax.gz

View the contents with

gunzip -c folder.pax.gz | pax

(Extract with pax -r)
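
Likewise for the pax format:

gunzip -c folder.pax.gz | pax -r
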
Or install zip and unzip (note: files over 4 GB need a zip build with Zip64 support, e.g. Info-ZIP zip 3.0 or later)

zip -r zipfile /path/to/folder

View the contents with

unzip -l zipfile

(Extract with just unzip)

My concern is what method you will use to transfer such a file afterwards... If I understand you correctly, we might end up with a single file of (X)XX GB; do you intend to send this over the network?
The last option not mentioned yet is fbackup... My preference would be to follow Peasant's solution... as I no longer have an HP-UX system to test on...

Remember also that you can use pipes between servers, so you can archive on one side and unarchive on the remote host.
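
For example, a sketch that archives on one side and unpacks directly on the remote host (host name and paths are placeholders):

pax -w /path/to/folder | ssh remote_system "cd /destination && pax -r"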