Hi,
I have a process that creates files and gzips them all.
The next day, I need to get the original (pre-compression) sizes of all the gzipped files. Is there any way I can get the original file sizes of gzipped files?
Gunzipping the files, getting the file sizes, and gzipping them again is not the approach I am looking for.
I appreciate your responses.
Thanks in advance.
17 seconds of grubbing around in man gunzip turned up gunzip -l.
This even appears to work with streams, much to my surprise.
If you use the -l switch, it lists the files with both their compressed and uncompressed sizes (in bytes):

$ gzip -l file.txt.gz
         compressed        uncompressed  ratio uncompressed_name
                 81                  68  20.6% file.txt
Hope this helps.
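Since the question is about getting the sizes of all the gzipped files, here is a small sketch that extends this to every .gz file in a directory. The sample files and names (a.txt, b.txt) are hypothetical, just to make the example self-contained:

```shell
# Create two sample files of known size, then gzip them.
printf '%1000s' ' ' > a.txt        # 1000 bytes of spaces
printf '%2500s' ' ' > b.txt        # 2500 bytes of spaces
gzip -f a.txt b.txt                # produces a.txt.gz and b.txt.gz

# gzip -l output: line 1 is the header, line 2 holds the numbers;
# field 2 is the uncompressed (original) size in bytes.
for f in *.gz; do
    gzip -l "$f" | awk -v name="$f" 'NR==2 { printf "%s\t%d bytes\n", name, $2 }'
done
```

The loop runs gzip -l once per file; for a very large number of files you could pass them all to a single gzip -l invocation instead, which also prints a totals line.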
17 seconds, damn, took me 20... coffee hasn't kicked in yet.
Are you sure, Jim?
$ cat vectortop.png | gzip > vectortop.png.gz
$ gunzip -l vectortop.png.gz
         compressed        uncompressed  ratio uncompressed_name
               4360                4337  -0.1% vectortop.png
$
The 'cat' is to make sure it's not doing anything sneaky like running fstat on /dev/stdin.
Yet again, working on streams:

$ gunzip -l < vectortop.png.gz
         compressed        uncompressed  ratio uncompressed_name
               4360                4337  -0.1% stdout
The limit seems to be that the compressed input has to be a regular file, not a stream, for the sizes to be known. gzip stores the uncompressed size in the last four bytes of the file, so gzip -l has to seek to the end to read it, which is impossible on a pipe:
$ cat vectortop.png.gz | gunzip -l
         compressed        uncompressed  ratio uncompressed_name
                 -1                  -1   0.0% stdout
$
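That trailing size field is the ISIZE trailer defined by the gzip format: the last four bytes hold the uncompressed size, little-endian, modulo 2^32 (so it wraps for inputs of 4 GiB and up). A sketch of reading it directly, assuming a little-endian host and a hypothetical sample file:

```shell
# Build a file of known size and compress it.
printf 'hello world\n' > sample.txt      # 12 bytes
gzip -f sample.txt                       # produces sample.txt.gz

# Read the last 4 bytes as an unsigned 32-bit integer.
# od -An suppresses the address column; -tu4 prints unsigned decimal.
tail -c4 sample.txt.gz | od -An -tu4 | tr -d ' '
```

This only matches on little-endian machines, because od uses host byte order; gunzip -l is the portable way to get the same number.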
The solution worked. It displays the sizes in bytes.
How can I convert these sizes into MB?
Divide by one million (or by 1024*1024 if you want MiB):

gunzip -l whatever.gz | awk 'NR==2 { printf "%.1f MB\n", $2/(1000*1000); exit }'
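A self-contained sketch of the same conversion, using a hypothetical 500,000-byte sample file so the arithmetic is easy to check:

```shell
# Make a file of exactly 500,000 bytes and compress it.
printf '%500000s' ' ' > big.txt
gzip -f big.txt                          # produces big.txt.gz

# Field 2 on line 2 of gzip -l output is the uncompressed size in bytes;
# divide by 1000*1000 for decimal MB (use 1024*1024 for MiB instead).
gzip -l big.txt.gz | awk 'NR==2 { printf "%.2f MB\n", $2/(1000*1000) }'
```

For 500,000 bytes this prints 0.50 MB; swapping in 1024*1024 would report binary mebibytes, which is what tools like ls -lh show.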