What're some useful commands for gzipping logs in Linux?

Gzip all the logs in the current directory, replacing each one with its own .gz file:

gzip *
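
Note that gzip replaces each original file with a compressed copy named after it (access.log becomes access.log.gz). If you'd rather keep the originals too, recent versions of gzip have a -k (keep) flag, so something like this should do it:

gzip -k *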

Gzip all the logs in the current directory except the one named application.log:

find . -maxdepth 1 -mindepth 1 ! -name 'application.log'  -exec gzip {} \;

This also compresses each file into its own .gz file.
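
If the directory contains things other than logs, you can narrow the match to *.log files only. Something along these lines should work (adjust the pattern to suit):

find . -maxdepth 1 -type f -name '*.log' ! -name 'application.log' -exec gzip {} \;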

I want more, man.

Hello,

Can you provide a bit more information on what problem, precisely, you're trying to solve that isn't covered by what you've already posted? Fundamentally, gzip is the command used to create files compressed with the gzip algorithm, and it's the command that will always be used for that purpose. The find command can certainly be used to compress whichever files you like, depending on the exact criteria you need covered.
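
For example, if the goal is to compress only logs older than a certain age rather than everything at once, find's -mtime test covers that. The path and the seven-day cutoff here are just placeholders:

find /var/log -type f -name '*.log' -mtime +7 -exec gzip {} \;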


I just want to learn about commands in your toolbox, that's all.

gzip is pretty much a one-trick pony: it consumes CPU to minimise file size without data loss, and more CPU to recover the original data.
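
A typical round trip looks like this (the file name is just an example):

gzip access.log              # produces access.log.gz and removes access.log
zcat access.log.gz | less    # read it without decompressing on disk
gunzip access.log.gz         # restore the original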

The next level up is the logrotate subsystem. This compresses and renames logfiles so they do not grow forever, keeps a series of them (numbered), and removes them when they are obsolete. It can also mail them. It is fairly configurable.
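
A minimal logrotate snippet (dropped into /etc/logrotate.d/) might look like the following. The path, the weekly schedule and the retention count are only examples, not anything your system necessarily uses:

/var/log/application.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}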

logrotate is normally scheduled using crontab, so you don't even need to remember to do anything regularly.
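
On most distributions a daily cron job (or systemd timer) for logrotate is already installed, so there is usually nothing to add. If you did want to schedule it yourself, a crontab entry would look roughly like this; the binary path can vary between distros:

0 3 * * * /usr/sbin/logrotate /etc/logrotate.conf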

Typically, gzip is also used to compress archives. It is not always helpful: some file formats (images, PDFs, Excel workbooks) already have compression built in, which leaves very little redundancy for gzip to leverage.
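
The usual pattern for archives is to let tar call gzip for you, for instance to bundle and compress all the logs in one go (the archive name is arbitrary):

tar -czf logs.tar.gz *.log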

