Compress and log files from October 6th in a loop

Good evening, I need your help please.

There are about 10,000 files from October 6th that I need to compress, but when I use this command nothing happens: the prompt gives no response and I have to press CTRL+C to get back to the shell.

for file in $(ls -l | grep "Impres" | grep "Oct  6"|grep -v .gz | awk '{print $9}');
do
 gzip $file `
done

But if I list them, there are indeed files from October 6th:

CEL /produccion/explotacion/xpfactur/facturacion/log/ImpresionScl/nociclo # ls -l | grep "Impres" | grep "Oct  6" | grep -v .gz | awk '{print $9}'
ImpresionScl_11715951.log
ImpresionScl_11762413.log
ImpresionScl_11762472.log
ImpresionScl_11763290.log
ImpresionScl_11763291.log
ImpresionScl_11763292.log
ImpresionScl_11763294.log
ImpresionScl_11797566.log
ImpresionScl_11864274.log

There must be a syntax error or something?
The OS is SunOS

I appreciate your help in advance.

------ Post updated at 09:44 PM ------

I found the problem, but now I face this prompt when a file would be overwritten:

already exists; do you wish to overwrite (y or n)?   not overwritten

Looks like the files in your current working directory are not unique.
Is this true?

Do you have both (at one moment in time, just an example):

ImpresionScl_11763290.log      # the Oct 6 file you wish to compress
ImpresionScl_11763290.log.gz   # an older date than Oct 6

in the working directory when trying to compress?
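That is exactly the situation where gzip stops to ask; without -f it will not overwrite an existing .gz. For example, with both files above present:

$ gzip ImpresionScl_11763290.log
gzip: ImpresionScl_11763290.log.gz already exists; do you wish to overwrite (y or n)?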

The initial problem you found is at the end of the following line:

gzip $file `
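That trailing backtick opens a command substitution, so the shell just sits at its secondary prompt (PS2) waiting for the closing backtick; that is why nothing seemed to happen until you pressed CTRL+C. You can reproduce it with any command (somefile is just a placeholder name here):

$ gzip somefile `
>
^C
$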

If you found the answer yourself, it is always nice to point it out and describe it, so others may benefit.

Hope that helps
Regards
Peasant.


Yes, thank you, it was a silly mistake I made; the initial problem was:

gzip $file `

Yes, they are not unique; there are some from September.

But I neither want to overwrite them nor be asked about overwriting, so I implemented this to skip a file when its .gz already exists and to log the gzipped files:

for file in $(ls -l | grep "Impres" | grep "Oct  6" | grep -v .gz | awk '{print $9}')
do
  if [ -f "$file.gz" ]; then
    echo "Skipped $file" >> output.log
  else
    echo "compressing file $file" >> output.log
    gzip "$file"
  fi
done
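With the test on $file.gz the loop never reaches gzip's overwrite prompt, and output.log records both the skipped and the compressed names.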

But it is kind of a hassle, because there are nearly 10,000 files to gzip and log.

So is there any better approach to enhance this script and make it run faster?

I appreciate your help in advance.

Please check this older thread:

So that I do not rewrite it from scratch: it seems Scrutinizer already wrote some good code to run parallel gzips, with examples and multiple scenarios.
Of course, it is not one size fits all; you will have to adjust it a bit to meet your needs.
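The rough idea, sketched here only as an illustration (this is not Scrutinizer's exact code, and the chunk size of 2500 is an arbitrary example): write the file names to a list, split the list into chunks, and run one background gzip stream per chunk:

ls -l | grep "Impres" | grep "Oct  6" | grep -v .gz | awk '{print $9}' > /tmp/filelist
split -l 2500 /tmp/filelist /tmp/chunk.      # ~4 chunks for ~10000 files
for list in /tmp/chunk.*
do
    xargs gzip < "$list" &                   # one background gzip stream per chunk
done
wait                                         # return only when every chunk is done

Going much beyond the number of CPUs rarely gains anything, since gzip is CPU-bound.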

Post here if you get stuck with it or have any questions.

Regards
Peasant.

Personally, I would avoid reading the output of ls -l because you could get false positives too easily. Maybe another approach might be better:-

touch -mt 201810060000 /tmp/start_point    # reference file stamped start of Oct 6 (YYYYMMDDhhmm)
touch -mt 201810070000 /tmp/end_point      # reference file stamped start of Oct 7

echo "$(date) : Starting"

for file in $(find . -type f -newer /tmp/start_point ! -newer /tmp/end_point ! -name "*.gz")
do
   echo "$(date) : Compressing ${file}"
   echo gzip ${file}
done
echo "$(date) : Complete"

Remove the echo in front of gzip if you are happy that it should work.
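If you still want the output.log from your earlier attempt, save the script under some name (compress_oct6.sh here is just an example) and capture all of its progress messages in one go:

sh compress_oct6.sh >> output.log 2>&1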

Does that help?
Robin