I want to search for a certain string in thousands of files that are distributed across directories created daily. For that I wrote a small bash script, but while running it I get the below error:
#!/usr/local/bin/bash
for i in `cat msisdn_u.txt`
do
cd /comptel4/elink/backup1/output/vas/NG0/20130301
find ./*GPX.Z|xargs zcat|grep $i; cd ..
cd /comptel4/elink/backup1/output/vas/NG0/20130302
find ./*GPX.Z|xargs zcat|grep $i; cd ..
cd /comptel4/elink/backup1/output/vas/NG0/20130303
find ./*GPX.Z|xargs zcat|grep $i; cd ..
cd /comptel4/elink/backup1/output/vas/NG0/20130304
find ./*GPX.Z|xargs zcat|grep $i; cd ..
cd /comptel4/elink/backup1/output/vas/NG0/20130305
find ./*GPX.Z|xargs zcat|grep $i; cd ..
cd /comptel4/elink/backup1/output/vas/NG0/20130306
find ./*GPX.Z|xargs zcat|grep $i; cd ..
cd /comptel4/elink/backup1/output/vas/NG0/20130307
find ./*GPX.Z|xargs zcat|grep $i; cd ..
..
..
..
done
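One way the repeated blocks above could be restructured (a sketch, not a drop-in replacement: the real `/comptel4/elink/...` tree and `msisdn_u.txt` contents are assumed, so a throwaway directory stands in here) is to read the search strings one per line and let a single `find` walk all the dated directories:

```shell
#!/usr/bin/env bash
# Hypothetical stand-in for /comptel4/elink/backup1/output/vas/NG0
base=$(mktemp -d)
mkdir -p "$base/20130301" "$base/20130302"
printf '2348012345678,CALL\n' | gzip -c > "$base/20130301/a_GPX.Z"
printf '2347098765432,SMS\n'  | gzip -c > "$base/20130302/b_GPX.Z"
printf '2348012345678\n' > "$base/msisdn_u.txt"

# Read one search string per iteration instead of expanding the whole
# file with a command substitution; memory use stays constant.
found=$(
  while IFS= read -r msisdn; do
    # One find walks every dated directory; -exec batches the
    # compressed files into decompression calls.
    find "$base"/2013* -name '*GPX.Z' -exec gzip -dc {} + | grep -- "$msisdn"
  done < "$base/msisdn_u.txt"
)
echo "$found"
rm -rf "$base"
```

Note the loop never has to `cd` anywhere, and `gzip -dc` is used instead of `zcat` only because on some systems `zcat` is the old `compress` tool that insists on a real `.Z` suffix.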
Thanks for your suggestions, PikK45, but is the command actually descending into the directories? I don't see any output; the command just returns to the prompt.
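The "is it descending" question comes down to who does the matching. In `find ./*GPX.Z` the shell expands the glob first, so `find` only receives names that already matched in the current directory; with `-name`, `find` itself walks the whole tree. A small self-contained demo (the layout is a stand-in, not the real backup tree):

```shell
#!/usr/bin/env bash
base=$(mktemp -d)
mkdir -p "$base/sub"
: > "$base/top_GPX.Z"
: > "$base/sub/deep_GPX.Z"
cd "$base" || exit 1

# Shell glob: expanded before find runs, so find only receives the
# top-level match (and errors out if nothing matches at all).
glob_hits=$(find ./*GPX.Z | wc -l)

# -name: find walks the tree itself, so it also sees sub/deep_GPX.Z.
find_hits=$(find . -name '*GPX.Z' | wc -l)

echo "glob=$glob_hits find=$find_hits"
cd / && rm -rf "$base"
```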
You will continue to trigger that error as long as the contents of the text file in the highlighted command substitution exceed your system's available memory (or the process's memory limit).
Thanks especially to PikK45 for helping me all along. Please let me know why running the while loop did not cause the memory error.
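The reason the while loop avoids the memory error: `for i in $(cat file)` forces the shell to expand the entire file into a word list in memory before the first iteration even starts, while `read` pulls one line per iteration, so memory use does not grow with the file. A minimal side-by-side sketch (the file name is hypothetical; any text file works):

```shell
#!/usr/bin/env bash
list=$(mktemp)
printf 'alpha\nbeta\ngamma\n' > "$list"

# Command substitution: the WHOLE file is expanded into memory up
# front, which is what blows up with a very large list.
for_out=$(for i in $(cat "$list"); do echo "$i"; done)

# read consumes one line at a time; a huge list that breaks the
# for-loop still works here.
while_out=$(while IFS= read -r i; do echo "$i"; done < "$list")

echo "$for_out"
rm -f "$list"
```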
Yes, Alister, Radoulov's suggestion did not work. And yes, thank you too, radoulov.