I have a script that processes about 1000 files one by one on every run, just to check whether each file's corresponding output file exists at a given path. Because of this the script runs for a long time and consumes a lot of CPU.
Is there any way to check at least 100 files in a single shot instead of one at a time in a simple loop, and how would that affect CPU consumption?
cat "$Home" | while read -r line
do
    # Field 4 of each line holds the base file name
    File=$(echo "$line" | awk '{print $4}')
    if [[ -f "$File.SP" ]]
    then
        echo "File $File is Correct"
    else
        echo "File $File is Missed"
    fi
done
I am checking the files received in the last 24 hours (almost 1000 files) on my system to verify that each one has its corresponding ".SP" file generated; if not, it is a MISS.
At the moment this checks the 1000 files one by one. Is there a common way to run this check for more than one file at a time, to minimize the run time of the script?
I mean, can we run multiple checks in parallel, like threading?
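To illustrate the kind of parallel check I have in mind: something like the sketch below, which assumes GNU xargs with the `-P` option is available. Here `filelist.txt` stands in for whatever file `$Home` points to, and field 4 is still taken as the base name. The single `awk` pass also avoids forking `awk` once per line, which is where much of the loop's CPU cost goes.

```shell
LIST=filelist.txt   # hypothetical name for the list file read via $Home above

# Extract field 4 once with awk, then check up to 8 files concurrently.
awk '{print $4}' "$LIST" |
  xargs -P 8 -I {} sh -c '
    if [ -f "$1.SP" ]; then
      echo "File $1 is Correct"
    else
      echo "File $1 is Missed"
    fi
  ' sh {}
```

Note that parallelism mainly hides I/O latency; it will not reduce total CPU work, and `-P 8` can briefly raise CPU usage while finishing sooner.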