I am writing a script in which I need to gather 2 numbers for 'total' and 'successful'. The goal is to compare the two numbers and if they are not equal, rerun the task until all are successful. I'm thinking the best way will be with awk or sed, but I really don't know where to begin with this one.
The line output by the task is a very long single line with no spaces. However, the beginning of the line appears to be consistent. Here is the output from my 4 manual runs:
Does the task not use an exit code? You can test for it until it is successful:
until task
do
:
done
This will run indefinitely if task never succeeds.
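To see the retry loop in action, here is a minimal, self-contained sketch; the `task` function below is a hypothetical stand-in that fails twice before succeeding:

```shell
attempts=0
task() {                      # hypothetical stand-in for the real task
    attempts=$((attempts + 1))
    [ "$attempts" -ge 3 ]     # exit status 0 (success) on the third call
}

until task
do
    :    # a real script would usually sleep here to avoid a busy loop
done
echo "succeeded after $attempts attempts"   # prints: succeeded after 3 attempts
```

Because `task` runs in the current shell (no pipeline or command substitution around it), the `attempts` counter persists between calls.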
Alternatively (bash/ksh) you can limit the number of attempts and report on them.
for i in {1..10}
do
if task
then
echo "successful after ${i} attempt(s)"
break
fi
if (( i >= 10 )); then
echo "Max number of attempts exceeded; task was unsuccessful"
exit 1
fi
done
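Run against a hypothetical `task` that only succeeds on its fourth call, the bounded loop reports the attempt count and stops:

```shell
n=0
task() { n=$((n + 1)); [ "$n" -ge 4 ]; }   # hypothetical: succeeds on call 4

for i in {1..10}
do
    if task
    then
        echo "successful after ${i} attempt(s)"   # prints: successful after 4 attempt(s)
        break
    fi
    if (( i >= 10 )); then
        echo "Max number of attempts exceeded; task was unsuccessful"
        exit 1
    fi
done
```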
--
If there is no return code and testing the output is the only option, then I suggest testing the number of fails:
until [[ $result =~ ^\{\"_shards\":\{\"total\":[0-9]+,\"successful\":[0-9]+,\"failed\":0\} ]]
do
result=$(task)
done
Likewise, you can use
result=$(task)
if [[ $result =~ ^\{\"_shards\":\{\"total\":[0-9]+,\"successful\":[0-9]+,\"failed\":0\} ]]
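To sanity-check the regex against a sample `_shards` line (the counts below are made up), you can test it directly:

```shell
# Sample output line in the _shards format (counts are made up):
result='{"_shards":{"total":10,"successful":10,"failed":0},"books":{"total":5,"successful":5,"failed":0}}'

if [[ $result =~ ^\{\"_shards\":\{\"total\":[0-9]+,\"successful\":[0-9]+,\"failed\":0\} ]]
then
    echo "no failed shards"    # this branch is taken for the sample line
else
    echo "some shards failed"
fi
```

Note the regex is anchored to the leading global `_shards` object, so per-index counts later in the line don't affect the match.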
If task doesn't return a useful exit code (i.e., always returns a zero exit status), one could still skip parsing counts and just loop until success is found:
while line=$(task)
do
    if [ "$line" = "${line%:0\}}" ]
    then
        echo "One or more tests failed: $line"
    else
        echo "All tests passed: $line"
        break
    fi
done
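The `${line%:0\}}` expansion strips the suffix `:0}` if present, so the comparison tells you whether the line ended in a zero failed count. A quick check with made-up lines:

```shell
pass='{"_shards":{"total":10,"successful":10,"failed":0}'   # made-up counts
fail='{"_shards":{"total":10,"successful":8,"failed":2}'

for line in "$pass" "$fail"
do
    if [ "$line" = "${line%:0\}}" ]     # unchanged => suffix :0} was absent
    then
        echo "One or more tests failed: $line"
    else
        echo "All tests passed: $line"
    fi
done
```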
until [[ $(synced_flush) == "0" ]]
do
synced_flush
done
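One small refinement: that loop runs `synced_flush` twice per iteration, once in the test and once in the body. Capturing the output inside the test avoids the extra call; the stub below is a hypothetical stand-in so the sketch runs on its own:

```shell
flag=$(mktemp -u)            # state file for the hypothetical stub
synced_flush() {             # stub: reports 3 failures on the first call, 0 afterwards
    if [ -e "$flag" ]; then echo 0; else touch "$flag"; echo 3; fi
}

until result=$(synced_flush); [ "$result" = "0" ]
do
    :    # a real script might sleep between retries
done
echo "failed count: $result"   # prints: failed count: 0
rm -f "$flag"
```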
This appears to be doing the trick. If there is something really wrong with this approach please let me know. Like most of the scripts I write, they start off 'working' and I improve the efficiency with time ...
Thanks again scrutinizer, Don Cragun, and RudiC. Much appreciated.
Not "really wrong", but you can optimise a bit. Change:
$curl -s -u ${creds} -X POST "localhost:9200/_flush/synced" | \
grep -Eo "shards[^}]*failed\":[0-9]*" | \
sed -e 's/.*://'
to
$curl -s -u ${creds} -X POST "localhost:9200/_flush/synced" | \
sed -n ' /^{"_shards.*failed":[0-9]/ {
s/^{"_shards":{[^}]*"failed"://
s/\([0-9]*\).*/\1/p
}'
as sed can do everything grep can do, so you only need one of them. (Note the first substitution is anchored to the leading `_shards` object so that a greedy `.*` doesn't skip ahead to a per-index failed count.) Usually this doesn't make a big difference, but if the combination is called many times over, the one process saved adds up to considerably less time needed.
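Since awk was mentioned in the original question: the same single-process extraction can also be sketched in awk, splitting on the literal `"failed":` and keeping the leading digits of the second field (the sample line uses made-up counts):

```shell
line='{"_shards":{"total":10,"successful":10,"failed":0},"books":{"total":5,"successful":5,"failed":0}}'

# $2 starts right after the first "failed": (the global _shards one);
# sub() trims everything from the first non-digit, leaving just the count.
printf '%s\n' "$line" |
awk -F'"failed":' '{ sub(/[^0-9].*/, "", $2); print $2; exit }'
# prints: 0
```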