I have a log file with 5K lines in it. I need to send only 200 lines at a time to an application, and delete those 200 lines from the log file after they have been fed to the application.
The script should keep running until all 5K lines have been processed.
If the log file might still grow while you are handling it, doing this from a shell script is rather dangerous. From Perl, Python, or C, you could use file locking primitives to coordinate with any processes that might be writing to the file concurrently.
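If you want to stay in the shell, the same coordination can be sketched with `flock(1)` from util-linux. This is only an illustration under assumptions: the lock path, the log name `file`, and the batch file `batch.txt` are placeholders, and every writer must agree to take the same lock before appending.

```shell
# Sketch: serialize writers and the consumer with flock(1) (util-linux).
# Paths here (./log.lock, file, batch.txt) are hypothetical placeholders.
LOCK=./log.lock

# A writer appends only while holding the lock:
flock "$LOCK" -c 'echo "new log line" >> file'

# The consumer takes the same lock around the whole read-and-truncate
# step, so no append can land between the head and the sed:
flock "$LOCK" -c 'head -n 200 file > batch.txt &&
    sed 1,200d file > file.tmp && mv -f file.tmp file'
```

The point is that the read and the delete happen atomically with respect to the writers; without the lock, a line appended between the `head` and the `sed` would be silently deleted without ever being processed.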
Having said that, consider this (lack of) proof of concept.
while true; do
    if head -n 200 file | xargs application; then
        # The batch was accepted; drop those 200 lines from the log.
        sed 1,200d file >file.tmp
        mv -f file.tmp file
        test -s file || break   # quit the loop once the file is empty
    else
        echo "application failed ($?); not removing first 200 lines -- retrying in a while" >&2
    fi
    sleep 600   # pause between batches (and before retries); tune to taste
done
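To see the batching logic in isolation, here is a self-contained dry run of the same loop. The `sleep` and the failure branch are dropped, and the "application" is simulated by appending each batch to `processed.txt` (both names are mine, not part of the original script); the log must not be growing during the run.

```shell
#!/bin/sh
# Dry run of the batch-and-truncate loop with a stand-in "application".
# Assumption: nothing appends to "file" while this runs.
seq 1000 > file                        # a hypothetical 1000-line log
while [ -s file ]; do
    # "Feed" the next 200 lines by appending them to processed.txt:
    if head -n 200 file | tee -a processed.txt > /dev/null; then
        sed 1,200d file > file.tmp     # drop the batch just handled
        mv -f file.tmp file
    fi
done
wc -l < processed.txt                  # every line is accounted for
```

After five iterations `file` is empty and `processed.txt` holds all 1000 lines, which is exactly the invariant the real loop relies on: lines are removed from the log only after the application has consumed them.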