For loop until all files are downloaded

I need help writing a for loop script, or any other looping method. Below is the code I use to get all the files from the URL. There are about 80 files at that URL, and they get updated every day. The script I want is a loop that keeps running until it has all 80 files: it matches the count and, if the count is not right, reruns the loop to get the missing files. I do not want to put it in crontab and run it every 30 seconds.

for i in $(cat filename.txt)
do
    wget -m --append-output=output.log -np -p --no-directories -b --directory-prefix=<path> http://sample.com/gfs.20131113/$i
done
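In other words, roughly this behaviour. This is only a sketch: `fetch` is a stub (a local cp) standing in for the wget call above so it runs without network access, and the /tmp paths and demo file names are made up for illustration.

```shell
#!/bin/sh
# Sketch: re-run the download pass until the destination directory
# holds every file listed in filename.txt.
SRC=/tmp/gfs_src        # stands in for http://sample.com/gfs.20131113
DEST=/tmp/gfs_dest      # the --directory-prefix directory
LIST=/tmp/filename.txt

# Demo data: pretend the server has the five expected files.
rm -rf "$DEST"
mkdir -p "$SRC" "$DEST"
for i in 1 2 3 4 5; do echo "data$i" > "$SRC/file$i"; done
printf 'file1\nfile2\nfile3\nfile4\nfile5\n' > "$LIST"
EXPECTED=$(wc -l < "$LIST")

fetch() {               # replace the body with the wget command above
    cp "$SRC/$1" "$DEST/$1" 2>/dev/null
}

# Keep looping until the count of downloaded files matches the list.
while [ "$(ls "$DEST" | wc -l)" -lt "$EXPECTED" ]; do
    while read -r f; do
        [ -e "$DEST/$f" ] || fetch "$f"
    done < "$LIST"
done
echo "have $(ls "$DEST" | wc -l) of $EXPECTED files"
```

Checking for an existing file before fetching means repeated passes only re-request the names that are still missing.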

Why 80? 80 of what? You're downloading recursively; do sub-downloads count towards the 80?

wget has retry options of its own, which may make things a great deal simpler.

There are only 80 files at that link. Some files arrive early and some late, so I want to keep looping until I have all of them.

Can you tell me how I can get the sub-downloads?

I guess you've got 80 file names in your filename.txt. Why don't you create a temp file, append every file name that wget fails on, and, with a while loop around your for loop above, retry reading that ever-shrinking temp file? Like:

cp filename.txt tempfile
while [ -s tempfile ]            # keep going while the list is non-empty
  do
  : > tempfile2                  # start each pass with an empty retry list
  while read -r FILE
    do wget "$FILE" || echo "$FILE" >> tempfile2
    done < tempfile
  cp tempfile2 tempfile
  done

Why not use wget's retry options? -t 20 tries up to 20 times before giving up.

Otherwise you're stuck with trying to parse an HTML file and figure out which things wget missed.
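For comparison, the retry idea as a tiny generic shell loop. This is a sketch only: `flaky` is a stub standing in for a wget call (it fails on its first two attempts and succeeds on the third), and the /tmp log path and the limit of 20 are arbitrary.

```shell
#!/bin/sh
# Same idea as wget -t 20, written as a generic shell retry loop.
# `flaky` stands in for a wget call: it fails twice, then succeeds.
log=/tmp/retry_demo.log
: > "$log"

flaky() {
    echo attempt >> "$log"
    [ "$(wc -l < "$log")" -ge 3 ]   # succeed only from the third try on
}

n=0
until flaky; do
    n=$((n + 1))
    if [ "$n" -ge 20 ]; then        # like -t 20: give up after 20 tries
        echo "giving up" >&2
        exit 1
    fi
done
echo "succeeded after $(wc -l < "$log") attempts"
```

With a real wget call in place of the stub, though, letting wget's own -t option do the retrying is simpler than wrapping it like this.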

I will try it and will surely let you know. Thanks.