Querying a website in random bunches, simultaneously

I have a script that hits my website for stock quotes and pulls prices. Currently it does this one symbol at a time until all the symbols have been pulled. I've worked around bandwidth issues by adding a random 1-10 second sleep between requests, but this takes too long.

I know that I can fire everything off at once by adding & at the end of the command, but that would exhaust my website's resources.

Is there a way to hit my site with a random batch of 5-10 symbols pulled all at once, every 1-10 seconds?

while IFS= read -r line
do
    array+=("$line")
done < "../stocks/aaa-stockuniverse.txt"
# this reads all the tickers into an array


for ((i=0; i < ${#array[@]}; i++))
do
    # ${array[i]} (not ${array}, which is only element 0) selects the current symbol
    eval "$(curl -s "http://www.website.com/api?stock=${array[i]}" | sed 's/</\n</g' | sed '/data=/!d; s/ data=/=/g; s/\/>/; /g; s/</GF_/g' | tee ~/stocks/"${array[i]}".txt)"
    echo "${array[i]},$(date +%Y-%m-%d),$GF_open,$GF_high,$GF_low,$GF_last,$GF_volume"
    sleep $(( RANDOM % 10 + 1 ))
done
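A minimal sketch of the two random draws I'm after, using bash's $RANDOM (the 5-10 and 1-10 ranges are the ones stated above):

```shell
# Draw a random batch size in 5..10 and a random pause in 1..10 seconds.
# RANDOM % 6 yields 0..5, so adding 5 gives 5..10; likewise for the pause.
batch=$(( RANDOM % 6 + 5 ))    # symbols to pull in this burst
pause=$(( RANDOM % 10 + 1 ))   # seconds to sleep before the next burst
echo "next burst: $batch symbols, then sleep ${pause}s"
```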

Why random?

Let's assume your site can handle 10 requests at once (you pick the number; 10 is just a choice).
You want parallelism, and this does exactly that.
Use this as a template:

cnt=0
for ((i=0; i < ${#array[@]}; i++))
do
    (
        eval "$(curl -s "http://www.website.com/api?stock=${array[i]}" | sed 's/</\n</g' |
                sed '/data=/!d; s/ data=/=/g; s/\/>/; /g; s/</GF_/g' |
                tee ~/stocks/"${array[i]}".txt)"
        echo "${array[i]},$(date +%Y-%m-%d),$GF_open,$GF_high,$GF_low,$GF_last,$GF_volume" >> somefile
    ) &

    cnt=$(( cnt + 1 ))

    # after every 10th background job, wait for the whole batch to finish
    [ $(( cnt % 10 )) -eq 0 ] && wait
done
wait   # this one is important: it catches the final, possibly partial batch

I altered the code to stop writing to the terminal, because the & places each child process in the background. The loop fires off 10 children, waits until that bunch is done, then starts the next bunch of 10.

This is an efficient, simple way to parallelize what you are doing.
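As an aside, the hand-rolled counter can also be replaced by letting xargs manage the worker pool. This is a sketch only: -P is a GNU/BSD extension rather than strict POSIX, and the printf stands in for the real curl pipeline.

```shell
# xargs -P caps concurrent processes at 4; -n 1 hands one symbol per
# invocation. Each symbol arrives as $0 of the sh -c script. The sort
# makes the otherwise nondeterministic completion order reproducible.
out=$(printf '%s\n' AAPL MSFT GOOG IBM ORCL |
      xargs -P 4 -n 1 sh -c 'printf "fetched %s\n" "$0"' |
      sort)
echo "$out"
```

Unlike the batch-of-10 loop, xargs starts a new worker as soon as any slot frees up, so slow responses don't stall the whole bunch.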

I am trying to understand your code. A few questions:
1- What is the semantic need for the double ampersand in "&& wait"? Is it a logical AND? If so, what does the second operand, "wait", do?
2- The program writes to somefile inside the parallelized loop. Is this shell-specific? Would it be more efficient/robust to build up the results in the loop and write them all to the file after the loop?
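On question 2, one wrinkle worth knowing: a backgrounded subshell cannot modify an array in its parent, so "building up an array in the loop" won't work directly. A hedged sketch of the nearest equivalent, using one temp file per job merged after the final wait (filenames and the printf stand-in are illustrative only):

```shell
# Each background job writes its own temp file; the parent concatenates
# them once after wait. This avoids interleaved appends to a shared file
# and sidesteps the fact that children can't update parent variables.
tmpdir=$(mktemp -d)
for sym in AAPL MSFT GOOG; do
    ( printf '%s,fetched\n' "$sym" > "$tmpdir/$sym" ) &   # stand-in for the curl pipeline
done
wait
cat "$tmpdir"/* > somefile    # single sequential write
rm -r "$tmpdir"
```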