Well, first off, the script you showed us can't do anything at all, neither slowly nor quickly: you create an array $url inside func2(), but it is private to that function and is no longer known once you leave it.
Second, this method of filling an array is needlessly complicated, function or no function:
func2() {
    index1=0
    while read line ; do
        index1=$(($index1+1))
        url[$index1]=$line
    done < tmp/url1.txt
}
You can easily do it this way:
while read -r line ; do
    url[$((index++))]="$line"
done < tmp/url1.txt
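In fact, with bash 4 or newer you don't need a loop at all: the mapfile builtin reads a whole file into an array in one call. A minimal sketch (the sample URLs are made up for illustration):

```shell
#!/usr/bin/env bash
# mapfile (a.k.a. readarray) slurps the file into the array in one call.
mkdir -p tmp
printf '%s\n' 'http://example.com/a' 'http://example.com/b' > tmp/url1.txt  # sample data
mapfile -t url < tmp/url1.txt      # -t strips the trailing newlines
echo "${#url[@]} URLs loaded"
```

This is also faster than the read loop, since the whole file is processed by a single builtin instead of one read per line.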
Next, your way of iterating over the array can also be improved, not to mention your subprocess handling - BACKTICKS ARE DEPRECATED; use $( ... ) instead.
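For comparison, both forms below give the same result, but $( ... ) nests cleanly without escaping, which is why it is the recommended style:

```shell
#!/usr/bin/env bash
old=`date +%Y`                  # deprecated backtick style
new=$(date +%Y)                 # modern POSIX style, same result
nested=$(dirname "$(pwd)")      # nesting needs no backslash gymnastics
```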
Finally: in func3() you loop over every element of the array. In funcrun() you do the same (the fact that you pass func3() an argument which it then ignores changes nothing), so you run not n curl invocations for n array elements but n squared! What "helps" a little is that the for-loop in funcrun() only covers the first 5 elements of $url[], regardless of how many elements it holds, so you only do 5 times the necessary work instead of n times.
You might want to consolidate this mess first. If it is still "too slow", you might consider this: you could put the curl invocations in the background so that they run in parallel instead of one after the other. Note, though, that whether this succeeds depends on the number of URLs you want to pull: a dozen is probably no problem, a few hundred might be, a few thousand definitely will be. In that case you will need a "fanout" value so that only a certain maximum number of parallel processes run at the same time.
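A rough sketch of such a fanout, using plain background jobs and batch-wise wait. Here sleep stands in for the real curl call, and maxjobs=5 is an assumption you would tune to what the server tolerates:

```shell
#!/usr/bin/env bash
# Fanout sketch: never more than $maxjobs background jobs at once.
url=(a b c d e f g h)                 # placeholder list; yours comes from tmp/url1.txt
maxjobs=5                             # assumption - tune to the target server
count=0
for u in "${url[@]}"; do
    sleep 0.1 &                       # stand-in for: curl -s -o /dev/null "$u" &
    count=$((count+1))
    if [ $((count % maxjobs)) -eq 0 ]; then
        wait                          # batch full: wait for all of them
    fi
done
wait                                  # collect the stragglers
echo "$count requests done"
```

With bash 4.3+ you could replace the batch-wise wait with `wait -n` to refill the pool as soon as any single job finishes, which keeps all slots busy.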
You want to update a million web sites on the internet? No surprise it's taking days...
If I got you wrong, please rephrase your problem and supply more details, like sample input files (not a million lines, though - some 10 to 20).
It is required to send requests to an API via the POST method.
But there are a lot of requests, so they need to be split up and run in multiple processes to optimize throughput.
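For that use case, xargs -P is the simplest way to split a list of requests across a fixed number of worker processes. A sketch; the endpoint and payload shape are assumptions, and echo stands in for the real curl POST so the example runs offline:

```shell
#!/usr/bin/env bash
# Split many POST requests across 4 parallel workers with xargs -P.
printf '%s\n' id1 id2 id3 id4 > ids.txt       # sample input, one request per line
# The real invocation would look something like:
#   xargs -P 4 -n 1 -I{} curl -s -X POST -d 'id={}' https://api.example.com/endpoint < ids.txt
xargs -P 4 -n 1 -I{} echo "POST {}" < ids.txt > out.txt
wc -l < out.txt                               # one output line per request
```

xargs handles the fanout for you: it keeps exactly 4 processes running and starts the next one as soon as a slot frees up, with no manual job bookkeeping.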