SFTP
In the example I mentioned 10 files, but in practice I have hundreds of thousands of files, and doing mget on all of them at once overloads the server's memory. To avoid this, we are planning to reduce the file count per transfer and process them in batches of 5000 files.
There is a command, parallel (GNU parallel), that allows you to run a selected number of operations all at once. Not all systems have it; Linux generally has it, or it can be downloaded and installed.
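A minimal sketch of using it with scp, assuming GNU parallel is installed; remotebox and both paths are placeholders for your own host and directories:

cd /path/to/files/to/send
# run at most 10 scp transfers at a time, one file per job
parallel -j 10 scp {} remotebox:/path/to/destination/ ::: *

parallel starts a new transfer as soon as one of the 10 running jobs finishes, so it never sits idle waiting for the slowest file in a batch.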
Poor man's version in bash, using scp to copy files:
#!/bin/bash
# Start up to 10 scp transfers in the background, then wait for
# that batch to finish before starting the next 10.
cd /path/to/files/to/send
count=0
for fname in *
do
    scp "$fname" remotebox:/path/to/destination/ &
    count=$(( count + 1 ))
    if (( count % 10 == 0 )); then
        wait
    fi
done
wait    # catch the last, possibly partial, batch
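Note the trade-off: wait blocks until every transfer in the batch is done, so each batch runs at the speed of its slowest file. It is simple, but the parallel approach above avoids that stall.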
Use a series of sftp batch files (the -b option). Get the remote directory listing, split it into several chunks, and write those to the respective batch files, as sketched below.
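A minimal sketch of that approach, assuming OpenSSH sftp with key-based login; user@remotebox and /remote/dir are placeholders:

#!/bin/bash
# 1. Get the remote listing, one name per line; drop sftp's command echo.
sftp -b - user@remotebox <<'EOF' | grep -v '^sftp>' > all_files.txt
ls -1 /remote/dir
EOF
# 2. Split the listing into chunks of 5000 names each.
split -l 5000 all_files.txt chunk_
# 3. Turn each chunk into a batch file of get commands and run it.
for chunk in chunk_*
do
    sed 's/^/get /' "$chunk" > "$chunk.batch"
    sftp -b "$chunk.batch" user@remotebox
done

Each sftp session fetches at most 5000 files and then exits, which keeps the per-transfer load on the server bounded.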