Limit number of files transferred

I have a folder on a remote server containing 50 files. I would like to transfer these files in batches: the first 10, then the next 10, and so on.

I'm using the mget command to transfer the files. How do I limit each transfer to 10 files instead of copying all 50 files at once?

Thanks
Janarthan

Using ftp?

Why do you want to do them 10 at a time? Bandwidth issues, or something else?

SFTP.

In the example I said 10, but in practice I have more than a lakh (100,000) of files, and doing an mget of all of them overloads the server's memory. To avoid this, we plan to reduce the file count per transfer and process them in batches of 5000 files.

There is a command, parallel, that allows you to perform a selected number of operations all at once. Not all systems have it; Linux generally has it, or it can be downloaded and installed.

Poor man's version in bash, using scp to copy files 10 at a time (the inner loop in a naive version would re-copy the same file; counting files and waiting after every tenth background job does what you want):

#!/bin/bash
cd /path/to/files/to/send || exit 1
count=0
for fname in *
do
     scp "$fname" remotebox:/path/to/destination/ &
     count=$(( count + 1 ))
     # after every 10 background copies, wait for them all to finish
     if (( count % 10 == 0 )); then
          wait
     fi
done
wait    # catch any transfers still running

If I use parallel, there's a chance the file order will differ between the source and destination paths, since it runs in parallel.

Use a series of sftp batch files (the -b option). Get the remote directory listing, split it into chunks of 5000, and write those chunks out as the respective batch files. Running the batches one after another keeps the transfers strictly in order.
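A sketch of that approach (user@remotebox, /remote/dir, and the 5000-per-batch size are placeholders; for illustration the remote listing is faked with a local seq instead of a real sftp call):

```shell
#!/bin/bash
# Step 1: get the remote listing into filelist.txt, one name per line.
# In real use this would come from sftp, e.g.:
#   echo "ls -1 /remote/dir" | sftp user@remotebox > filelist.txt
# For illustration, fake a listing of 12000 files:
seq -f "file%g.dat" 1 12000 > filelist.txt

# Step 2: split the listing into 5000-line chunks: chunk.aa, chunk.ab, ...
split -l 5000 filelist.txt chunk.

# Step 3: prefix every name with "get " so each chunk becomes a valid
# sftp batch file of download commands.
for chunk in chunk.??
do
     sed 's/^/get /' "$chunk" > "$chunk.batch"
done

# Step 4: run the batches sequentially, which preserves file order:
#   for b in chunk.??.batch; do sftp -b "$b" user@remotebox; done
```

Because each `sftp -b` invocation only sees one 5000-line batch file, the server never holds more than one batch's worth of transfer state at a time.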