Simultaneous FTP downloads from multiple sites in Perl

I currently have a Perl script that has to retrieve a single file from 20+ sites every 10 minutes. Right now it will FTP to site1, download, and then continue on up through site20. I am trying to get all of the FTP downloads to run at the same time. This is where I have my problem; I can't get it to work.

Sample of the sam_sites.pl file:

AA|123.456.789.100|PASSWORD|Windows
BB|321.654.987.100|PASSWORD|Unix
CC|135.790.864.123|PASSWORD|Unix

Code from the ret_sam.pl file:

#!/usr/bin/perl
use Net::FTP;
use Switch;
use threads;
use threads::shared;

$version = "0.00.0001";
$RETSAMC = `ps -ef | grep ret_sam | grep -v grep | wc -l`;
$RETSAMS = "/home/fox/SAM/RET_SAM_RUN";
$HOMEDIR = "/home/fox/SAM/sites/test";

open(SITES_F, "sam_sites.pl") || die ("Missing sam_sites.pl file!");
@site=<SITES_F>;
close(SITES_F);
$NLETNUM = scalar(@site);

foreach $location_n (@site)
{
    chomp($location_n);
    ($s_name,$s_ip,$s_pwd,$s_sys)=split(/\|/,$location_n);
    $thr = threads->create(\&ftp_go_and_get, $s_name, $s_ip, $s_pwd, $s_sys);
    $thr->join();
}

sub ftp_go_and_get {
    if (lc($_[3]) eq "windows") {
        $f_user = "fox\@openfox";
        $directory = "SAM";
    } elsif (lc($_[3]) eq "unix") {
        $f_user = "fox";
        $directory = "/home/fox/SAM";
    }
    $ftp = Net::FTP->new($_[1], Timeout => 60) or $newerr = 1;
    push @ERRORS, "Can't ftp to $_[1]($_[0]): $!\n" if $newerr;
    myerr() if $newerr;
    print "Connected to $_[1]($_[0])\n";
    $ftp->login("$f_user", "$_[2]") or $newerr = 1;
    print "Trying to get in and grab file.\n";
    push @ERRORS, "Can't login to $_[1]($_[0]): $!\n" if $newerr;
#    $ftp->quit if $newerr;
    myerr() if $newerr;
    print "Logged in\n";
    $ftp->cwd("$directory") or $newerr = 1;
    push @ERRORS, "Can't find a spot to take dump.\n" if $newerr;
    myerr() if $newerr;
#    $ftp->quit if $newerr;
#    $f_size = $ftp->size("$_[0]") or $newerr = 1;
#    print "The size of this file is $f_size\n";
#    push @ERRORS, "Can't get size of file.\n" if $newerr;
#    myerr() if $newerr;
#    $ftp->quit if $newerr;
    @files = $ftp->get($_[0], "$HOMEDIR/$_[0]") or $newerr = 1;
    push @ERRORS, "Can't get file $_[0]\n" if $newerr;
    myerr() if $newerr;
    print "Got file for $_[0]\n";
    foreach (@files) {
        print "I took a dump in $_\n";
    }
    $ftp->quit;
}

sub myerr {
print "Error: \n";
print @ERRORS;
exit 0;
}

I am looking for any suggestions or constructive criticism that can help me out here. Thanks in advance.

At the same time!? That may be a giant bandwidth hog... but oookay.

You may be able to do it like this:

Loop from 1 to 20...

for number in $(seq 1 20); do
    var="site"$number
    perl somescript.pl $var &
done

Well, that, or you can learn to fork in Perl.

Basically, here is the logic: loop through all of your sites, execute the Perl script with an argument saying which site you want to FTP data to/from, and then throw it in the background to do its thing (so the loop continues). No guarantees it will work, but it should.
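
If you do go the fork-in-Perl route, something along these lines might work as a rough, untested sketch. It assumes the @site array and the ftp_go_and_get sub from your script above; the parent forks one child per site and only waits for them at the end, so the downloads can overlap.

# Rough sketch (untested): one forked child per site instead of threads.
# Assumes @site and the ftp_go_and_get sub from the script above.
my @kids;
foreach my $location_n (@site) {
    chomp($location_n);
    my ($s_name, $s_ip, $s_pwd, $s_sys) = split(/\|/, $location_n);

    my $pid = fork();
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {
        # Child: do one download, then exit so it never returns
        # to the parent's loop.
        ftp_go_and_get($s_name, $s_ip, $s_pwd, $s_sys);
        exit 0;
    }
    push @kids, $pid;          # parent just remembers the child PID
}

# Parent waits here until every child has finished its download.
waitpid($_, 0) for @kids;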

So you think I would have to put the subroutine in its own script and just run it in the background? My main script has other functions, but I can try running it again, calling just the FTP portion with &.

Do I even want to use threads then? I was just calling my subroutine `&ftp_go_and_get($ARG1, $ARG2, $ARG3, $ARG4)` before, but that did not give me the desired results either.

I have started to look at fork(), but I have never used it myself.

As for bandwidth, the total of all the files is under 40k, so it is not really an issue here.

Let's see...

So let's call the Perl script that does all the FTP work ftp.pl.

ftp.pl takes one argument: the site that needs to be transferred to/from.

hostfile="hosts.txt"

for host in `cat $hostfile`; do
     perl ftp.pl $host &
done

That is assuming you have all of the hosts inside a file called hosts.txt (that way you can just append additional hosts to the bottom). Easy as pie. If you want, you can use rsync, scp, rcp... whatever, on Linux.
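
If it helps, a bare-bones ftp.pl for that loop might look something like the sketch below. This is just a guess at what you need, not a drop-in fix: it assumes each line of the host file is the same pipe-delimited record as sam_sites.pl (name|IP|password|OS), that the remote file is named after the site like in your script, and that the logins and directories match the earlier post. Since those records contain no spaces, the shell loop above will hand each whole record to the script as one argument.

#!/usr/bin/perl
# ftp.pl -- hypothetical per-site downloader; takes one argument,
# a pipe-delimited record like AA|123.456.789.100|PASSWORD|Windows
use strict;
use warnings;
use Net::FTP;

my $homedir = "/home/fox/SAM/sites/test";        # local download dir (from the thread)
my ($name, $ip, $pwd, $sys) = split /\|/, shift @ARGV;

# Pick the login and remote directory based on the OS field,
# using the same values as the original ret_sam.pl.
my ($user, $dir) = lc($sys) eq "windows"
    ? ("fox\@openfox", "SAM")
    : ("fox",          "/home/fox/SAM");

my $ftp = Net::FTP->new($ip, Timeout => 60)
    or die "Can't ftp to $ip ($name): $@";
$ftp->login($user, $pwd) or die "Can't login to $ip ($name): ", $ftp->message;
$ftp->cwd($dir)          or die "Can't cwd to $dir: ",          $ftp->message;
$ftp->get($name, "$homedir/$name")
    or die "Can't get file $name: ", $ftp->message;
$ftp->quit;
print "Got file for $name\n";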

I do have a file where I store each host's name, IP, password, and OS type. I called it sam_sites.pl:

AA|123.456.789.100|PASSWORD|Windows
BB|321.654.987.100|PASSWORD|Unix
CC|135.790.864.123|PASSWORD|Unix

I read the file into an array, @site:

open(SITES_F, "sam_sites.pl") || die ("Missing sam_sites.pl file!");
@site=<SITES_F>;
close(SITES_F);
$NLETNUM = scalar(@site);

I then have a foreach loop where I take each line and split it into the host, IP, password, and OS variables. Then I start the FTP transfer:

foreach $location_n (@site)
{
    chomp($location_n);
    ($s_name,$s_ip,$s_pwd,$s_sys)=split(/\|/,$location_n);
    $thr = threads->create(\&ftp_go_and_get, $s_name, $s_ip, $s_pwd, $s_sys);
    $thr->join();
}

So I guess one of my questions still is: will threads do what I am wanting, with the files downloading at the same time, or do I need to run a Perl script in the background or use fork()?
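
From what I have been reading, the usual threads pattern seems to be to create all the threads first and only join() them after the loop, since calling join() right after create() makes each download finish before the next thread even starts. So maybe something like this rough, untested sketch, using the same variables as above, is closer to what I am after:

# Untested sketch: start every thread first, join them all afterwards,
# so the FTP sessions actually run at the same time.
my @threads;
foreach $location_n (@site)
{
    chomp($location_n);
    ($s_name,$s_ip,$s_pwd,$s_sys)=split(/\|/,$location_n);
    push @threads, threads->create(\&ftp_go_and_get, $s_name, $s_ip, $s_pwd, $s_sys);
}
$_->join() for @threads;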

I appreciate your help.

Hrm, I'm not really sure. I am not the biggest Perl guru, so that is why I suggested the background method.