I'm searching for a way to download all files from the root directory of an FTP server through an FTP proxy.
Getting through the FTP proxy and downloading a single file with get is no problem, but mget * does nothing!
ftp -n -i -v <<EOF
open proxyHost proxyPort
user user@ftp_server password
lcd xxxxxxxxxxxx
epsv4
mget *
bye
EOF
(epsv4 because downloading only works in extended passive mode; the mget * does nothing, it doesn't even throw an error message)
I read that the shell might not have enough memory to expand mget with a high number of files!?
I'm thinking about storing all the filenames in a list and then downloading them with get in a loop (in a later version of the script I will probably want to delete the files after downloading them), but I don't really know how to do this, and I have found no example of downloading files in a loop from an FTP server (it has to be an FTP connection, because a connection over HTTP doesn't work).
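One way to sketch the get-in-a-loop idea without opening one connection per file is to generate all the get commands from the list first and feed them to a single ftp session. This is only a sketch: proxyHost, proxyPort, the user@ftp_server account string, and the list file contents are placeholders taken from the post above.

```shell
#!/bin/bash
# Build the whole ftp command script from a file list and print it;
# the output can then be piped into "ftp -n -i -v" to run one session.
# proxyHost/proxyPort and user@ftp_server are placeholders.
make_ftp_script() {
    echo "open proxyHost proxyPort"
    echo "user user@ftp_server password"
    echo "epsv4"
    while read -r file
    do
        echo "get $file"
    done < "$1"
    echo "bye"
}

printf 'a.txt\nb.txt\n' > /tmp/list.$$
make_ftp_script /tmp/list.$$
rm -f /tmp/list.$$
```

Running it as `make_ftp_script "$file_list" | ftp -n -i -v` keeps everything in one connection while still driving the downloads from your own list.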
#!/bin/bash
function send_ftp
{
ftp -n -i -v $HOST > $2 <<EOF
open proxyHost proxyPort
user USER PASS
cd $dir_remote
$1
bye
EOF
}
file_log=log.$$
file_list=tmp.$$
dir_remote=your_data
#Maximum number of files you can copy in one mget instruction
max_file=your_data
#IP of the remote host
HOST=your_data
#Call to generate the list of files on the remote host
send_ftp "dir $dir_remote" $file_log
#In my case the first 12 lines are the "header" of the log file; check what it is in your case.
#The grep filter works because the ftp status lines begin with a number and the file lines do not.
sed "1,12 d" $file_log | grep -v '^[0-9]' | sed "s/  */#/g" | cut -f9 -d"#" > $file_list
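For reference, here is the same pipeline run against a fabricated listing (the header length of 2 lines instead of 12 and the filenames are made up for the demo). Note that the space-squeeze pattern has to match at least one space, i.e. `s/  */#/g` with two spaces before the `*`; with only `s/ */#/g` the zero-length matches put a `#` around every character and cut's field count is thrown off.

```shell
#!/bin/bash
# Demo of the filename-extraction pipeline on a fabricated "dir" log.
# Header length (2 lines here instead of 12) and filenames are made up.
extract_names() {
    sed "1,2 d" "$1" | grep -v '^[0-9]' | sed "s/  */#/g" | cut -f9 -d"#"
}

cat > /tmp/demo_log.$$ <<'LOG'
Connected to proxyHost.
230 Login successful.
-rw-r--r--    1 ftp      ftp      1024 Jan 01 12:00 report.txt
-rw-r--r--    1 ftp      ftp      2048 Jan 02 13:30 data.csv
226 Directory send OK.
LOG
extract_names /tmp/demo_log.$$   # prints: report.txt, then data.csv
rm -f /tmp/demo_log.$$
```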
rm -f $file_log
#To optimize the ftp commands, build a buffer with the maximum number of files
#that can be passed to one mget instruction.
buffer=""
conta=0
while read file
do
    buffer="$buffer $file"
    conta=$(expr $conta + 1)
    if [ $max_file -eq $conta ]
    then
        send_ftp "mget $buffer" "/dev/null"
        buffer=""
        conta=0
    fi
done <$file_list
if [ "$buffer" ]
then
    send_ftp "mget $buffer" "/dev/null"
fi
rm -f $file_list
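A quick way to watch the batching behaviour without touching ftp at all is to stub send_ftp with echo. The five fake filenames and max_file=2 below are made up for the demo; the loop body is the same buffer/counter logic as in the script.

```shell
#!/bin/bash
# Stub send_ftp with echo so the batching is visible without a server.
# max_file=2 and the fake filenames are made up for the demo.
send_ftp() { echo "CALL:$1"; }
max_file=2

batch_list() {
    buffer=""
    conta=0
    while read -r file
    do
        buffer="$buffer $file"
        conta=$((conta + 1))
        if [ $max_file -eq $conta ]
        then
            send_ftp "mget$buffer"
            buffer=""
            conta=0
        fi
    done < "$1"
    # flush the partial last batch
    if [ "$buffer" ]
    then
        send_ftp "mget$buffer"
    fi
}

printf 'f1\nf2\nf3\nf4\nf5\n' > /tmp/list.$$
batch_list /tmp/list.$$
rm -f /tmp/list.$$
```

This prints three CALL lines: two full batches of two files and one partial batch with the leftover file, which is exactly the pattern of ftp connections the real script makes.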
Please copy the shell script exactly as I wrote it.
The only things you have to modify are the values of a few variables:
dir_remote=your_data
#Maximum number of files you can copy in one mget instruction
max_file=your_data
#IP of the remote host
HOST=your_data
For example, if the directory on the remote server is /tmp/mydir, the remote server IP is 10.30.20.50, and you want to copy 30 files per connection, you need to set:
dir_remote=/tmp/mydir
#Maximum number of files you can copy in one mget instruction
max_file=30
#IP of the remote host
HOST=10.30.20.50
And in place of the strings USER and PASS you need to put the user and password for the connection:
ftp -n -i -v $HOST > $2 <<EOF
open proxyHost proxyPort
user USER PASS
cd $dir_remote
For example, for user=user_ftp, pass=mypass:
ftp -n -i -v $HOST > $2 <<EOF
open proxyHost proxyPort
user user_ftp mypass
cd $dir_remote
If I have understood you correctly, your problem is that the directory on the remote server you want to copy from contains too many files. For that reason the command
mget *
returns an error, because the expansion of the * character is too long.
The easiest option would be to use get for each file, but that means opening one FTP connection per file.
That is what the max_file variable is for: it sets the maximum number of files to fetch per connection, so the number of connections can be kept down.
Let's suppose the following situation:
We want to copy the files in the directory /tmp/midirectorio, and this directory contains 200 files. If we define max_file=50, the script will make 4 FTP connections and copy 50 files in each one.
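The connection count from that example can be checked with shell arithmetic; it is a ceiling division, so a partial last batch still costs one connection:

```shell
#!/bin/bash
# Ceiling division: number of ftp connections for $1 files, $2 per batch.
batches() { echo $(( ($1 + $2 - 1) / $2 )); }

batches 200 50   # prints 4
batches 201 50   # prints 5 (the one leftover file needs its own connection)
```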
If I put the address of the ftp_server in the HOST variable, the connection runs into our firewall...
If I put the address of the proxy in the HOST variable, I get: "ftp: connect: Operation timed out".
I don't know what $HOST> &2 in the ftp command really does???
To be clear: I need to connect through the proxy
open proxyHost proxyPort
and with username: accountname@ftp_server_address
something like: user_ftp@10.30.20.50
and the password is just the password...
I tried it like this, but it doesn't work with your script:
ftp -n -i -v <<EOF
open proxyHost proxyPort
user user@ftp_server password
mget *
bye
EOF
(mget * still does nothing here, not even an error message)
The equivalent in the function is:
function send_ftp
{
ftp -n -i -v > $2 <<EOF
open proxyHost proxyPort
user user@$HOST password
cd $dir_remote
$1
bye
EOF
}
&2 is not in my script; my script has:
ftp -n -i -v $HOST > $2
HOST is a variable, and when you write $HOST the shell replaces the string $HOST with its value, so it becomes:
ftp -n -i -v 10.10.10.10 > $2
Here the > character redirects the stdout of ftp, and $2 is the second parameter passed when calling the function.
If you look at the call:
send_ftp "dir" $file_log
$file_log is the second parameter of that call.
If you don't want to use variables you can put the values in directly; I wrote the script with variables because I think it is better.
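The two mechanisms described above (positional parameters inside a function and `> $2` redirection) can be seen in isolation with a tiny stand-alone function; `demo`, `somedir`, and the temp file name are made up for the illustration:

```shell
#!/bin/bash
# $2 inside a function is the function's second argument, and "> $2"
# redirects the function's stdout into the file named by that argument.
demo() {
    echo "listing of $1" > "$2"
}

demo somedir /tmp/demo_out.$$
cat /tmp/demo_out.$$    # prints: listing of somedir
rm -f /tmp/demo_out.$$
```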
I'm still trying to make this work, but no success yet.
I'm getting no error: the script runs for 1-2 minutes and then the shell is free to type again, but there are no new files in the local folder (or any other folder), and no log files like in earlier tries...
I even tried with a different dir_remote, and without a folder after dir (since there are no folders, only files on the server, so it's the default folder...), but I get the same result: it runs without error and without result...
#rm -f $file_list
echo "The file with the list: $file_list"
#rm -f $file_log
echo "The log file: $file_log"
When the script finishes, it now tells you where those files are so you can inspect them ;).
And you can add this:
echo "DEBUG:File Name=[$file]"
just under the do instruction:
while read file
do
echo "DEBUG:File Name=[$file]"
....the rest of the code
Then while the script runs, it prints the name of every file it is going to copy.
One thing: why do you put "/." in dir_remote? That means it goes to the / of the server.
Make sure of the directory name. If the working directory is the HOME directory of the ftp user, you need to change it to dir_remote=.
Another way is to delete the "cd $dir_remote" line from the script.
Everything you put inside <<EOF ..... EOF is sent just as if you typed it in a manual connection, so if the server supports the epsv4 command and you need it, you can add it, no problem xD
[CODE]
ftp -n -i -v > $2 <<EOF
open ------- -----
user ---@------ -----
epsv4
$1
bye
EOF
[/CODE]
The output is exactly:
DEBUG:File Name=[XXXX]
If you see XXXXX, I think there is a problem generating the list of files... please edit the 2 files and send me the content (I think it's better if you send me a private message with it).
I think it is now a problem with ftp...
To check whether it is an ftp problem, run the ftp session by hand and see if you get the same problem there.
I succeeded in downloading 1 file with the code posted in the first post (the one with get Filename in it).
So the ftp itself is fine, but there's a problem with the connection as we establish it... could the mget with a buffer perhaps have problems with epsv4???
The ftp proxy definitely only works in passive mode (our admin told me), because in active mode it would open another channel for every action, and that would send our connection to the firewall.
I think it is the mget $buffer command which leads to the error "ftp: No control connection for command."
And judging from the output, it seems that for every max_file package the while/do loop runs first for every file, and all the send_ftp calls are executed at the end, so that I get this output...
I would expect something like this:
Or is the send_ftp function executed in another thread, so that it is trying to connect and failing after the while/do loop has finished one package?
Oh man, scripting for an FTP server behind an FTP proxy really sucks :X
You have to run the same test: if the shell script uses mget, you need to test with mget... from your response I understand you used get...
If you have defined max_file=5, you will first see 5 lines:
DEBUG[XXXX]
DEBUG[XXXX]
DEBUG[XXXX]
DEBUG[XXXX]
DEBUG[XXXX]
ftp error
ftp error
ftp error
ftp error
ftp error
and after the 5th DEBUG line the shell calls the function.
To see the calls one at a time, set max_file=1.
Well, if you make a connection by hand and get works fine...
then change the script to use get.
Now I will try to move the files to another folder after downloading them; I hope this script won't download the files again once they are in a subfolder of the folder I download from...
while read file
do
    echo "DEBUG:File Name=[$file]"
    buffer="$buffer $file"
    conta=$(expr $conta + 1)
    if [ $max_file -eq $conta ]
    then
        send_ftp "mget $buffer" "/dev/null"
        send_ftp "rename $buffer subfolder/$buffer" "/dev/null"
        buffer=""
        conta=0
    fi
done <$file_list
Could it work like this?
I will try it with test files^^
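One thing to watch out for: ftp's rename takes exactly one source and one target, so passing the whole $buffer to a single rename probably won't do what you want. A sketch that emits one rename command per file instead (the filenames are examples, and "subfolder" is assumed to already exist on the server):

```shell
#!/bin/bash
# ftp's "rename" takes one source and one target, so the whole buffer
# cannot be renamed in one command; emit one rename per file instead.
# "subfolder" is assumed to already exist on the server.
emit_renames() {
    for f in $1; do
        echo "rename $f subfolder/$f"
    done
}

emit_renames "file1.txt file2.txt"
# prints:
# rename file1.txt subfolder/file1.txt
# rename file2.txt subfolder/file2.txt
```

The emitted lines could then be passed to send_ftp one at a time, or collected and sent in the same session as the mget.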
Have you tried turning off interactive mode before using mget *?
ftp to the server
type prompt <enter>
This turns off interactive mode; now try mget *
It works! That is great news xD ... in this case the problem is the ftp server's passive and active modes, and since you have a proxy I think maybe there is an incorrect configuration somewhere.
I read something about ftp internally opening other ports... and something happens when the ftp server has to redirect the packets to the client... but I don't remember it well.
In theory the -i option of ftp exists for exactly this purpose:
-i Disable interactive prompting by multiple-file commands; see
the prompt command, below. By default, when this option is
not specified, prompting is enabled.
and this option is already set in the shell script.