Replicate remote directory to local directory with ftp

I have a system that generates files every hour, and I only have an FTP connection from my local server to the remote one.

$ ls -al
-rw-r--r--  1 water None 0 Feb  7 18:09 a.0800
-rw-r--r--  1 water None 0 Feb  7 18:09 a.0900
-rw-r--r--  1 water None 0 Feb  7 18:09 a.1000

Is there any Perl/PHP script to replicate only the new files to my local server? I know lftp or rsync would work perfectly, but I don't have the privileges to install applications.

Try something like this:

(1) Connect to the remote FTP server, list the files, and disconnect.
(2) Compare the file names with the local ones and note which files are missing locally.
(3) Connect to the remote FTP server again and *mget* the missing files.
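A minimal sketch of the three steps above, using only the stock ftp client. The host, credentials, and directory names are placeholders; `nlist`'s second argument saves the remote listing to a local file so nothing has to be scraped from ftp's chatter:

```shell
#!/bin/sh
# Placeholders -- replace with your real connection details.
HOST=remote.example.com
USER=water
PASS=secret
RDIR=/data

# (1) Connect, save the remote file listing to remote.list, disconnect.
list_remote() {
  ftp -n "$HOST" <<EOF
user $USER $PASS
cd $RDIR
nlist . remote.list
bye
EOF
}

# (2) Print names that exist remotely but not locally.
find_missing() {
  sort remote.list > remote.sorted
  ls -1 | sort > local.sorted
  comm -23 remote.sorted local.sorted   # lines only in the remote listing
}

# (3) Reconnect and fetch the missing files, one get per name on stdin.
fetch_missing() {
  {
    echo "user $USER $PASS"
    echo "cd $RDIR"
    echo "binary"
    while read -r f; do
      echo "get $f"
    done
    echo "bye"
  } | ftp -n "$HOST"
}

# Typical run:
#   list_remote
#   find_missing | fetch_missing
```

Because `comm -23` only prints lines unique to the first (remote) listing, extra local files — including the helper `*.sorted` lists themselves — can never end up in the transfer list.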

Any progress with this?

You can use 'find' to select the required files based on their timestamps, create an ftp command file, and then execute the transfer with ftp.
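A sketch of the "create an ftp command file, then execute the transfer" part. `HOST`, `USER`, `PASS`, `RDIR`, and the `names.list`/`stamp.last` file names are all placeholders for illustration:

```shell
#!/bin/sh
# Placeholders -- replace with your real connection details.
HOST=remote.example.com
USER=water
PASS=secret
RDIR=/data

# Read file names on stdin and emit a batch file the stock ftp client can run.
make_cmdfile() {
  {
    echo "user $USER $PASS"
    echo "cd $RDIR"
    echo "binary"
    while read -r f; do
      echo "get $f"
    done
    echo "bye"
  } > ftp.cmds
}

# Typical run, with stamp.last marking the previous transfer:
#   make_cmdfile < names.list    # names selected earlier
#   ftp -n "$HOST" < ftp.cmds
#   touch stamp.last
```

The timestamp-based selection then happens locally: `find . -newer stamp.last` lists everything modified since the stamp file was last touched.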

Do you want example code?

lftp gives a very good solution for this case, but I still have an issue with it.

I want to download only the files that are newer than the latest local file (the last one downloaded by the previous run). I tried --newer-than, but I have problems with different offsets/timezones.

lfile=$(ls -1 | tail -n 1)
mfile=$(date -r "$lfile")

lftp -d -e "mirror --newer-than='$mfile' $rdir $ldir; bye" -u "$u,$p" "$ips"

Any suggestions?
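One way to sidestep the timezone problem entirely: as far as I can tell, lftp's `--newer-than` also accepts a *file name*, in which case it uses that file's modification time directly, so no date string ever has to be formatted or parsed. A sketch under that assumption, reusing your `$u`, `$p`, `$ips`, `$rdir`, `$ldir` variables:

```shell
#!/bin/sh
# $ldir, $rdir, $u, $p, $ips are the same placeholders as in the post above.

# Newest local file by modification time. Note that ls -1 | tail -n 1
# picks the alphabetically last name, which is not necessarily the
# most recently downloaded file; ls -t sorts by mtime, newest first.
newest_local() {
  ls -t "$ldir" | head -n 1
}

# Typical run -- hand the file itself to --newer-than:
#   lfile=$(newest_local)
#   lftp -e "mirror --newer-than='$ldir/$lfile' $rdir $ldir; bye" -u "$u,$p" "$ips"
```

If that still misbehaves, `mirror --only-newer` is worth trying: it compares each remote file against the local copy and skips anything already up to date, avoiding the cutoff computation altogether.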