Shell script to transfer files from a source to a target server.

I need to write a shell script that transfers files from the source to the target server every hour.
The cron job should run every hour and shouldn't copy files that were already copied to the remote server.

I was able to write an initial script, but I can't work out the logic for the next run to copy only the files created within that hour.

Here is the logic:

for I in *
do
    echo "$I - Log from cron"
    scp "$I" pluto@5.67.16.25:"$REMOTE_LOC"/
    sleep 120
done

Thanks,

Once you have transferred your files, just move or rename them if possible so they will not be transferred again.
If the files may not be renamed or moved to another location, copy them from their original location into some "transfer" directory and keep them there.
To save some space, you can also write the names of the files you just transferred into a text file and filter against it: skip any file whose name is already in the list.
There are a lot of ways to do this.
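The text-file idea above can be sketched like this. Everything here is made up for the demo (scratch directory, file names, a local staging directory standing in for the remote side); the real script would run scp where the cp is:

```shell
#!/bin/sh
# Demo of the "transfer log" idea: remember which names were already
# sent and skip them on the next run.
WORK=`mktemp -d`; cd "$WORK" || exit 1
echo a > file1; echo b > file2
SENT_LOG=sent_files.txt
: > "$SENT_LOG"
mkdir staging          # local stand-in for the remote directory

send_new() {
    for f in *; do
        [ -f "$f" ] || continue           # skip directories
        [ "$f" = "$SENT_LOG" ] && continue
        # skip files whose exact name is already in the log
        grep -qx "$f" "$SENT_LOG" && continue
        # real script: scp "$f" pluto@5.67.16.25:"$REMOTE_LOC"/
        cp "$f" staging/ && echo "$f" >> "$SENT_LOG"
    done
}

send_new          # first hourly run: copies file1 and file2
echo c > file3
send_new          # second run: copies only file3
ls staging
```

The log only grows by one short line per file, so it stays tiny even when the files themselves are 500 MB.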

Please use [ code ] tags when posting code, data, logs etc.

For the succeeding steps, why not use find instead of ls to filter the files that need to be copied? That way you can use -mtime (to filter files by last modification time), -name, -type and other useful flags.

Why not just use rsync? It can be a bit confusing, but it is very configurable!

Here I can move the files on the remote server to a different directory.
But how does the local server know that it shouldn't transfer already-transferred files in the next hourly run? The files that get created are 500 MB each time.

And how can I use the filter command here?

Thanks,
Radhika.

Hi angheloko,
why not use find instead of ls to filter the files that need to be copied? That way you can use -mtime (to filter files by last modification time), -name, -type and other useful flags.

I know how to use -mtime with days (a number of days). I don't know how to filter by modification time in hours, like the past 1 or 2 hours.

Could you please give me a hint on this ?

Thanks,
Radhika.

Hi Rhije,

Why not just use rsync? It can be a bit confusing, but it is very configurable!

I don't know much about the rsync command; I have just gone through the rsync man page.

Here I am transferring files from a Solaris server to a Linux server, and SSH is already set up between those boxes.

Thanks,
Radhika.

Hrm.. I would try using 'touch' to modify the modification time.

Touch old files so they look a year old, and touch new files so they look like NOW. That way you can easily see which ones are which.

find . ! -name . -prune -mtime +1 -type f | while read x; do
    echo "$x"
done
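The touch trick can be sketched like this (scratch directory and file names are made up for the demo; in the real script 'sent' would be each file after a successful scp, and the 2000 date is arbitrary):

```shell
#!/bin/sh
# Demo: mark an already-sent file as "old" so a recency-based find skips it.
WORK=`mktemp -d`; cd "$WORK" || exit 1
echo x > fresh
echo y > sent
# pretend 'sent' was transferred long ago: push its mtime back to Jan 2000
touch -t 200001010000 sent
# -mtime -1: modified less than 24 hours ago, i.e. still to be sent
find . ! -name . -prune -type f -mtime -1
```

Only ./fresh comes out of the find, so the copy loop never sees 'sent' again.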

Can't test it right now though

 -mtime n
        File's  data was last modified n*24 hours ago.  See the comments
        for -atime to understand how rounding affects the interpretation
        of file modification times.

Hi

I would like the file-modification check to be in hours, not in days like 1 day ago or 2 days ago.

here is my logic

#Current time in touch -t format (MMDDhhmm)
ts=`date +%m%d%H%M`
ts=`expr $ts - 60 minutes`   ### <-- this is the part I cannot get right

#Create flag file with that time stamp
touch -t "$ts" check_flag

#Check for newer files than flag
find . -newer check_flag

#Remove time_stamp
rm check_flag

Here I would like to get the previous time (an hour ago, by subtracting 60 minutes from the current time).

Thanks,
Radhika.
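One common way around the 60-minute arithmetic is to keep the check_flag file between runs instead of deleting it: touch it at the end of each run, and on the next run everything -newer than it is exactly what was created since. A sketch in a scratch directory (names are made up for the demo):

```shell
#!/bin/sh
# A marker file kept between cron runs replaces the time arithmetic.
WORK=`mktemp -d`; cd "$WORK" || exit 1
touch old_file        # existed before the previous run
touch .last_run       # marker left behind by the previous run
sleep 1               # mtime resolution is one second
touch new_file        # created since the previous run
NEW=`find . ! -name . -prune -type f -newer .last_run`
echo "$NEW"           # only ./new_file is newer than the marker
touch .last_run       # refresh the marker for the next run
```

This also stays correct if a cron run is delayed or skipped: the marker records when the last run actually happened, not "one hour ago".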

Hi ,

Can I use this logic in my script to select only the files that were created in the last hour?

if [ -n "$(find file1 -prune -newer file2)" ]; then
    printf '%s\n' "file1 is newer than file2"
fi

Hi,

Basically I would like to find the files which were created in the past hour and copy those only.

Thanks,
Radhika.

Ei rad,

Try this to display the files. It's incomplete though, as it may not compute properly when a run crosses midnight.

find . ! -name . -prune -ctime -1 -type f | while read x; do
    # current hour, with any leading zero stripped
    y=$((`date +%H | sed 's/^0//'` - 0))
    # hour from the file's ls -lt time stamp, leading zero stripped
    z=$((`ls -lt "$x" | awk '{print $8}' | cut -d: -f1 | sed 's/^0//'` - 0))
    DIFF=$((y-z))
    # keep only files whose hour is at least one behind the current hour
    [ $DIFF -gt 0 ] && echo "$x"
done
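On the Linux side there is a much simpler option: GNU find has -mmin, which filters by minutes rather than days (Solaris' stock /usr/bin/find does not have it, so this only helps if the script runs on the Linux box). A small demo in a scratch directory:

```shell
#!/bin/sh
# GNU find only: -mmin -60 means "modified less than 60 minutes ago".
WORK=`mktemp -d`; cd "$WORK" || exit 1
echo x > recent
echo y > older; touch -t 200001010000 older   # fake an old file
find . ! -name . -prune -type f -mmin -60
```

Only ./recent is listed, with no hour arithmetic and no midnight problem.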