I need to write a shell script that transfers files every hour from a source server to a target server.
The cron job should run every hour and shouldn't copy files that were already copied to the remote server.
I was able to write an initial script, but I can't work out the logic for the next run, which should copy only the files newly created within that hour.
Once you have transferred your files, just move or rename them, if possible, so they will not be transferred again.
If the files may not be renamed or moved to another location, copy them from their original location to some "transfer" directory and store them there.
To save some space, you can also just write the names of the files you have transferred into a text file, and filter them out on the next run by checking whether their names are already in that list.
There are a lot of possibilities for how to do that.
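The "list of transferred files" idea might look something like the sketch below. All the paths are made up for illustration, and a local directory stands in for the remote server; you would replace the cp with your actual scp/rsync command.

```shell
#!/bin/sh
# Sketch: transfer only files whose names are not yet in a log of sent
# files. SRC, DEST and LOG are hypothetical paths; a local directory
# stands in for the remote server.
SRC=/tmp/demo_src
DEST=/tmp/demo_dest
LOG=/tmp/demo_transferred.txt

mkdir -p "$SRC" "$DEST"
touch "$LOG"
echo sample > "$SRC/report1.txt"   # demo file standing in for an hourly data file

for f in "$SRC"/*; do
    [ -f "$f" ] || continue
    name=$(basename "$f")
    # Skip anything whose name is already recorded in the log
    if grep -qxF "$name" "$LOG"; then
        continue
    fi
    # Record the name only if the copy succeeded, so a failed
    # transfer is retried on the next run
    cp "$f" "$DEST"/ && echo "$name" >> "$LOG"
done
```

Running it a second time copies nothing, because every name is already in the log. grep -qxF matches the whole line as a fixed string, so names containing dots or other regex characters are compared literally.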
Please use [code] tags when displaying code, data, logs, etc.
For the subsequent steps, why not use find instead of ls to filter the files that need to be copied? That way you can use -mtime (to filter files based on the last modification time), -name, -type and other useful flags.
Here I can move the files on the remote server to a different directory.
But how does the local server know that it shouldn't transfer already-transferred files in the next hourly run? The files that get created are 500 MB each time.
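One way to answer that without any bookkeeping is the move-after-transfer approach from above, applied on the local side: the source directory then only ever holds files that have not been sent yet. A minimal sketch, with hypothetical paths and cp standing in for the real scp/rsync to the remote host:

```shell
#!/bin/sh
# Sketch: archive each local file after a successful transfer, so the
# next hourly cron run only sees new files. Paths are made up; DEST
# stands in for something like user@remote:/incoming.
SRC=/tmp/out_src
SENT=/tmp/out_sent          # local archive of already-sent files
DEST=/tmp/out_dest          # stand-in for the remote server

mkdir -p "$SRC" "$SENT" "$DEST"
echo data > "$SRC/batch1.dat"    # demo file

for f in "$SRC"/*; do
    [ -f "$f" ] || continue
    # Archive only after the copy succeeded, so a failed transfer
    # is retried on the next run
    if cp "$f" "$DEST"/; then    # e.g. scp "$f" user@remote:/incoming/
        mv "$f" "$SENT"/
    fi
done
```

Since mv within one filesystem is just a rename, archiving costs almost nothing even for 500 MB files; no second copy is written.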
Hi angheloko,
why not use find instead of ls to filter the files that need to be copied? That way you can use -mtime (to filter files based on the last modification time), -name, -type and other useful flags.
I know how to use -mtime with a number of days. I don't know how to filter by modification time for the past 1 or 2 hours.
# -mtime counts in 24-hour units; -mmin takes minutes,
# so -mmin -60 matches files modified within the last hour
find . ! -name . -prune -type f -mmin -60 | while read -r x; do
    echo "$x"
done
Can't test it right now though
-mtime n
File's data was last modified n*24 hours ago. See the comments
for -atime to understand how rounding affects the interpretation
of file modification times.
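Because -mtime works in those 24-hour units, hour-level filtering needs -mmin, which takes minutes; it is supported by GNU and BSD find, though not required by POSIX. A small demo with made-up file names:

```shell
#!/bin/sh
# Demo of -mmin: minutes instead of the 24-hour units used by -mtime.
# -mmin -60  -> modified within the last hour
# -mmin -120 -> modified within the last two hours
D=/tmp/mmin_demo
mkdir -p "$D"
touch "$D/new.log"                    # mtime = now
touch -t 202001010000 "$D/old.log"    # mtime forced back to Jan 2020

find "$D" -type f -mmin -60     # prints only new.log
find "$D" -type f -mmin -120    # still prints only new.log
```

For a cron job that runs every hour, `-mmin -60` picks up exactly the files created since the previous run, give or take the usual rounding caveats the man page mentions.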