I need a shell script to compress and transfer only the most recently created (new) files from one Linux server to another Linux server.
The files I need to back up are two files, a .txt and a .log, which are created in the same folder every 3-4 hours, and this script needs to run once every 24 hours.
So only the newly created .txt and .log files need to be compressed and transferred.
As the script runs every 24 hours, this boils down to finding the files newer than a certain date/time (the date/time of the last run) and then transferring them, yes?
With touch (see man touch), create a file whose timestamp records the last run of the script you write, then use find ... -newer <this file> to find all files newer than it. Process this list of files in a loop and transfer them via scp (or whatever means of transfer you prefer; scp is just my suggestion).
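A minimal sketch of that approach, assuming made-up paths, and using local temp directories plus cp as a stand-in for the remote side so it can be run anywhere; in production the cp line would be an scp to the other server:

```shell
#!/bin/sh
# Sketch only: SRC/DEST are temp dirs standing in for the real source folder
# and the remote backup target; the filenames are simulated.
SRC=$(mktemp -d)        # folder where the .txt/.log files land
DEST=$(mktemp -d)       # stands in for the remote backup directory
STAMP="$SRC/.lastrun"   # file whose mtime records the last run

# Backdate the timestamp file so the demo files below count as "new"
touch -t 202001010000 "$STAMP"
printf 'hello\n' > "$SRC/report.txt"   # simulate files created since last run
printf 'lines\n' > "$SRC/app.log"

# Compress and transfer every .txt/.log newer than the last run
find "$SRC" -maxdepth 1 -type f \( -name '*.txt' -o -name '*.log' \) \
     -newer "$STAMP" |
while read -r f; do
    gzip -c "$f" > "$f.gz"
    cp "$f.gz" "$DEST/"   # in production: scp "$f.gz" user@otherhost:/backup/
done
ls "$DEST"
```

Run from cron once a day; the only state it needs between runs is the timestamp file.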
That does not sound logical. Why would you want to back up only A3.* if the script runs every 24 hours? That would mean A1 and A2 haven't been backed up yet.
Use bakunin's proposal, find ... -newer timestamp ..., to create a list of the files to be backed up now, then immediately touch timestamp to make sure all files created after this point are backed up the next time the script runs. (Unfortunately this can't be done as an atomic operation, so there is a small chance that a file created in the split second between find and touch is missed on the next run; reversing the order instead gives a small chance of files being backed up twice.)
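The list-then-touch ordering might look like this (a self-contained demo in a scratch directory; the A3.txt filename is just an example):

```shell
#!/bin/sh
# Demo of "build the list first, then reset the timestamp immediately".
SRC=$(mktemp -d)
STAMP="$SRC/.lastrun"      # mtime marks the previous run
LIST="$SRC/.tobackup"      # list of files this run should back up

touch -t 202001010000 "$STAMP"     # pretend the last run was long ago
echo "data" > "$SRC/A3.txt"        # a file created since then

# 1) Collect everything newer than the last run...
find "$SRC" -type f \( -name '*.txt' -o -name '*.log' \) \
     -newer "$STAMP" > "$LIST"
# 2) ...then immediately reset the timestamp, so anything created from now
#    on is caught by the next run (the small race window mentioned above).
touch "$STAMP"

cat "$LIST"
```

The actual compress-and-scp step would then loop over the contents of the list file.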