Continuous log file transfer to remote server

I have several production servers and one offline server. The production servers continuously generate new log files for my application; depending on the time of day, new files may be generated every few seconds, at other times every few hours. I would like the offline server to pull the log files from all the production servers and crunch the information from all those logs. I want to do this in near real time, meaning as soon as a production server finishes writing to a file and starts writing to a new one, I want to transfer the file that has just been completed. When the next file completes I want to copy that too, and so on, 24x7. I think a daemon process on the offline server that checks for the most recent file that has been closed and then transfers it would do the trick, but I don't know how to implement my idea. Can someone share ideas on how to do this?
thanks

Do you manage the scripts that generate the logs, or are they generated by some "blackbox-application-binaries"?
Are they located in a specific PATH (which)?
Do they have a naming convention (which)?

Checking when a file is closed means busting out strange applications like fuser or lsof. Why not just watch for when a newer one is created? Some silly pseudocode:

logdir=/path/to/log
dest=/wherever
trap 'rm -f /tmp/$$' EXIT

# Newest first; these log names contain no whitespace, so parsing ls is safe
last_updated=$(ls -t "$logdir" | head -n 1)

while true
do
        ls -t "$logdir" > /tmp/$$
        read latest < /tmp/$$
        if [ "$latest" != "$last_updated" ]
        then
                new=$latest
                exec 5< /tmp/$$
                # Skip the newest file -- it is still being written to
                read latest <&5
                read latest <&5
                # Copy everything newer than the last file we handled
                while [ "$latest" != "$last_updated" ]
                do
                        cp "$logdir/$latest" "$dest"
                        read latest <&5
                done
                exec 5<&-

                # The previously-newest file is complete by now; copy it too
                cp "$logdir/$last_updated" "$dest"
                last_updated=$new
        else
                sleep 1
        fi
done

You could potentially make it much smarter, not needing to run ls repeatedly and all that, by knowing more about the pattern this thing creates logfiles in.
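One way to avoid parsing ls output at all is a marker file plus find(1): copy everything modified since the marker, then bump the marker. A runnable sketch (the paths and the sample log name are made up for illustration; in real use this would sit in a loop and would still need to skip the file currently being written, as above):

```shell
# Hypothetical directories for the demo
logdir=/tmp/demo_logs
dest=/tmp/demo_dest
marker=/tmp/demo_marker

mkdir -p "$logdir" "$dest"
touch "$marker"
sleep 1
echo hello > "$logdir/abc165.22.34"       # simulate a log that just completed

# Copy everything modified after the marker, then reset the marker
find "$logdir" -type f -newer "$marker" -exec cp {} "$dest" \;
touch "$marker"
ls "$dest"                                # -> abc165.22.34
```

This also sidesteps the temp-file bookkeeping, since find does the "what changed since last time" comparison for you.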

You could also use NFS or somesuch to have the server create the files on the other machine in the first place, instead of having to continuously watch and copy.
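For the NFS route, the production-server side could look something like this (the hostname "offline-box", the export path, and the mount options are made-up examples, not a tested setup):

```
# /etc/fstab on a production server -- assumes the offline box
# ("offline-box") exports /srv/logs over NFS
offline-box:/srv/logs   /path/to/log   nfs   defaults,soft   0 0
```

The application then writes its logs straight onto the offline server's disk; "soft" keeps the writer from hanging forever if the offline box goes away, at the cost of possibly dropped writes.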

Logs are generated by an application. I think application uses logpipe mechanism. Yes they are all located in a specific path and path is specific to application.
They all follow a naming convention of abc165.22.34 where 165 is julian day (date +%j), and uses hour and minute time stamp so 22 is hour and 34 is minute and the value is picked when the new log file is opened.
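With that convention the watcher can predict file names instead of relying on mtime. A minimal sketch (the log_name helper and the "abc" prefix handling are my assumptions from the description above):

```shell
# Hypothetical helper: build a log file name under the
# abc<julian-day>.<hour>.<minute> convention described above.
log_name() {
        # $1=julian day, $2=hour, $3=minute; default to the current time
        printf 'abc%s.%s.%s\n' "${1:-$(date +%j)}" "${2:-$(date +%H)}" "${3:-$(date +%M)}"
}

log_name 165 22 34    # -> abc165.22.34
```

One caveat: if the hour and minute fields are zero-padded the way date +%H/%M pads them, names within a day sort correctly by plain string comparison, which would let the daemon use a name sort instead of a time sort; if they are not padded, stick with sorting by mtime.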