rsh help - getting the output of a remote script to the local server.

Hi,

I have a script that runs for an hour.
I have to run it on a remote server, and I need the output it produces there to decide between success and failure.
I run it through an Autosys job, which logs both outputs (file descriptors 1 and 2).
I use the commands
1) rsh <SERVER> 'nohup /tmp/xyz.ksh &'
2) rsh <SERVER> 'nohup /tmp/xyz.ksh >/tmp/ark.out 2>&1 &'

Command 1 serves my purpose, but it holds up the port. So if the script runs for 3 hours, the port will be in use for 3 hours, which is not desirable.

Command 2 releases the port, but leaves the output on the remote server, so my job doesn't know whether the script succeeded or failed there.

One way could be to rsh the output file back to the host server and process it there, roughly as sketched below.
I wanted to know if there is some functionality that releases the port and also reports the success or failure of the remotely executed script.
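Something like this is what I have in mind, assuming the script leaves a recognizable marker such as "ERROR" in its output (rcp is rsh's companion copy command; the check for remote completion is left out here):

rsh <SERVER> 'nohup /tmp/xyz.ksh >/tmp/ark.out 2>&1 &'
# ... once the remote script is known to have finished ...
rcp <SERVER>:/tmp/ark.out /tmp/ark.out
grep -q ERROR /tmp/ark.out && echo "remote script failed"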

Thanks

I suppose you are referring to http://www.unix.com/shell-programming-scripting/102097-using-mutiple-nohup-execute-multiple-commands.html. I don't mind at all that you ignored my advice concerning nohup, but my advice would have been different in any case had I known the ultimate goal.

Anyway, this line "the port will be in use for 3 hours which is not desirable" makes no sense at all. There is no port held up in such a case.

In this case, you need to run rsh so that the entire job on the remote server runs in the foreground. You then background the rsh process, but not until after checking the exit code:

rsh <SERVER> 'nohup /tmp/xyz.ksh 2>&1'  >output.job1 || echo job1 failed  &
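One caveat worth noting: many rsh implementations do not pass the remote command's exit status back to the local shell, so the || above may only catch failures of rsh itself. A common workaround, sketched here with illustrative file names, is to echo the status from the remote side and test the captured output locally:

( rsh <SERVER> 'nohup /tmp/xyz.ksh >/tmp/ark.out 2>&1; echo RC=$?' >/tmp/job1.rc
  grep -q '^RC=0$' /tmp/job1.rc || echo job1 failed ) &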

With this command,
"rsh <SERVER> 'nohup /tmp/xyz.ksh &'"
the rsh will wait until the output of the script "/tmp/xyz.ksh" has been generated.
So if the script takes 3 hours, the rsh takes 3 hours to complete; that is, the output of the script executed on the remote server is written to the host server over that whole period.
During this time the rsh is still active and causes a problem, since there is an open connection between the two servers, and I feel this uses a port.
I may be wrong, as I'm not that deep into Unix.

I have the problem solved temporarily: I put in a condition to check the "err" file produced remotely, and the Autosys job fails if it finds the "err" file populated.
But I would still like to know the solution, if any, for the problem I posted.
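Roughly what my temporary check looks like (the err file path here is illustrative). Since rsh may not hand back the remote exit status, the result of the test is echoed and checked locally:

rsh <SERVER> 'test -s /tmp/xyz.err && echo FAILED' | grep -q FAILED && echo "remote job failed"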

True, a port is used, but there are over 30,000 such ports available to programs. It is, in reality, nothing to worry about.

It's best to do it like this:

rsh <SERVER> /tmp/xyz.ksh  >locally-saved-report.out   &
wait
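If you have several servers, the same pattern scales: start one background rsh per server, and a single wait blocks until all of them have finished (host names below are placeholders):

for host in serverA serverB serverC    # placeholder host list
do
    rsh $host /tmp/xyz.ksh > "$host-report.out" &
done
wait
echo "all remote jobs finished"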

In our environment the SA has reduced the number of ports; that's where the problem arises.
Also, I don't have just one job:
there are 2 jobs on each of a total of 90 servers.
Each job runs some 5 rsh's, so in all somewhere around 1,000 rsh's. There are many more applications running on these servers.
Me blocking so many ports is a bad thing for the others, so I wanted to reduce the usage as much as I can.
As I said, I do have a temporary solution where I create the output on the remote server and check it later, but I need to know if there is any other way to get it directly after the remote run.

Look into a job queuing system like SGE or Torque. While these are meant for compute clusters, it makes sense to install and configure one in your case. You simply submit jobs to the queue, which determines an appropriate cluster host to run on (you can also pin certain jobs to certain hosts if needed), and when a job is done, you get an email with (optionally) the output of the command. Only a few ports are needed per machine, and it is a lot cleaner than rsh. I found SGE to be cleaner than Torque. Both are free and open source.
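For a flavor of it, a minimal SGE submission might look like this (the job name, log path, and mail address are made up for illustration):

qsub -N xyz_job -j y -o /shared/logs/xyz_job.out \
     -m e -M ops@example.com /tmp/xyz.ksh

The -m e flag requests mail when the job ends, and -j y merges stderr into the stdout log, so success or failure can be judged from one file.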