Forking and Pinging

Keep in mind that I haven't done Perl scripting for a LONG time, so I'm quite rusty.

This is what I would like to do:

  • using fork, create 3 or 4 processes to read 3 or 4 different text documents containing server names or IP addresses
  • in each of those processes, Perl will ping each of those server names or IP addresses to see if they are pingable
  • when the result is known, it will be written to a separate text file (or logfile)

I'm just wondering if this can be done or not (theoretically). I basically want to ping multiple things at the same time, using fork.
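It can be done, theoretically and practically. Here is a minimal sketch of the idea: the parent forks one child per input file, each child pings its hosts and writes to its own log, and the parent waits for all of them. The file names and host lists below are made up for the demonstration; substitute your real lists (and note that `-W 1` is the Linux flag for a one-second ping timeout, it differs on other systems).

```perl
#!/usr/bin/perl
# Sketch: one forked child per input file, each pinging its own list.
use strict;
use warnings;

# For demonstration only: create two small server lists.
my %lists = (
    'servers1.txt' => ['127.0.0.1'],
    'servers2.txt' => ['localhost'],
);
for my $file (keys %lists) {
    open my $fh, '>', $file or die "Cannot write $file: $!";
    print {$fh} "$_\n" for @{ $lists{$file} };
    close $fh;
}

my @pids;
for my $file (sort keys %lists) {
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {                      # child process
        open my $in,  '<', $file       or die "Cannot read $file: $!";
        open my $out, '>', "$file.log" or die "Cannot write $file.log: $!";
        while (my $host = <$in>) {
            chomp $host;
            next unless length $host;
            # One request, one-second timeout; the exit status says up/down.
            my $status = system("ping -c 1 -W 1 $host >/dev/null 2>&1");
            print {$out} "$host: ", ($status == 0 ? 'up' : 'down'), "\n";
        }
        exit 0;                           # child must exit, never fall through
    }
    push @pids, $pid;                     # parent keeps forking
}
waitpid $_, 0 for @pids;                  # wait for every child to finish
```

Each child writes to its own log file, which sidesteps the shared-output-file question discussed below.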

How about writing one ping script and passing the file name as a parameter?

Then you can have a wrapper script that runs the processes in the background:

ping_script file1 &
ping_script file2 &
....
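That `ping_script` could be a few lines of Perl. This is only a sketch: the name, the fallback demo file, and the log naming are my own choices, and the `-W 1` timeout flag is Linux-specific.

```perl
#!/usr/bin/perl
# Hypothetical ping_script: reads one host per line from the file named
# on the command line and logs whether each host answers a single ping.
use strict;
use warnings;

my $file = shift @ARGV;

# For demonstration only: with no argument, fall back to a sample file.
unless (defined $file) {
    $file = 'demo_servers.txt';
    open my $demo, '>', $file or die "Cannot write $file: $!";
    print {$demo} "127.0.0.1\n";
    close $demo;
}

open my $in,  '<', $file       or die "Cannot read $file: $!";
open my $out, '>', "$file.log" or die "Cannot write $file.log: $!";
while (my $host = <$in>) {
    chomp $host;
    next unless length $host;
    # One request, one-second timeout, output discarded.
    my $status = system("ping -c 1 -W 1 $host >/dev/null 2>&1");
    print {$out} "$host: ", ($status == 0 ? 'up' : 'down'), "\n";
}
close $in;
close $out;
```

Because each invocation logs to `<input file>.log`, the backgrounded copies never touch each other's output.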

Would that really work, though? To me, it sounds like the same script is being run 4 times simultaneously (... I wonder if that makes sense or not...).
The problem, as I see it, is that I want all of the results to go into one output text file. If the script were running as 4 different processes at the same time, would they be able to write into the same output file without errors?

Writing into the same file will be a problem for sure. But you can write to a different file based on the input parameter, and once all the processes finish, you can 'cat' them together into the final output file.
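The merge step is simple in Perl too. A sketch, with throwaway part-file names for the demonstration; in practice the parts are whatever logs the background processes wrote:

```perl
#!/usr/bin/perl
# Sketch of the merge step: concatenate the per-process logs into one
# final report (the Perl equivalent of 'cat part*.log > final.log').
use strict;
use warnings;

# Demo part files standing in for the children's logs.
my @parts = ('part1.log', 'part2.log');
open my $p1, '>', 'part1.log' or die $!; print {$p1} "hosta: up\n";   close $p1;
open my $p2, '>', 'part2.log' or die $!; print {$p2} "hostb: down\n"; close $p2;

open my $final, '>', 'final.log' or die "Cannot write final.log: $!";
for my $part (@parts) {
    open my $in, '<', $part or die "Cannot read $part: $!";
    print {$final} $_ while <$in>;   # copy every line into the final log
    close $in;
}
close $final;
```

Since nothing appends to `final.log` until every worker is done, there is no concurrent-write problem at all.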

Just an idea - I am sure there are better ways of doing it :slight_smile:

Actually, that sounds pretty good.
Just means more scripting for me to remember, hahaha.

What about this scenario, though?
Since pinging can take a while (especially with a lot of servers and IPs), reading through each server one at a time can be time-consuming.
What if all of this info was in one text file?
Would forking be recommended then if I had no choice but to read from one file only?

Not sure what you would gain by using fork if all the server names are in one file. If you have multiple files, then your pings can run in parallel.

Maybe it is time to do some actual testing :slight_smile:

If all the server names are in one file, then pinging them one by one will take a long time, especially if some of them are unreachable (which WILL happen). That is why I want to try forking, so that multiple servers can be pinged at the same time and the total runtime is reduced.

Writing a simple script like soleil4716 suggested would work fine and run in parallel. Using Perl and fork() is also possible, just slightly more complex. I suggest you limit your ping command to 3 requests and set a small timeout. Depending on how many processors you have, you probably don't want to create 1,000 concurrent pings, so keep it manageable: for example, run only 20 pings concurrently, and as each one completes kick off another so you don't tax the system.
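That "20 at a time" pool can be sketched with fork() and wait(): the parent keeps at most $MAX children in flight, and every time it reaps one it starts the next host. The host list, log name, and $MAX value below are placeholders; in practice @hosts would be read from your one big file.

```perl
#!/usr/bin/perl
# Sketch of a bounded worker pool over a single host list: at most $MAX
# pings run concurrently; as each child exits, another host is started.
use strict;
use warnings;

my $MAX   = 20;
my @hosts = ('127.0.0.1', 'localhost', '127.0.0.1');  # stand-in for the real list

my %pid2host;
open my $log, '>', 'pool.log' or die "Cannot write pool.log: $!";

# Reap one finished child and log its result (only the parent writes).
sub reap_one {
    my $pid = wait;
    return if $pid < 0;
    my $host = delete $pid2host{$pid};
    print {$log} "$host: ", ($? == 0 ? 'up' : 'down'), "\n";
}

for my $host (@hosts) {
    reap_one() if keys(%pid2host) >= $MAX;   # pool full: free a slot first
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: one request, one-second timeout (-W is the Linux flag).
        exec "ping -c 1 -W 1 $host >/dev/null 2>&1";
        exit 1;                              # only reached if exec fails
    }
    $pid2host{$pid} = $host;
}
reap_one() while keys %pid2host;             # drain the remaining children
close $log;
```

Letting only the parent write the log keeps the single output file safe without any locking.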