Updating a CSV file from multiple Perl scripts

Hi Friends,

I'm writing code to update a CSV file from multiple Perl scripts.
I have around 2000 Perl scripts that need to register their entries in a CSV file, so I'm inserting the following lines of code into all 2000 scripts.

open(FILE, ">>", "/software/app/TASKS/Batch_Job_Log_$rundate.csv")
    or die "Cannot open log file: $!";
sleep(60);
print FILE "$JobName\_$JobNumber, $rundate\n";
close(FILE);

The problem is that only around 500 of the 2000 jobs are making their entries in the CSV file, even though, as shown above, I'm adding a delay [sleep(60)] so that multiple Perl scripts can share the file.

Could any of you help me improve the above scenario, so that all 2000 Perl scripts update the CSV file?

Thanks and Regards / Lokesha R.

I think 'sleep(60);' is there to allow other Perl scripts to write to the same file, which may already be open in some other script. That should work fine, I believe.

Thanks for your reply, Ganapati.

I thought so too, but I don't know whether this is the right approach to allow multiple scripts to write to the same CSV file.

Hi.

Have you considered the advice in:

perldoc -q lock

cheers, drl

Thanks for the reply drl.

But I do not want to lock the CSV file; it should stay open for all the scripts to update their data. Please correct me if I'm misunderstanding your intention.

With Thanks and Regards / Lokesha R.

Hi.

I was thinking of locking, then unlocking, the file in the sense of avoiding simultaneous access to it, which could otherwise result in one script's changes overwriting another's, as described in File locking - Wikipedia, the free encyclopedia ... cheers, drl
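As a minimal sketch of what that looks like: each script takes an exclusive lock with flock before appending, and releases it at close. The lock is advisory, so it works only if every script uses it. The variable values and the shortened file path below are placeholders standing in for the real ones in the original snippet; LOCK_EX comes from the core Fcntl module.

```perl
use strict;
use warnings;
use Fcntl qw(:flock);    # LOCK_EX, LOCK_SH, LOCK_UN constants

# Hypothetical values standing in for the real job variables.
my $rundate   = "20240101";
my $JobName   = "SampleJob";
my $JobNumber = 42;

# Path shortened for illustration; the original scripts write to
# /software/app/TASKS/Batch_Job_Log_$rundate.csv
my $logfile = "Batch_Job_Log_$rundate.csv";

open(my $fh, ">>", $logfile) or die "Cannot open $logfile: $!";
flock($fh, LOCK_EX) or die "Cannot lock $logfile: $!";  # blocks until we hold the lock
print $fh "${JobName}_$JobNumber, $rundate\n";          # append this job's entry
close($fh) or die "Cannot close $logfile: $!";          # closing releases the lock
```

With the exclusive lock in place, the sleep(60) should no longer be needed: each script simply blocks in flock until the previous writer has finished, instead of hoping a fixed delay is long enough.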