Out of memory in Perl

I don't know whether this is a common error, but it seems my Perl program is running out of buffer memory. Inside a loop I am processing an array: reading its elements and displaying them (note: the WHOLE array is scanned once per iteration of the loop; the array was created by reading a file into it). I notice that as the number of repetitions of the loop increases, characters are progressively clipped off the end of the array elements (I am storing strings). The first time the loop runs, I get the elements exactly as I stored them, but by the last iteration the elements are noticeably clipped at the end. In fact, if I read the output carefully, I find that the clipping increases progressively with each iteration, which is ridiculous!

Can someone explain what happened here?

Perl rarely runs out of memory. Under normal circumstances, it is more likely that a badly written program using an inappropriate algorithm is causing the trouble.

Can you post a minimal example of how you loop over things? If you can quote a complete example that reproduces the bad behaviour, that will likely help us understand why your program does not perform well.

open (INFILE,"file1");
my @all_PL=<INFILE>;

close INFILE;

open (INFILE,"file2");
my @all_PL_additional_details=<INFILE>;
close INFILE;

my ($pl_line,$group_name,$file,$sd_line,$sd_group_name,$sd_file,$var);
foreach $pl_line (@all_PL)
{
chop($pl_line);
($group_name,$file)=reverse(split (/:/,$pl_line));
open (PLFILE,">>$TEMP_DIR/$file");
print PLFILE "something\n";

   foreach  $sd_line  \(@all\_PL\_additional_details\) 
   \{   chop\($sd_line\);
       \($sd\_group\_name,$sd_file\) = reverse\(split \(/:/,$sd_line\)\);
	   
                if \($sd\_group_name eq $group_name\) \{ 
		
		    open \(SDMATRIX,"$TEMP\_DIR/$sd_file"\);
		     my @sd_matrix=&lt;SDMATRIX&gt;;
		     close SDMATRIX;
			
	   		my @writable\_data=@sd_matrix[2-11];
			print PLFILE "something\\n";
			foreach $var \(@writable_data\) \{	print PLFILE "$var";\}
	        \}
   
   \}

close SDFILE;

  1. Holding the entire contents of a file in an array (a slurp) is likely to be inefficient for a huge file, or simply to consume a lot of memory. Try to rewrite your script so that each line is read as it is processed.

  2. In the loop you are opening PLFILE many times but never closing any of those handles. If you do a lot of I/O, you should close filehandles as soon as they are no longer needed (both points are sketched below).
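For illustration, here is a minimal sketch of both points combined; it assumes the same $TEMP_DIR variable and "group:file" record layout as in your code:

open (INFILE, "file1") or die "Cannot open file1: $!";
while (my $pl_line = <INFILE>) {        # read one line at a time instead of slurping
    chomp($pl_line);                    # remove a trailing newline, if present
    my ($group_name, $file) = reverse(split(/:/, $pl_line));

    open (PLFILE, ">>$TEMP_DIR/$file") or die "Cannot append to $TEMP_DIR/$file: $!";
    print PLFILE "something\n";
    close PLFILE;                       # release the handle as soon as it is done
}
close INFILE;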

As for "clipping", I have not experienced any of such behaviour myself.

Actually, I rewrote the parts where I was assigning the contents of a file to an array (I now process the contents with a while (<FILEHANDLE>) loop, but the error still occurs), but I had certainly missed the part about the open PLFILE handles - thanks!

And, yes, this is the first time I have faced such an error too... apparently an error which didn't generate any error message, which is even more ridiculous.

Hi,
This thread was started when I found an error that I wrongly attributed to a buffer overrun in Perl. I found the bug recently and thought I would mention it here for future reference. The progressive clipping of strings happened because I was using chop() to drop the trailing newline character whenever I read in a line, e.g.
chop($line);

Although this works fine most of the time, I didn't realise that it fails miserably for the last line of the file, which, being the last line, DOESN'T have a trailing newline character. So the call was actually clipping off a REAL character! And because chop() inside the loop modified the array elements themselves (foreach aliases its loop variable to the elements), the more the loop ran, the worse the condition became.
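A minimal demonstration of both effects:

my @lines = ("first\n", "last");       # the last line has no trailing newline

for my $pass (1 .. 3) {
    foreach my $line (@lines) {        # $line is an alias into @lines
        chop($line);                   # blindly removes the final character
    }
    print "pass $pass: '$lines[0]' '$lines[1]'\n";
}
# pass 1: 'first' 'las'   (the newline went, and so did a real character)
# pass 2: 'firs' 'la'     (every further pass clips one more character)
# pass 3: 'fir' 'l'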

Instead, I am now using this (which I believe is far safer):
$line =~ tr/\n//d;
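For reference, the conventional fix is chomp(), which removes a trailing newline only if one is present; the tr/// approach above goes further and deletes every newline in the string:

my $line = "last line";        # no trailing newline, as at the end of a file
chomp($line);                  # safe: removes "\n" only if one is present
print "'$line'\n";             # prints 'last line' (untouched)

my $other = "data\n";
$other =~ tr/\n//d;            # also safe: deletes all "\n" characters
print "'$other'\n";            # prints 'data'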

Needless to say, this bug would have been a nightmare if I had released my code into production!

Thanks to cbkihong for following up on the previous posts.