Perl - FileHandles that crash memory

I have a program that opens a file handle and reads a lot of logs. It uses globbing, e.g. ba* matches bananas.log, bank.log, etc. I have seven different name patterns, each matching several logs, which I run through a grep pattern. I have a subroutine for each match and an argument-less loop that processes them. When I run it I get this:

Out of memory!
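
Roughly, the structure is like this (a heavily simplified sketch; the handler names are made up, and the real script has seven patterns):

    # map each glob pattern to the subroutine that handles its logs
    my %handlers = (
        'ba*.log' => \&match_ba,
        'cr*.log' => \&match_cr,
        # ... five more patterns ...
    );

    for my $pat (keys %handlers) {
        $handlers{$pat}->($_) for glob $pat;
    }

    sub match_ba { ... }   # greps each ba* log for a pattern
    sub match_cr { ... }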

I went to the Perl docs and tried the suggestions, but it still doesn't work. Just wondering if anyone here has any experience with "Out of memory". Even if it isn't relevant to the above, it would help so we don't "reinvent the wheel" (even though that's what most of us do with scripts, hehe).

Likely this is a ulimit or an actual kernel resource limit rather than Perl itself.

You can easily run out of memory with nothing more than while (1) { push @foo, rand(); }. Measuring how much memory is used is somewhat tricky depending on your perspective, but PerlMonks has plenty of threads on it.
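
If you want to watch it happen, here is a rough sketch (Linux-only, since it reads /proc/$$/status; other platforms need a different probe) that prints the resident set size as the array grows:

    # push a million values per round and report memory after each round
    my @foo;
    for my $round (1 .. 5) {
        push @foo, rand() for 1 .. 1_000_000;
        open(my $st, '<', "/proc/$$/status") or die "no /proc here: $!";
        my ($rss) = grep { /^VmRSS/ } <$st>;
        close $st;
        print "after ", $round * 1_000_000, " pushes: $rss";
    }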

grep returns a list, which means the filehandle you feed it is read in list context, so the entire file is pulled into memory at once. If the files are big, you are simply running out of physical memory, because Perl will use all the memory the system makes available. In that case you probably do have to rewrite some code to use less memory, such as:

open(my $fh, '<', 'file') or die "can't open file: $!";
while (my $line = <$fh>) {
    # handle one line at a time instead of slurping the whole file
    if ($line =~ /whatever/) {
        # do something with the match
    }
}
close $fh;
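
For contrast, the version that blows up probably looks something like this (a guess, since the original code wasn't posted); the filehandle is read in list context, so every line is in memory before grep sees any of them:

    open(FH, 'file');
    my @hits = grep { /whatever/ } <FH>;   # whole file slurped into a list first
    close FH;
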
perl -e "fork while 1" &

That one-liner is the classic fork bomb. It used to exhaust all of the system's available process slots; most kernels now guard against it with per-user process limits.

You are all rock stars; grep was not needed at all, as mentioned. I'm smacking my head on the desk now.

Another problem with this kind of while { match } loop is that the regex is often rebuilt over and over. If you can build the regex outside the while, that will speed things up DRAMATICALLY.
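
A minimal sketch of the difference (the pattern and file name here are made up):

    my $pat = 'ERROR';          # imagine this arrives at run time
    my $re  = qr/$pat/;         # compiled once, outside the loop

    open(my $fh, '<', 'bananas.log') or die "can't open: $!";
    while (my $line = <$fh>) {
        print $line if $line =~ $re;    # no recompile on each iteration
    }
    close $fh;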