Error while script execution - 0403-029 there is not enough memory available now

Hi,

I am executing a shell script on an AIX box where I need to list all the files in the file system and then do some other tasks with them. I can do this successfully on HP-UX and Linux boxes, but I get the following error after 10-15 seconds when I try to execute the script on an AIX box.

0403-029 there is not enough memory available now

I have to search a string in the entire file system.

My code

for dir in $(ls -d /*)
do
    for file in $(find $dir -type f -print)
    do
        echo $file
        # search the string in the file once this runs successfully
    done
done

Your shell is running out of space trying to get a list of all files in the first directory you are processing with:

for file in $(find $dir -type f -print)

Do you really need to print the name of every regular file on your system? Or, can you just print the names of the files containing the text for which you are searching along with the lines that contain the matching text in those files?

Are you searching for fixed strings, for strings matching a basic regular expression, or for strings that match an extended regular expression?
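For illustration (a throwaway demo, not from the thread; the file and patterns are made up), the three matching modes behave differently on the same input:

```shell
# Hypothetical demo: a temp file with two lines, "a.b" and "aXb".
tmp=$(mktemp)
printf 'a.b\naXb\n' > "$tmp"

grep -F 'a.b' "$tmp"     # fixed string: only the literal "a.b" line matches
grep    'a.b' "$tmp"     # BRE: "." matches any character, so both lines match
grep -E 'aX|a\.' "$tmp"  # ERE: alternation is available; both lines match here

rm -f "$tmp"
```

The answer matters because fixed-string matching (`grep -F`) is both faster and safer when the search string happens to contain regular-expression metacharacters.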

PS: Are you intentionally excluding dot files found in the root directory on your system from your search?

If it is just to search all files for a string, wouldn't it make more sense to do something like:

find / -type f -exec /some/search/command {} \;

and then the command /some/search/command could report something sensible based on result.
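A minimal sketch of this approach, using grep as a stand-in for the search command and a throwaway demo directory instead of / so it is safe to run (the demo paths and the string "sysuser" are illustrative, taken from later in the thread):

```shell
#!/bin/sh
# Demo tree standing in for the real file system.
demo=$(mktemp -d)
printf 'hello sysuser\n' > "$demo/match.txt"
printf 'nothing here\n'  > "$demo/other.txt"

# find hands each file straight to grep, so the shell never builds the
# full file list in memory.  -l prints only the names of matching files;
# "{} +" batches many files per grep invocation, which is cheaper than "\;".
find "$demo" -type f -exec grep -l 'sysuser' {} +

rm -rf "$demo"
```

On the real system this becomes `find / -type f -exec grep -l 'string' {} +`.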

And since the script works on HP-UX and Linux - either they have more memory, fewer files, and/or they store temporary results differently than AIX.

I'll describe in more detail. I have to search for a string in the entire file system. I start by getting the list of directories at the root level, then I go into each directory and search in all the files. While doing the search I have to exclude a few directories and files as well. The code is given below.

root_dir="/"
mysearchstring="sysuser"

for dir in $(ls -d $root_dir*)
do
    for file in $(find $dir -type d \( -name 'sys' -o -name 'proc' -o -name '*.svn*' \
        -o -name '*.ssh*' -o -name '*.subversion*' -o -name '*.snapshot*' \
        -o -name '*crash*' -o -name '*dbatools*' -o -name 'opt' -o -name 'lic98*' \
        -o -name '*informix*' -o -name '*developer*' -o -name '*tmp*' -o -name '*temp*' \
        -o -name '*log*' -o -name '*spool*' -o -name '*perl_build*' -o -name '*.Z' \
        -o -name '*is*data*' -o -name '*shared1*' -o -name '*pi_inbound*' \
        -o -name '*backup*' -o -name '*archive*' \) -prune -o \
        -type f \( ! -name '*.gz' ! -name '*.Z' ! -name '*image*' ! -name '*.out' \
        ! -name '*.log*' ! -name '*.LOG*' ! -name '*.csv' ! -name '*.dat' \
        ! -name '*.rcv' ! -name '*temp*' ! -name '*.gif' ! -name '*.png' \
        ! -name '*.unl' ! -name '*tp_cleanup.*' ! -name '*.send' \) -print)
    do
        if [[ $(grep -l -i -e ${mysearchstring} "${file}") != "" ]]
        then
            echo $file
        fi
    done
done

Here I am excluding a few directories and files while doing the search. Also, if I do not use -print, I still get the not-enough-memory error.
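The memory problem comes from expanding `$(find ...)` into the for loop's word list. A minimal sketch of a pipe-based alternative that avoids holding the list in shell memory (demo paths, not from the thread; "sysuser" is the string used above):

```shell
#!/bin/sh
# Demo tree standing in for the real directories being scanned.
demo=$(mktemp -d)
printf 'sysuser lives here\n' > "$demo/hit.cfg"
printf 'no match\n'           > "$demo/miss.cfg"

# Pipe find into a while-read loop: file names are processed one at a
# time, so the shell never expands the whole list into memory.
find "$demo" -type f -print | while read -r file
do
    # -l: print the file name only; -i: case-insensitive, as in the posted loop
    grep -l -i 'sysuser' "$file"
done

rm -rf "$demo"
```

Note that `read -r` still mishandles file names containing newlines, but unlike the `for file in $(find ...)` form it copes with names containing spaces and never builds the full list in memory.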


The following should do exactly the same thing without running into memory constraints as long as you don't start it in a directory where expanding $root_dir* overflows ARG_MAX limits:

root_dir="/"    # Note that if a directory other than / is used here, it must include a terminating /
mysearchstring="sysuser"
find "$root_dir"* -type d \( -name 'sys' -o -name 'proc' -o -name '*.svn*' -o -name '*.ssh*' \
    -o -name '*.subversion*' -o -name '*.snapshot*' -o -name '*crash*' \
    -o -name '*dbatools*' -o -name 'opt' -o -name 'lic98*' -o -name '*informix*' \
    -o -name '*developer*' -o -name '*tmp*' -o -name '*temp*' -o -name '*log*' \
    -o -name '*spool*' -o -name '*perl_build*' -o -name '*.Z' -o -name '*is*data*' \
    -o -name '*shared1*' -o -name '*pi_inbound*' -o -name '*backup*' \
    -o -name '*archive*' \) -prune -o \
    -type f \( ! -name '*.gz' ! -name '*.Z' ! -name '*image*' ! -name '*.out' \
    ! -name '*.log*' ! -name '*.LOG*' ! -name '*.csv' ! -name '*.dat' ! -name '*.rcv' \
    ! -name '*temp*' ! -name '*.gif' ! -name '*.png' ! -name '*.unl' ! -name '*tp_cleanup.*' \
    ! -name '*.send' \) -exec grep -li -e "$mysearchstring" {} +

PS: With what you're doing here, I don't see any reason why changing find "$root_dir"* -type d ... in the above to just find "$root_dir" -type d ... would produce different results (and it avoids the possibility of an ARG_MAX limit error).


Well, for whatever reason, if it is not working on AIX as-is - change it.

The problem, I assume, is finding the file or files containing a certain string - not proving that one command can be used everywhere regardless of platform.

Break it into components - e.g., get a list of all the files you want to scan (skipping the -exec part), and then as a second command process those files: where the string is found, output the filename and/or repeat the search with output to stdout.
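A minimal sketch of that two-step split, with a throwaway demo tree standing in for the real paths and "sysuser" as the string from earlier in the thread:

```shell
#!/bin/sh
# Demo tree and temp list file; both are illustrative.
demo=$(mktemp -d)
list=$(mktemp)
printf 'sysuser\n' > "$demo/a.txt"
printf 'other\n'   > "$demo/b.txt"

# Step 1: build the candidate file list on disk, not in shell memory.
find "$demo" -type f -print > "$list"

# Step 2: process the list; report each listed file containing the string.
while read -r file
do
    grep -l 'sysuser' "$file"
done < "$list"

rm -f "$list"
rm -rf "$demo"
```

Keeping the list in a file also means it can be inspected or re-used if the second step fails partway through.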

My advice is not to spend too much time on find per se (and we have assumed it is AIX find, not a GNU find from an RPM package of findutils, or even my installp packaging).

If you are clever enough to write a single find command to do this task, I am sure it is child's play to break it into components.


One more thought - try prefixing the command with (or use export):

LDR_CNTRL=MAXDATA=0x80000000 find ...

or

export LDR_CNTRL=MAXDATA=0x80000000
find ...

The value 0x80000000 assigns 2 GB of data to the application, rather than the default 256 MB. This value can be enlarged, but hopefully 2 GB will be enough.

Thanks Don/Michael. Using the -exec option with the find command worked well, and now I don't get the out-of-memory error. The script executed fine.