Segmentation Fault (Core Dump) Error

Hi all,

I have a folder with some 28 files. I have a script that iteratively takes one file at a time from the folder and produces an output for that input file. Up to the 7th file there was no problem, but from the 8th file onwards I got this Segmentation Fault (Core Dump) error. A file named "Core" is also getting created. When I checked my log file, it showed the above-mentioned error on this line:

java -Xms1024m -Xmx1024m -classpath "jars" a.b.c -f propertiesFile

Can someone suggest a solution?

Thanks in advance.

There is not really enough here to make good suggestions. Have you tried running a debugger (jdb, or a tool built on the JDI) on the class to see where it is crashing? You need to see what the incoming data looks like just before the segfault, and where in the code the crash occurs.
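
For example, a run under jdb (the command-line debugger that ships with the JDK) might look like this; just a sketch, reusing the classpath and main class from your log line:

jdb -classpath "jars" a.b.c -f propertiesFile

Type "run" at the jdb prompt to start execution. If the problem is in the Java code, you will at least see how far the program gets before it dies.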

Segfaulting is usually the result of trying to store more data into a memory location than was allocated for it, or of touching memory the process does not own.
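
Since a core file is being written, it is also worth opening it with a native debugger to get a backtrace of the crashing thread. A sketch, assuming gdb is installed on the server and the dump file is named "core":

gdb "$(which java)" core     # load the dump against the java binary that produced it
ls hs_err_pid*.log           # HotSpot usually also writes a fatal-error log on a crash

The hs_err file, if present, names the failing frame and often makes the cause obvious.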

Hi,

I didn't try running a debugger. Let me describe the problem in more detail. I have some 20-odd .xls files with data in them; their combined size is less than 600 KB. I have a Unix script that iteratively picks one file at a time and calls a Java program, which performs some operations and writes an output text file.

Given below is the code snippet:

FILES="*"
for file in $FILES                                               # glob expansion; avoids parsing `ls`
do
    java -Xms512m -Xmx512m -classpath "jars" javaFile "$file"   # Line No. 4
done

I am passing the filename to the Java program so it can perform some operations. Out of the 20-odd files, the whole program runs perfectly up to the 9th file, after which I get an error at Line No. 4 for every file saying "Segmentation Fault (Core Dump)". I am running this script on a Unix server.
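
To see exactly which file first triggers the crash, the loop could record each file's exit status. A sketch built on the snippet above; "failures.log" is just an illustrative name:

for file in *
do
    java -Xms512m -Xmx512m -classpath "jars" javaFile "$file"
    status=$?
    if [ $status -ne 0 ]; then
        # a child killed by SIGSEGV typically exits with status 139 (128 + 11)
        echo "$file exited with status $status" >> failures.log
    fi
done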

I tried executing a script that takes an input file of 30 MB and generates an output file of the same size. The execution went well, and I got no segmentation fault in that scenario.

But with the smaller files I am getting this error. Can someone suggest an idea?

Thanks in advance.