Finding a specific pattern in thousands of files

Hi All,

I want to find a specific pattern in approximately 40,000 files on the Solaris platform. It is very heavy for me to grep for that pattern in each file individually.

Can anybody suggest a way to search for a specific (alphanumeric) pattern across these forty thousand files? Please note that out of the 40,000 files, only a few contain this specific pattern.

I would be really grateful; this would make my work much easier.

Thanks,
Aru

One way I can think of is to run multiple instances of grep in parallel, dividing the files among the instances.

If you are running this in a live environment, keep an eye on the number of instances and the load your system can take on.

Try using the find command coupled with the -exec option.
If you know the pattern of the filenames, you can narrow down the number of files to be searched
with wildcard characters.
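A minimal sketch of that approach, assuming the files sit under the current directory and using 'N1cd56', the example pattern Aru gives later in the thread:

```shell
# -exec ... {} + hands find's results to grep in large batches
# (POSIX; with a very old find you would use '{} \;', one grep per file).
# -l prints only the names of the files that match.
find . -type f -exec grep -l 'N1cd56' {} +
```

Adjust the starting directory (and add a -name wildcard) to narrow the search.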

The request was to search for a pattern within the files, and the user is not sure which files contain the search pattern.
The only inputs are the filenames and the pattern to be searched for within the files.

Moreover, find with -exec is not an effective way for large runs;
use find piped to xargs instead.
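As a sketch of the find-piped-to-xargs idea (assuming the files are under the current directory; 'N1cd56' is the example pattern Aru gives below):

```shell
# find emits one filename per line; xargs packs as many names as fit onto
# each grep command line, so grep is forked a handful of times rather than
# once per file. -l prints just the names of the files that match.
find . -type f -print | xargs grep -l 'N1cd56'
```

Note that xargs splits its input on whitespace, so this assumes the filenames contain no spaces (true for names like 1x ... 40000x).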

Ya matrixmadhan, you are right. I am not sure which of those 40,000 files contain the search pattern.

Suppose that I have files 1x, 2x, 3x, ..., 40000x.
How would you use find piped to xargs to search for the pattern "N1cd56" in these 40,000 files?

Please tell me the usage.

Thanks,
Aru

That was just a comparison between what I said about find with -exec and find with xargs.

What I suggested was:

instance 1:
for files in <file1 ... file10000>
do
    grep '<pattern>' "$files"
done

instance 2:
for files in <file10001 ... file20000>
do
    grep '<pattern>' "$files"
done

and similarly for instances 3 and 4.
You can keep the list of all the filenames in a file and feed the required range (file1 ... file10000) to each instance.

Hope I am clear.
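Concretely, the splitting can be sketched like this (the /tmp paths, the 10000-line chunk size, and the use of split are my assumptions; 'N1cd56' is the pattern Aru mentioned above):

```shell
# Build the master list of filenames, then split it into 10000-line
# chunks (split names them chunk_aa, chunk_ab, ...).
find . -type f -print > /tmp/all_files
split -l 10000 /tmp/all_files /tmp/chunk_

# One background grep pipeline per chunk; -l lists matching filenames only.
for list in /tmp/chunk_??
do
    xargs grep -l 'N1cd56' < "$list" > "$list.hits" &
done
wait                          # block until every instance finishes

cat /tmp/chunk_??.hits        # the files that contain the pattern
```

Again this assumes filenames without whitespace; watch the system load before raising the number of parallel instances.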

Let me try it out...

Thnx.
Aru