Searching a particular string pattern in 10000 files

Problem Statement:-

I need to search for a particular string pattern in around `10000 files` and find the records that contain that pattern. I can use `grep` here, but it is taking a lot of time.

Below is the command I am using to search for the pattern after unzipping the `dat.gz` files:

    gzcat /data/newfolder/real-time-newdata/*_20120809_0_*.gz | grep 'b295ed051380a47a2f65fb75ff0d7aa7^]3^]-1'

If I simply count how many records there are after unzipping the above `dat.gz` files (`wc -l` counts lines, i.e. records):

    gzcat /data/newfolder/real-time-newdata/*_20120809_0_*.gz | wc -l

I get around `10000`. I need to search all of these records for the above string pattern and print the ones that contain it.

What is the best approach here? Should I take `100 files` at a time and search for the pattern in those `100 files` in parallel?

Note:

I am running SunOS

    bash-3.00$ uname -a
    SunOS lvsaishdc3in0001 5.10 Generic_142901-02 i86pc i386 i86pc
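The chunked idea from the question can be sketched with `xargs`. This is only a sketch under two assumptions: it uses GNU `xargs` (the `-P` parallel-jobs flag is not in the stock Solaris `/usr/bin/xargs`), and it builds a few throwaway `.gz` files standing in for the real data set, with `needle` standing in for the real pattern.

```shell
# Sketch: batch the .gz files and run several zgrep processes at once.
# ASSUMPTION: GNU xargs (-n batches, -P parallel jobs); stock Solaris
# xargs has no -P. Sample files stand in for the real 10000 records.
dir=$(mktemp -d)
for i in 1 2 3 4; do
  printf 'record-%d needle\nrecord-%d other\n' "$i" "$i" | gzip > "$dir/part_$i.gz"
done
# 2 files per batch, up to 2 batches running in parallel;
# -h suppresses the "filename:" prefix on each match.
ls "$dir"/*.gz | xargs -n 2 -P 2 zgrep -h 'needle' > "$dir/matches.txt"
matches=$(wc -l < "$dir/matches.txt")
echo "$matches"    # one match per sample file, so 4
rm -rf "$dir"
```

Note that with `-P` the batches' output can interleave, so this fits best when you only need the matching records, not a stable ordering.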

Try this perl one-liner and see if it is any faster:

    perl -ne 'print if (m/search_string/)' test*    # searches all files whose names begin with "test"

---------- Post updated 2012-08-16 at 12:16 AM ---------- Previous update was 2012-08-15 at 10:03 PM ----------

Also try out `sed` over the unzipped files:

    sed -n '/search_string/p' test*    # searches all files whose names begin with "test"

Try this one as well:

    find . -name "*_20120809_0_*.gz" -print | xargs zgrep 'b295ed051380a47a2f65fb75ff0d7aa7^]3^]-1'

Just a note: if you have `zgrep`, you probably also have `zfgrep`, which is faster but matches only fixed strings, not regular expressions. If you are just looking for a literal string, go with `[z]fgrep`.
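To illustrate the difference: `fgrep` (equivalently `grep -F`) treats the pattern as a literal string, so regex metacharacters such as `.` lose their special meaning and no regex engine runs at all. A small sketch:

```shell
# Sketch: with -F the dot is a literal dot, not "any character".
printf 'abc\na.c\n' | grep -F 'a.c'
# prints: a.c   (plain grep 'a.c' would also match "abc")
```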