find & grep


I would like to know which files contain a certain string. If I use 'grep "string" *' only the working directory is being searched. I also want to search the subdirectories. When I use 'find . -type f -print |xargs grep "string" > dev/null' I get the message 'xargs: missing quote?'. What's up?
Should I use another command?

Thanks in advance.
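A hedged note on the error itself: 'xargs: missing quote?' usually means some filename contains a quote character, which xargs tries to interpret. If GNU find and xargs are available (an assumption; the stock SunOS 4.1 tools lack these options), passing NUL-separated names sidesteps the quoting entirely. The paths below are made up for the demo:

```shell
# Sketch: a filename with a quote in it breaks plain 'find | xargs',
# but -print0 / -0 (GNU extensions, assumed) handles it.
mkdir -p /tmp/xdemo
echo "string" > "/tmp/xdemo/it's here.txt"
find /tmp/xdemo -type f -print0 | xargs -0 grep -l "string"
```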


grep -r "string" * will read all files under each directory recursively.

Which version of Unix does this work on? I have never encountered a grep that works like this. A portable solution is:

find . -exec grep string {} /dev/null \;
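A side note on the /dev/null at the end, sketched with made-up demo paths: given only one file argument, grep prints bare matching lines; the extra /dev/null gives it a second file, which forces grep to prefix each match with the filename.

```shell
# Sketch of why /dev/null appears in the -exec form above
# (demo paths are assumptions, not from the thread).
mkdir -p /tmp/grepdemo
echo "string here" > /tmp/grepdemo/a.txt
# With one file argument, grep omits the filename:
grep string /tmp/grepdemo/a.txt
# With /dev/null as a second argument, grep prefixes the filename:
grep string /tmp/grepdemo/a.txt /dev/null
```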

The beauty of UNIX is the 10,000 ways to do these things. Here is how I do it (one way) when I want to go down the tree:

find * | grep string

It is not the typical way, but I know many old UNIX people who do it this way. Guess we are too lazy to think or type :slight_smile:

For example, if we want to search the entire file system for 'string' we would do this:

find / | grep string

This way we find all the files whose names contain 'string'. If we did not care about case then:

find / | grep -i string

I also like using egrep with regular expression matching for more tricky or complex matches and searches.

However, this will only find you the filenames that contain the string. It will not search for the string *within* the file, which is what the original poster was asking about.

Doing a find and then grepping will certainly find the string within a string (in the file name), and it works much better and faster. You are right PxT, this will not search for the string within the file.

find / | grep lib

You will find a zillion occurrences of the string 'lib' within file names. Here is a small sample and I've not even refined the grep or egrep (and just captured small output).


Yes, it does not search the actual file. I normally use Perl for that and not xargs. For some strange reason, xargs and I don't gel.... when I have to search files and strings within lots of files I use command-line Perl.

[Edited by Neo on 01-31-2001 at 08:52 PM]

grep -r "string" * works pretty well on Linux(RH).

However, this is not a comprehensive method, though I think it could be used for what the poster mentioned, since he is using '*' (which makes grep read all the files and directories in the current directory recursively).

But if he uses grep -r "www" html* it will only recurse into subdirectories whose names start with "html", so a subdirectory whose name does not start with "html" will be skipped even if it contains a file with the string "www".
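A hedged sketch of that difference (GNU grep assumed; demo paths invented): passing the directory itself instead of a shell glob means no subdirectory is skipped.

```shell
# Sketch: 'grep -r pattern html*' only descends into names matching
# the glob; 'grep -r pattern DIR' descends into everything under DIR.
mkdir -p /tmp/rdemo/html /tmp/rdemo/other
echo "www.example.com" > /tmp/rdemo/other/page.txt
grep -r "www" /tmp/rdemo
# 'grep -r "www" /tmp/rdemo/html*' would have missed other/page.txt.
```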

my grep version: GNU grep 2.3

If I use this I get the message 'grep: illegal option --r'.

It's Sun 4.1, maybe that explains it...

Interesting. This must be a new feature. I have GNU grep v2.0 on my Linux box. Thanks for the info!

As I recall, there is often a problem using

find . -exec grep string {} /dev/null \; 

I haven't used this in a while, but I recall getting errors because this usage of find also 'finds' directories, binaries, and just about everything else under the sun. Because it finds binaries, directories, etc. it attempts to search them and gets pretty messy.

find . -type f -exec grep string {} /dev/null \;

Will find only regular files, and not directories, but it does not stop grep from searching large binaries, as I recall. That is why I usually don't use this combo for grepping strings, and I forgot about this one. I don't recall a switch in find to stop it from 'finding' binary files. Perhaps there is?
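I don't know of such a find switch either, but as a hedged sketch: GNU grep's -I option (assuming GNU grep is installed) treats binary files as non-matching, and file(1) can filter the list before grep ever sees it. Demo paths below are invented:

```shell
# Sketch: two ways to keep grep away from binaries.
mkdir -p /tmp/bindemo
printf 'needle\n' > /tmp/bindemo/notes.txt
printf '\000\001needle' > /tmp/bindemo/prog.bin
# GNU grep (assumed): -I makes binary files count as non-matching.
grep -I -r needle /tmp/bindemo
# Let file(1) pick out the text files first:
find /tmp/bindemo -type f -exec file {} \; | grep text | awk -F: '{print $1}'
```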

Well I usually narrow it down a bit. Something like:

find . -name \*.h -exec grep <etc>

Another good one is something like this:

grep <string> `file * | egrep 'script|text' | awk -F: '{print $1}'`

Not perfect, but I've used it a few times. A little more complicated to do recursively this way, though...
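A hedged sketch of a recursive variant of the same trick (demo paths invented), reusing the /dev/null idea from earlier in the thread so grep still prints the filename:

```shell
# Sketch: recursive file(1)-filtered grep.
mkdir -p /tmp/fdemo/sub
echo "string inside" > /tmp/fdemo/sub/deep.txt
find /tmp/fdemo -type f -exec file {} \; \
    | egrep 'script|text' \
    | awk -F: '{print $1}' \
    | xargs grep string /dev/null
```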

[Edited by PxT on 02-01-2001 at 11:22 AM]