Finding files in the current directory when it contains hundreds of thousands of files

Hi All

I was wondering what the most efficient way is to find files in the current directory (which may contain hundreds of thousands of files) that match a specified file type and are of a certain age.

I have experimented with the find command in Unix, but it also searches all the subdirectories. I have found references on the internet that use maxdepth/depth to limit the search to the current directory, but the flavour of Unix I am using does not have the -maxdepth flag. I have read the man page for find and cannot see any way to limit the search to the current directory.
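For what it's worth, on versions of find that lack -maxdepth, the portable trick is to -prune every entry below the starting directory so find never descends into subdirectories. A minimal sketch, where the '*.log' pattern and the 30-day age are placeholder assumptions:

    # Prune every entry below . so find never descends into
    # subdirectories, then test what is left for type, name and age:
    # regular files matching *.log modified more than 30 days ago.
    find . ! -name . -prune -type f -name '*.log' -mtime +30 -print

All of these primaries (-prune, -type, -name, -mtime) are in plain old POSIX/SVR4 find, so this should work even where -maxdepth is missing.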

Can I use the Unix ls command and pipe the output to another built-in Unix command that can check the age of each file (remembering it has to handle hundreds of thousands of files)?
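Note that parsing ls -l output is fragile (filenames with spaces, locale-dependent dates), and a glob like ls *.log can overflow the shell's argument-list limit at those file counts; find reads the directory itself and avoids both problems. If the age test needs a fixed cutoff date rather than N days, one portable trick is a reference file stamped with touch plus find's -newer test. A sketch, with the date and pattern as placeholder assumptions:

    # Reference file stamped 2024-01-01 00:00 (placeholder date)
    touch -t 202401010000 /tmp/cutoff
    # Files in the current directory only, not newer than the reference
    find . ! -name . -prune -type f -name '*.log' ! -newer /tmp/cutoff -print
    rm /tmp/cutoff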

I am having a lot of trouble with this and have not come up with a simple solution.

Can someone give me a sample Unix script showing what to do?

See: advanced/complex uses of the find command

Do you have the ability to use Perl or Python? You might also want to check whether your copy of ls has the --full-time flag (GNU ls has it), which could help if you're doing a shell script.
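If it does have it, the full ISO timestamp at least sorts and compares as a plain string; a rough sketch (the .txt pattern and cutoff date are assumptions, and this still breaks on filenames containing spaces, so find is generally safer):

    # GNU ls: --full-time puts an ISO date in column 6, which awk can
    # compare as a string; $NF is the filename (no spaces assumed).
    ls -l --full-time | awk '$NF ~ /\.txt$/ && $6 < "2024-01-01" { print $NF }'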

Also, what version of Unix are you on? Sometimes -maxdepth is there, but under a different flag name.

Try "which perl" and "which python" to see if you have those. You could easily slap together a little script to do this.