directory full?

I'm not a unix admin, just fell into support, so I may be asking a real duh question.

Client runs a PeopleSoft HR/Payroll system. The batch server runs on HP-UX 11.11 (PA-RISC).
When a batch process runs, output is written to a "staging" directory. When the job finishes, successfully or not, the output, logs, etc. are "wrapped up" in HTML and copied to a "report repository".

The report repository output can be accessed from an application page in the browser.

The staging directory has never been purged. Since each process's output is uniquely named, it just keeps adding files to the directory.

Yesterday all jobs began to fail. The only app message was a vague "http transfer failure". Had the unix admins on IM and we looked at this and that. Checked filesystem space (68% used). Looked for permission changes and such.

In my endeavors, I realized that the directory had a ginormous number of files. On a wild-ass hunch, I deleted some of the oldest ones, and jobs began to succeed again.

So, is there some kind of limit to the number of files in a single directory? Is it a kernel parm? Is it an OS limit? Was it something else entirely?
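For anyone diagnosing the same thing: here's a quick way to see how bloated a directory has become without tripping the shell's argument limit. The path below is made up for the demo; point it at your real staging directory.

```shell
#!/bin/sh
# Hypothetical staging path for the demo -- substitute your real one.
STAGING=${STAGING:-/tmp/staging_demo}
mkdir -p "$STAGING"

# Count entries by piping names one per line; unlike 'ls *',
# no glob is expanded, so the kernel's argument limit never applies.
ls "$STAGING" | wc -l

# The directory file itself grows as entries are added and, on
# many filesystems, never shrinks -- a large size here is a clue.
ls -ld "$STAGING"
```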

You know you need some cleaning when ls * gives you an error message like "too much arguments"...
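That error is the shell hitting the kernel's ARG_MAX limit: with "ls *" the shell expands every filename into the argument list of a single exec, and past the limit the kernel refuses. You can query the limit with getconf (the exact value varies by OS; if memory serves, HP-UX 11.x used around 2 MB):

```shell
#!/bin/sh
# Maximum bytes of arguments plus environment that one exec()
# call may carry. When 'ls *' expands past this, the shell
# reports an error ("Arg list too long" on most systems).
getconf ARG_MAX
```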
I don't know of a hard limit as such, but I do know that enormous quantities of files in one directory can cause big problems (like >800,000 files in /tmp...), and it's not easy to clean up afterwards ("too much arguments" again...). So if there is a need to keep the files, move them regularly into a sub-directory.
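To expand on that: find walks the directory itself instead of expanding a glob, so it works no matter how many files there are. A sketch of a periodic cleanup along those lines -- the path, the 30-day cutoff, and the archive naming are just examples:

```shell
#!/bin/sh
# Hypothetical staging path -- substitute your real one.
STAGING=${STAGING:-/tmp/staging_demo}

# Delete output older than 30 days. '-exec ... \;' runs rm once
# per file, so the argument limit is never an issue (GNU and
# newer POSIX finds also accept '{} +' to batch arguments safely).
find "$STAGING" -type f -mtime +30 -exec rm -f {} \;

# Or, if the files must be kept, sweep them into a dated
# sub-directory so the main directory stays small. The '/.' plus
# -prune idiom limits find to the top level portably (old HP-UX
# find has no -maxdepth).
ARCHIVE="$STAGING/archive-$(date +%Y%m)"
mkdir -p "$ARCHIVE"
find "$STAGING/." ! -name . -prune -type f -exec mv {} "$ARCHIVE"/ \;
```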