Hm, yes, probably. The software version is 1.0,
but it's quite expensive, and I'm not sure we can ask for an update.
I was hoping there was some easy fix, or perhaps that Solaris sometimes reports 100% usage even when that isn't really the case (the server worked for months under this load), so that we could simply ignore it.
From your description this is almost certainly bad programming. The program, IMO, busy-polls the directory for the required file with no delay at all, instead of checking once a second or waiting for an OS notification. And no, you can't add that functionality afterwards yourself.
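For contrast, here is what a well-behaved watcher looks like: it polls with a delay between checks. A minimal sketch in Python (the path, interval, and timeout are made up for illustration):

```python
import os
import time

def wait_for_file(path, interval=1.0, timeout=5.0):
    """Poll for `path` once per `interval` seconds instead of spinning.

    The sleep is exactly what the buggy program appears to be missing:
    without it, the loop re-checks the directory as fast as the CPU
    allows and pegs one core at 100%.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(interval)   # yield the CPU between checks
    return False
```

In the real program the same effect would come from a one-line sleep in its polling loop, or better, an OS notification mechanism (on newer Solaris releases, event ports with `PORT_SOURCE_FILE`) so the process sleeps until the directory actually changes.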
But if it's that expensive, don't just ask for an update, demand one: this is certainly a bug.
If you're open to hacks as a workaround, it would probably be easy to slow your process down when it has nothing to process. There are several approaches, such as an interposing library (LD_PRELOAD) or a DTrace script.
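Another crude but effective hack along those lines is to duty-cycle the process from the outside with SIGSTOP/SIGCONT, so it only gets the CPU a fraction of the time. A sketch in Python (the PID and the run/stop ratio are assumptions; on Solaris you could get the same effect with the `pstop`/`prun` proc tools in a shell loop):

```python
import os
import signal
import time

def throttle(pid, run=0.1, stop=0.9, cycles=None):
    """Cap a busy-looping process by freezing it `stop` seconds out of
    every `run + stop` seconds (roughly 10% CPU with the defaults).

    Works on any POSIX system and needs no cooperation from the target.
    `cycles=None` means throttle until the target exits.
    """
    n = 0
    while cycles is None or n < cycles:
        try:
            os.kill(pid, signal.SIGSTOP)   # freeze the process
            time.sleep(stop)
            os.kill(pid, signal.SIGCONT)   # let it run briefly
            time.sleep(run)
        except ProcessLookupError:         # target exited; stop throttling
            break
        n += 1
```

The obvious downside is added latency: while the process is stopped it cannot notice a newly arrived file, so the run/stop ratio is a trade-off between CPU usage and responsiveness.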
If you can get any sort of return code when the file(s) are uploaded, you could write a simple script that kills the program once a file has been uploaded, then build a cron job that restarts it every minute if it isn't already running.
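The "start it from cron if not already running" half is a standard pidfile check. A hypothetical sketch in Python (the pidfile path and program name are invented), invoked from a crontab entry such as `* * * * * /usr/local/bin/restart_watcher.py`:

```python
import os
import subprocess

def already_running(pidfile):
    """Return True if the PID recorded in `pidfile` is still alive."""
    try:
        with open(pidfile) as f:
            pid = int(f.read().strip())
    except (FileNotFoundError, ValueError):
        return False
    try:
        os.kill(pid, 0)          # signal 0 checks existence, sends nothing
    except ProcessLookupError:
        return False
    except PermissionError:
        return True              # PID exists but belongs to another user
    return True

def start_if_needed(pidfile, argv):
    """Spawn `argv` and record its PID, unless it is already running."""
    if already_running(pidfile):
        return None
    proc = subprocess.Popen(argv)
    with open(pidfile, "w") as f:
        f.write(str(proc.pid))
    return proc
```

The kill script would be the mirror image: read the pidfile and send SIGTERM once the upload's return code comes back.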
Dirt cheap, and it gets things done faster than a monkey coder would.
Well, if the OP capped the application at, say, 30%, it shouldn't hurt, since during normal operation it doesn't exceed 20% CPU utilization. Of course, it's only a temporary solution.