I wondered why you were giving it ls -l, but it didn't occur to me... Well, they're not being ordered by date because find executes ls once per file: 99,999 separate invocations for 99,999 files here. Sorting a list one file long just leaves you where you started.
Try '+' instead of ';', which feeds as many files into a single ls invocation as will fit.
Unfortunately, with many thousands of files, find may have to split the list into several chunks, one ls run per chunk, so the combined output would again not be sorted by date overall.
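A sketch of the '+' form, using a throwaway directory in place of your real path:

```shell
# Create a small demo tree, then hand every file to one ls invocation.
dir=$(mktemp -d)
touch -t 202001010000 "$dir/old"
touch "$dir/new"
# '+' appends as many pathnames as fit on a single command line,
# so ls -lrt can sort the whole batch by modification time at once.
find "$dir" -type f -exec ls -lrt {} +
rm -rf "$dir"
```

Here old is listed before new because ls, not find, is doing the date ordering.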
I believe Solaris find has this feature but am not 100% positive. If it doesn't, something may have to be kludged with xargs for roughly the same effect:
find ... | xargs ls -lrt
This will not work if any of the filenames have spaces or quotes in them.
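If GNU find and xargs are available (an assumption; a stock Solaris xargs may not support these flags), NUL-delimited names avoid that problem:

```shell
# -print0 / -0 separate names with NUL bytes, so spaces and quotes
# in filenames pass through the pipe intact.
dir=$(mktemp -d)
touch "$dir/plain" "$dir/with space" "$dir/it's quoted"
find "$dir" -type f -print0 | xargs -0 ls -lrt
rm -rf "$dir"
```

The chunk-splitting caveat above applies to xargs just the same.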
Your code executes ls once for each file that is newer than the file named by <file_name>, and if one of those files is a directory, ls will list the contents of that directory rather than the directory entry itself.
Note, however, that this will not work if the name given by <file_name> contains any whitespace, or if some older file's name starts with that name immediately followed by a whitespace character. I assume you can work around this limitation.
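Both problems can be narrowed with find's own tests: -type f keeps directories out of the list entirely, and ls -d would show a directory entry itself rather than its contents. A sketch, with a scratch file standing in for <file_name>:

```shell
dir=$(mktemp -d)
touch -t 202001010000 "$dir/ref"   # stand-in for <file_name>
touch "$dir/newer"
mkdir "$dir/subdir"
# -type f excludes subdir; -newer keeps only files modified after ref;
# -d makes ls list a directory argument as an entry, not its contents.
find "$dir" -type f -newer "$dir/ref" -exec ls -ld {} +
rm -rf "$dir"
```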
If perl is going to be used, there's really not much point in using find, sort, or awk: perl itself can traverse directory trees, sort lists, extract fields, and so on.
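A minimal one-liner sketch of that idea, doing the traversal, the mtime lookup, and the oldest-first sort entirely in perl (the target directory here is just the current one):

```shell
# File::Find walks the tree, (stat)[9] is each file's mtime,
# and perl's sort orders the result oldest-first.
perl -MFile::Find -e '
    my @files;
    find(sub { push @files, [$File::Find::name, (stat)[9]] if -f }, shift);
    print "$_->[0]\n" for sort { $a->[1] <=> $b->[1] } @files;
' .
```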
A similar but more efficient approach to your suggestion would be to -exec stat ... instead. GNU and Solaris both provide a stat utility, but they require different options to accomplish the same task, so the script can test which platform it is on before calling find and pick the matching syntax.
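With the GNU syntax, for example (Solaris would need its own format option, which I won't guess at here), the epoch mtime can be prefixed to each name, sorted once over the whole stream, and stripped off again:

```shell
dir=$(mktemp -d)
touch -t 202001010000 "$dir/old"
touch "$dir/new"
# GNU stat: %Y is the epoch mtime, %n the name.  Because the single
# sort sees the whole stream, the result stays ordered even when
# find splits the file list across several stat invocations.
find "$dir" -type f -exec stat -c '%Y %n' {} + | sort -n | cut -d' ' -f2-
rm -rf "$dir"
```

Note that, unlike ls -lrt fed a chunked list, this stays correctly ordered no matter how many batches find produces.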