Problem running multiple jobs with qsub

Hello,
I have a Perl script that reads one input file and writes its output to a file. I would like to run this script on several input files, submitting each run with qsub, something like this:

Input files:

FileListDssp.txt.numaa
FileListDssp.txt.numab
FileListDssp.txt.numac
etc..

and the code:

[tcsh shell]
foreach file ( FileListDssp.txt.num* )
    qsub -cwd PerlScript.pl $file
    ## PerlScript.pl writes its output to Output.txt
    echo DONE $file >> Status.txt  ## record that this job was submitted
end

If I do this, multiple jobs are submitted, which is fine, but because they all append to the same Output.txt concurrently, the output is jumbled and in no particular order.
Accordingly, I want the loop to execute each qsub job one at a time, so the output comes out in order.

Can anyone kindly advise as to how to use qhold or qrls to prevent a new job from executing until the previous one is done? And how do I incorporate them into the loop?

OR, do I need to run all the jobs and then develop a different loop with qhold and qrls? How would I do that?
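(For anyone with the same question: on SGE-style systems, qsub also has a `-hold_jid` option that makes a job wait for a named earlier job, which avoids juggling qhold/qrls by hand. Below is a sketch in plain sh rather than tcsh; the job-ID parsing assumes SGE's usual "Your job <id> (...) has been submitted" message, so check `man qsub` on your own system.)

```shell
#!/bin/sh
# Sketch: chain the jobs so each one holds until the previous job finishes.
prev=""
for file in FileListDssp.txt.num*; do
    if [ -z "$prev" ]; then
        out=$(qsub -cwd PerlScript.pl "$file")
    else
        # -hold_jid: do not start until job $prev has completed (SGE option)
        out=$(qsub -cwd -hold_jid "$prev" PerlScript.pl "$file")
    fi
    # Extract the job ID from "Your job <id> (...) has been submitted"
    prev=$(echo "$out" | awk '{print $3}')
    echo "DONE $file (job $prev)" >> Status.txt
done
```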

Your advice is greatly appreciated!
Thanks,
DG

To anyone who reads this and has the same problem,
I've figured it out...

The easiest way is to let PerlScript.pl print to standard output instead of writing to a shared file. If you just use the standard print command in Perl without opening a named output file, the queueing system captures each job's stdout in a file named PerlScript.pl.o<Job Number>.
As such, if you run 50 jobs at the same time you will end up with 50 output files, one for each job.

You can then run cat PerlScript.pl.o* > FinalOutput.txt, and the result comes out ordered, because the shell expands the glob in sorted filename order, which tracks the job number.
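One caveat: glob expansion is lexicographic, so once the job IDs cross a digit boundary, job 100 would sort before job 99. A sketch of a numerically ordered concatenation, assuming (as with these filenames) that the only "o" in the name is the one before the job ID:

```shell
# Concatenate the per-job output files in numeric job-ID order.
# Plain glob order is lexicographic, so "o100" would otherwise
# come before "o99" when job IDs differ in length.
cat $(ls PerlScript.pl.o* | sort -t o -k 2 -n) > FinalOutput.txt
```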