PHP looked nice for the first three, but then you got into batch. Maybe put that file into a DB table, process it, and select out the new file? Or can the Unix server push the file so it gets processed and returned, perhaps with a web query?
I don't think PHP would be good for me, since the system I am working on does not have a PHP runtime installed.
We are talking about a huge file here, more than a million records. Loading that into a temporary table would only be worthwhile if I had a lot more operations to carry out on the data. All I do here is a simple comparison against a column and prepare the new records based on it.
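A single comparison like that can be done as one streaming pass over the file, with no temp table and constant memory. Here is a minimal sketch in Python; the pipe-delimited layout, the column position, and the cutoff value are all made up for illustration, since the thread does not give the real record format:

```python
import io

def filter_records(src, dst, cutoff=100):
    """Stream src line by line, writing to dst only the records whose
    second column exceeds cutoff. Memory use stays flat no matter how
    many million records the file holds."""
    kept = 0
    for line in src:
        fields = line.rstrip("\n").split("|")  # hypothetical pipe-delimited layout
        if int(fields[1]) > cutoff:
            dst.write(line)
            kept += 1
    return kept

# Tiny in-memory stand-in for the real file:
src = io.StringIO("a|50\nb|150\nc|200\n")
dst = io.StringIO()
print(filter_records(src, dst))  # 2
```

In real use, `src` and `dst` would be opened files, so the whole job is one sequential read and one sequential write.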
I am working with Informix under too many restrictions. I am sure that I am not allowed to install InterBase/Firebird on the system (it's the company's production system, not my own :D).
Well, there's normally no point in bothering the web server for a batch job. Perl can talk to an RDBMS as a client, running a query per record. But files mapped in as external tables are virtually inside the DB engine, so they can be worked on in SQL and efficiently joined to other tables inside a single big query. Google "INFORMIX external file as table" and voila: "Using data file abstraction with external tables in Informix Dynamic Server". You want to write a query that goes through the file one time, relating records to a join of DB tables, and returns a cursor of augmented data to your client. You need a robust client setup, not something dopey like an ODBC/JDBC cursor that pulls the whole result into the VM first so you can walk backward, forward, etc. You need a forward-only cursor that just flows the data out through your client code immediately as it arrives.
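The "flow the data out" part can be sketched with any DB-API client: iterate the cursor row by row instead of fetching the whole result set. The snippet below uses sqlite3 purely as a stand-in for an Informix connection; with Informix the query would join the external table to the real tables, along the lines of (syntax from the external-table feature, paths and names hypothetical):

```python
# Informix side, roughly:
#   CREATE EXTERNAL TABLE ext_recs SAMEAS target_tbl
#     USING (DATAFILES ("DISK:/tmp/input.unl"), FORMAT "DELIMITED");
#   SELECT e.*, t.extra_col FROM ext_recs e JOIN target_tbl t ON ...;
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE recs (id INTEGER, val TEXT)")
cur.executemany("INSERT INTO recs VALUES (?, ?)",
                [(i, f"row{i}") for i in range(5)])
conn.commit()

# A DB-API cursor is forward-only by default: each iteration pulls the
# next row, so client memory stays flat however big the result set is.
out = []
cur.execute("SELECT id, val FROM recs ORDER BY id")
for row in cur:          # streams one row at a time
    out.append(row)      # in real code: write the augmented record out

print(len(out))  # 5
conn.close()
```

The point is the loop: no `fetchall()`, no scrollable cursor, just rows flowing through the client as the engine produces them.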