Hi friends,
I need your help again to find the best approach for the scenario below.
I have an existing shell script that accepts one or more request_ids as a command-line argument.
Single req_id argument: 1111
Multiple: 1111,2222,3333
The argument is passed into the IN operator of the embedded SQL, which updates the status to INACTIVE.
Pseudo code for the same (please ignore any syntax errors):
#!/bin/bash
set -vx

req_ids="$1"

# Feed the UPDATE to sqlplus via a heredoc; output is appended to the log
sqlplus -s xxxx/yyyy@zzzz <<EOF >> /grs/sql_logs
update table_name set status='INACTIVE' where req_id in ($req_ids);
commit;
exit;
EOF
The above method worked like a charm for small sets of request_ids.
Now I am receiving a huge volume of request_ids (~1200), and I fear the command-line argument may not accept a string of ~5000 bytes.
Even if it does, the IN operator will not accept more than 2499 bytes of data.
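For what it's worth, here is an untested sketch of one direction I considered: split the comma list into batches with a bash array (all in memory, so no temp table and no temp file) and send several smaller UPDATEs in a single sqlplus session. The batch size of 2 is only for illustration (something like 500 in practice), and table_name and the ids are placeholders.

```shell
#!/bin/bash
# Untested sketch: chunk the comma-separated id list in memory and build
# one UPDATE per chunk.  table_name and the ids are placeholders.
req_ids="1111,2222,3333,4444,5555"   # in the real script this would be "$1"

# turn the comma list into a bash array
IFS=',' read -r -a ids <<< "$req_ids"

batch_size=2        # tiny value for illustration; use e.g. 500 in practice
sql=""
for ((i = 0; i < ${#ids[@]}; i += batch_size)); do
    # slice the next batch out of the array and rejoin it with commas
    chunk=$(IFS=','; echo "${ids[*]:i:batch_size}")
    sql+="update table_name set status='INACTIVE' where req_id in ($chunk);"$'\n'
done
sql+="commit;"

# The whole batch of statements would then go to sqlplus in one session, e.g.:
# sqlplus -s xxxx/yyyy@zzzz <<EOF >> /grs/sql_logs
# $sql
# exit;
# EOF
printf '%s\n' "$sql"
```

With the sample list above this emits three UPDATE statements (two ids, two ids, one id) followed by a single commit.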
I have tried inserting the records into a temp table and running a loop in SQL, but I cannot, as I am not privileged to create temp tables in the prod environment.
I also tried passing a .csv file as the command-line argument, splitting it into files of 300 records each with awk, processing the files one by one and removing each once processed, but I am likewise restricted from creating temp files on the prod box.
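A variant of the split idea that never writes files to disk: let awk reformat the csv into batched UPDATE statements on the fly, and pipe the result straight into sqlplus. This is an untested sketch; it assumes the csv holds one request_id per line, and table_name is a placeholder. A small inline sample stands in for the real file, and the batch size of 2 is only for illustration.

```shell
#!/bin/bash
# Untested sketch: stream the csv through awk, emitting one UPDATE per
# batch of ids -- no intermediate split files ever land on disk.
csv=$'1111\n2222\n3333\n4444\n5555'   # stand-in for the real one-id-per-line csv

sql=$(printf '%s\n' "$csv" | awk -v batch=2 -v q="'" '
    # accumulate ids into a comma-separated buffer
    { buf = (buf == "" ? $1 : buf "," $1) }
    # every "batch" lines, flush the buffer as one UPDATE statement
    NR % batch == 0 {
        printf "update table_name set status=%sINACTIVE%s where req_id in (%s);\n", q, q, buf
        buf = ""
    }
    END {
        # flush any leftover partial batch, then commit once
        if (buf != "")
            printf "update table_name set status=%sINACTIVE%s where req_id in (%s);\n", q, q, buf
        print "commit;"
    }')

# In the real script the generated SQL would be fed to sqlplus, e.g.:
# printf '%s\nexit;\n' "$sql" | sqlplus -s xxxx/yyyy@zzzz >> /grs/sql_logs
printf '%s\n' "$sql"
```

Since awk writes to a pipe rather than to files, this sidesteps the temp-file restriction while still keeping each IN list small.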
Can you please throw some light on how to deal with this situation without using a temp table or the split-file method?
Many Thanks to all the members and UNIX.com