Hi,
I have a question about executing an SQL query that has a long list of parameters. I need to execute this query inside a shell script.
The scenario is like this:
Suppose I have a file abc.txt that holds the card numbers; there could be thousands of them.
Then I need to fire a query like
select name from table_name where cardnum in ('$x')
where x is the data from the file abc.txt. I am able to execute the query when the data is small, but when abc.txt has thousands of lines the script fails with an error about a long list of parameters.
Please help me write the shell script so that it works for a long list of parameters. Thanks.
The way to avoid too many parameters is not to fiddle and fidget until it accepts too many parameters, as too many parameters will be too many parameters no matter how you cut it. The way to avoid too many parameters is to not use too many parameters.
Perhaps you can avoid reading the file into a shell variable at all:
# stream the query header, the file contents, and the query footer
# straight into the database client -- no shell variable needed
( printf "%s" "select name from table_name where cardnum in ('"
  cat parameters.txt
  printf "%s\n" "')" ) | databasecommand
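One catch if each number must be individually quoted: the file's newlines also have to become quoted commas first. A minimal sketch (assuming one card number per line in parameters.txt, and sed and paste available):

```shell
# sample input; in the real script this file already exists
printf '%s\n' 1234 7654 9654 > parameters.txt

# wrap every line in single quotes, then join the lines with commas
in_list=$(sed "s/.*/'&'/" parameters.txt | paste -sd, -)

printf "select name from table_name where cardnum in (%s)\n" "$in_list"
# -> select name from table_name where cardnum in ('1234','7654','9654')
```

Pipe that last printf into databasecommand instead of printing it.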
Hi Corona,
Thanks for the suggestion, but it is still not working.
The input text file looks like this:
1234
7654
9654
5412
7653
.
.
.
9873
6513
I need to run the SQL query inside a script and then store the result in a file. I am doing it like this:
The command below builds the file abc_csv.txt in the form
'1234','7654','9654',... which is then fed as input to the query.
tr '\n' ',' < abc.txt | sed -e "s/^/'/" -e "s/,$/'/" -e "s/,/','/g" > abc_csv.txt
pqr=`cat abc_csv.txt`
sqlplus -s user/password@dbschema <<EOF
alter session set db_file_multiblock_read_count=128;
alter session set NLS_DATE_FORMAT='yyyy-mm-dd HH24:mi:ss';
set newpage none recsep off
set heading off feedback off verify off
set pagesize 0 linesize 300 trimspool on
spool sql_out
select name from table_name where cardnum in ($pqr)
/
spool off
exit
EOF
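For the record, the likely culprits once abc.txt gets big: Oracle raises ORA-01795 when an IN list holds more than 1,000 literals, and SQL*Plus itself rejects input lines longer than 2,499 characters (SP2-0027). A quick guard you could add before the sqlplus call (a sketch; the 1,000 figure is Oracle's documented expression-list limit):

```shell
# sample data; in the real script abc.txt already exists
seq 1 1500 > abc.txt

# one card number per line, so the line count is the literal count
count=$(wc -l < abc.txt)
if [ "$count" -gt 1000 ]; then
    echo "abc.txt has $count values: too many for one IN list" >&2
else
    echo "abc.txt has $count values: a single IN list is fine"
fi
```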
I am able to execute this shell script when abc.txt is small, but with a large file it does not work.
Since your "in list" could be more than 1,000 items, you could use an external table and just join it to the target table(s) so that you only retrieve matching card numbers. For example:
-- Create oracle external table linked to a flat file
create table card_numbers_ext (
card_num varchar2( 16 ) )
organization external (
type oracle_loader
default directory TMP
access parameters (
records delimited by newline
badfile 'card_numbers.bad'
discardfile 'card_numbers.dis'
logfile 'card_numbers.log'
fields missing field values are null
( card_num ) )
location ('card_numbers.txt') )
reject limit unlimited;
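With the external table in place, the shell side gets much simpler: drop the flat file into the OS directory behind the TMP directory object and run a join instead of an IN list. A sketch (the path, the external table name card_numbers_ext, and the column names are assumptions; adapt them to your DDL):

```shell
# card_numbers.txt must land in the OS directory that the Oracle
# DIRECTORY object TMP points at (the path is an assumption)
# cp abc.txt /tmp/card_numbers.txt

# the query joins the external table instead of using an IN list
cat <<'EOF' > join_query.sql
select t.name
  from table_name t
  join card_numbers_ext e
    on t.cardnum = e.card_num;
EOF

# run it with the same sqlplus invocation as before:
# sqlplus -s user/password@dbschema @join_query.sql
```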
Not sure how feasible this is: break the input file into smaller ones (each containing as many records as the SQL IN operator permits, i.e. 1,000 for Oracle), then execute your original SQL query once per small input file.
Hi Corona... Thanks for the info, but what are this header and footer? Could you please modify your script with the SQL query, I mean show where to put it, so that I can understand?
OK. You can just read your file in the shell script and build the SQL statement dynamically; each "in clause" can hold a maximum of 1,000 elements. For example:
select co1, col2, col3
from schema.table
where card_number in( 1, 2, 3, ..., 1000 )
or card_number in( 1001, 1002, 1003, ..., 2000 )
or card_number in( 2001, 2002, 2003, ..., 3000 )
or card_number in( 3001, 3002, 3003, ..., 4000 )
or card_number in( 4001, 4002, 4003, ..., 5000 )
Thanks Corona... but what do I write inside the query? I am not able to form it. If possible, could you please spell out the coding steps, as I do not have much experience with shell scripting?
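For completeness, here is one way to wire the chunking idea into a script. It is a sketch, not a drop-in solution: it assumes a POSIX shell with awk, the one-number-per-line abc.txt from earlier, and it writes the finished statement to query.sql (a made-up name) so it can be fed to sqlplus just like the earlier heredoc. Each value goes on its own line, so SQL*Plus's input-line limit is not hit either.

```shell
#!/bin/sh
# build "cardnum in (...) or cardnum in (...)" with at most
# 1000 literals per IN list, from a one-number-per-line file

infile=abc.txt
seq 1 2500 > "$infile"      # sample data; use your real abc.txt

awk -v chunk=1000 '
    { vals[NR] = "\047" $0 "\047" }           # quote each card number
    END {
        print "select name from table_name"
        for (i = 1; i <= NR; i++) {
            if (i % chunk == 1)               # start a new IN list
                print (i == 1 ? " where cardnum in (" : ")\n    or cardnum in (")
            else
                printf ","
            print vals[i]
        }
        print ");"
    }
' "$infile" > query.sql

# then run it just like before:
# sqlplus -s user/password@dbschema @query.sql
```

With 2,500 sample values this produces three IN lists of 1,000, 1,000, and 500 literals, chained with "or".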