Shell script to call a procedure in an Oracle DB

Hi everyone!

I'm new to shell scripting, and I have to write a shell script to call a procedure that has 2 input parameters: the directory (on the server) and the txt file (which contains the information to update/insert in the DB).
I have to create a shell script to execute that procedure for each txt file in that directory. Does anyone have some tips for me?

thanks,

Renato

If you mean calling a stored procedure, search the forums for 'oracle shell'. We get this type of question dozens of times a year.

Hi Renato,

In your shell script, add the code below:

arg_one=$1
arg_two=$2
sqlplus -s username/password@dbname <<EOFSQL
select * from emp where ename='${arg_one}' and eno=${arg_two};
exit
EOFSQL

Please replace the select query with your procedure execution syntax.
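
For example, with a hypothetical procedure my_proc that takes the directory and the file name as its two parameters (substitute your own procedure name and connect string), the call would look like this:

arg_one=$1        # first argument: the directory on the server
arg_two=$2        # second argument: the txt file name
sqlplus -s username/password@dbname <<EOFSQL
exec my_proc('${arg_one}', '${arg_two}')
exit
EOFSQL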

scripter

I know how to do it for a single .txt file; the thing is, I have to do something like a FOR loop over all the txt files in that directory. Can I do that?

For a single file, I create a procedure.sql with these lines:

exec prc_cargatxt('TESTE_SUPERVISAO','PRECOLETA.1502_17.06.2009_16h19m13s.txt');
/

and in the batch file I put:

sqlplus username/password@dbname @C:\procedure.sql

It's working correctly, but how do I do it for all the files?

thanks

Hello renatoal,

for filename in *.txt
do
sqlplus username/password@dbname @procedure.sql ${filename}
done

scripter

That's not a very efficient way of doing things.
First of all, you'll need an "exit" in there so as to exit from sqlplus and get ready for the next iteration with a new filename. (Otherwise, the script opens up sqlplus, runs the script and just waits for you to supply the next input to sqlplus.)

So the code would be something like this:

$ cat load_data_1.sh
#!/bin/bash
for i in data*.txt ; do
sqlplus test/test @procedure.sql $i <<EOF 1>/dev/null 2>&1
exit
EOF
done

$ 

Here, "procedure.sql" is a script similar to the one posted by the OP.

$ 
$ cat procedure.sql
exec prc_read_data('&1')
/
$ 

It calls the stored procedure "prc_read_data", which takes a file name as input. (The procedure opens the file, reads the data, and loads it into a table.)
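
For reference, here's a rough sketch of what such a procedure might look like using UTL_FILE; the DIRECTORY object DATA_DIR and the target table raw_lines are placeholders, not the actual testcase objects:

create or replace procedure prc_read_data (p_fname in varchar2)
as
  -- DATA_DIR is a placeholder for an Oracle DIRECTORY object
  fh   utl_file.file_type;
  line varchar2(4000);
begin
  fh := utl_file.fopen('DATA_DIR', p_fname, 'r');
  loop
    begin
      utl_file.get_line(fh, line);               -- read one line
    exception
      when no_data_found then exit;              -- end of file
    end;
    insert into raw_lines (line_text) values (line);   -- raw_lines is a placeholder table
  end loop;
  utl_file.fclose(fh);
  commit;
end;
/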

And there are 100 such files that match the pattern "data*.txt":

$ 
$ ls -1 data*.txt | wc -l
100
$ 

The problem with your approach is that it executes the "sqlplus" binary 100 times, one per iteration. So there are 100 connections, 100 executions, and 100 disconnections.

What you'd want to do is - 1 connection to Oracle, 100 executions, 1 disconnection. And for that you'd need something more powerful - Perl, for instance:

$ 
$ cat load_data.pl
#!/usr/bin/perl -w
use strict;
use DBI;
my $dir = "data*.txt";
# get the database handle
my $dbh = DBI->connect("dbi:Oracle:xe", "test", "test",
                       { RaiseError => 1 });          # connect once
# prepare the statement once, with a bind variable for the file name
my $sth = $dbh->prepare("begin prc_read_data(?); end;");
# now loop through the files that match the glob pattern
while (defined(my $fname = glob($dir))) {             # start looping through files
  $sth->execute($fname);                              # execute within loop, binding the file name
}                                                     # end loop
$dbh->disconnect();                                   # and disconnect once

$ 

This script uses the Perl DBI module (with the DBD::Oracle driver), and is pretty self-explanatory.

For the 100 file testcase, the execution times are shown below:

$ 
$ time . load_data_1.sh

real    0m16.963s
user    0m4.540s
sys    0m2.304s
$ 
$ time perl load_data.pl

real    0m0.828s
user    0m0.196s
sys    0m0.080s
$ 

Of course, if it is a one-time job, then the shell script approach may work fine for the task at hand. But if it is a scheduled, regular job, with the number of files potentially increasing down the line, then the OP may want to consider ways to minimize the overall execution time.
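
Alternatively, if Perl/DBI isn't an option, the 1-connection pattern can be approximated in the shell itself by generating a single driver script with one exec call per file and running sqlplus just once (a sketch, reusing the test/test schema and the prc_read_data procedure from the testcase above):

#!/bin/bash
# build one driver script containing an exec call per data file
{
  for i in data*.txt ; do
    echo "exec prc_read_data('$i')"
  done
  echo "exit"
} > driver.sql
# then connect to Oracle a single time to run all of them
sqlplus -s test/test @driver.sql 1>/dev/null 2>&1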

tyler_durden