Creating script to read many files and load into database via single control file

Hi,
I have many files, but with only two name patterns. I want to load the data from those files into the database through sqlldr using a single control file. How can I do that?
Example:

switch_file
switch_file
billing_file
billing_file

now these files should be loaded into same database but different tables.

for i in /tmp/*.switch
do
  sed -i "s|FILENAME|$i|g" /tmp/data1.ctl    # insert the file name
  sqlldr TNS control=/tmp/data1.ctl
  sed -i "s|$i|FILENAME|g" /tmp/data1.ctl    # restore the placeholder
done

for i in /tmp/*.billing
do
  sed -i "s|FILENAME|$i|g" /tmp/data1.ctl    # insert the file name
  sqlldr TNS control=/tmp/data1.ctl
  sed -i "s|$i|FILENAME|g" /tmp/data1.ctl    # restore the placeholder
done

Control file:

load data 
infile 'FILENAME' "str '\r\n'"
append
into table TEXT
fields terminated by ' '
trailing nullcols
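For reference, this is what substituting the placeholders with sed looks like; the paths and the sample data file name below are made up for the demonstration:

```shell
# Throwaway copy of the template control file (paths are illustrative).
cat > /tmp/demo_data1.ctl <<'EOF'
load data
infile 'FILENAME' "str '\r\n'"
append
into table TEXT
fields terminated by ' '
trailing nullcols
EOF

# Substitute both placeholders into a new file, leaving the template alone.
sed -e "s|FILENAME|/tmp/20240101.switch|g" -e "s|TEXT|switch|g" \
    /tmp/demo_data1.ctl > /tmp/demo_run.ctl

cat /tmp/demo_run.ctl
```

Writing to a new file means the template never has to be changed back afterwards.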

Rather than using sed to change your original template file and then change it back, why not create a new file with the changes and remove it once you are done? Here I use $$, which expands to the shell's process ID, to ensure a unique file name in case the script is run more than once at the same time.

Also, assuming TEXT is supposed to be replaced with the table name:

for table in switch billing
do
    for file in /tmp/*.$table
    do
        if [ -f "$file" ]
        then
            sed -e "s|FILENAME|$file|g" -e "s|TEXT|$table|g" /tmp/data1.ctl > /tmp/$$_data1.ctl
            sqlldr TNS control=/tmp/$$_data1.ctl
            rm /tmp/$$_data1.ctl
        fi
    done
done
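If two copies of the script really can run at the same time, mktemp is a slightly more robust alternative to $$ for naming the scratch file, since it creates the file atomically with a guaranteed-unique name; a minimal sketch:

```shell
# mktemp creates a uniquely named scratch file and prints its name.
ctl=$(mktemp /tmp/data1.XXXXXX) || exit 1

# ... generate the control file into "$ctl" with sed and run sqlldr here ...

# Clean up when done.
rm -f "$ctl"
```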