How to log file processing details to a database table using a UNIX shell script?

We are getting files on a daily basis and need to process them.

I need a Unix shell script that can count:
1. The number of files processed
2. The number of records processed in each file

The script should log these details into a database table. If there is any error while reading/extracting a file, it should log the error into an error table.

Please let me know how to achieve this.

  1. Please let us know what OS/shell/DB you're using.
  2. Show us your attempts at solving this problem.

To begin with, you can keep a counter that increments every time a file is processed. For the second part, keep another counter that increments every time you read a line from the file, and reset it when you finish reading each file.

Without knowing which DB you're using, it's difficult to answer. But in general, you could do something like this (for sqlplus, say):

echo "insert into table1 (col1, col2) values (val1, val2);" | sqlplus /
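Since you're on bash with Oracle, a slightly more robust sketch is to build the INSERT statement in a function and then pipe it to sqlplus. The table name `file_proc_log` and its columns here are assumptions — adjust them to your actual schema:

```shell
# Build an INSERT for a given file name and record count.
# Table file_proc_log(file_name, record_count, run_date) is hypothetical.
build_log_sql() {
    printf "INSERT INTO file_proc_log (file_name, record_count, run_date) VALUES ('%s', %d, SYSDATE);" "$1" "$2"
}

sql=$(build_log_sql "daily_feed.dat" 1234)
echo "$sql"
# To actually run it (credentials are placeholders):
#   echo "$sql COMMIT;" | sqlplus -s user/pass@db
```

Keeping the SQL construction separate from the sqlplus call makes it easy to inspect (or spool) the statements before they hit the database.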

Hi Bala, thanks for the response.

We are using HP-UX / bash / Oracle.

Do you mean this way?

count=$(find .. -maxdepth 1 -type f|wc -l)
echo $count
let count=count+1 # Increase by one, for the next file number
echo $count

This is for counting the number of files.

Please suggest a high-level approach.

file_list=( $(find . -maxdepth 1 -type f) )   # note: word-splits, so this breaks on filenames containing spaces
file_count=${#file_list[@]}

echo "Number of files: $file_count"
for file in "${file_list[@]}"
do
    line_count=0                  # start at 0 so the final count equals the number of lines read
    while IFS= read -r line       # -r keeps backslashes literal; IFS= preserves leading/trailing whitespace
    do
        process_line "$line"
        (( line_count++ ))
    done < "$file"
    echo "Number of lines processed in $file: $line_count"
done