How do I insert data into a database from a text file?

Hi... can you guys help me out with this script?

Below is a text file called Bukom.txt, which contains:

BUKOM 20060101 2.5 2.6 2.7 2.8 2.9 2.3 2.1
BUKOM 20060102 2.4 2.5 2.6 2.7 2.7 2.6 2.4
BUKOM 20060103 2.1 2.3 2.5 2.6 2.7 2.7 2.6

I would like this data to be read automatically by the script and inserted into the database.
The database table is called bukom_table
and it contains 4 fields, namely:
locn,
date,
hour,
strength..

How do I create a script that inserts this data into the database?
For example, BUKOM will go into locn, and I do not want any trailing spaces.
The date will be 20060101 and the strength will be 2.5.

OK, for the hour field, the values have to be generated: in the first row, 2.5 is the strength at hour 0000, followed by 2.6 at 0030 and 2.7 at 0100. Likewise for the second row: 2.4 will be at 0000, 2.5 at 0030, and so on...

I'm having a headache working on this script... hope you (experts) can help :slight_smile: A million thanks :slight_smile:

Have you thought about SQL Loader?
Invoking SQL Loader through a script could help you.

Yes... I thought of that too. However, my text file is not formatted in a way that can be loaded directly, and it's a very large set of data.

You can write code to create the formatted data you want and write it to a CSV file. The format of the CSV file should be:

field1,field2,field3,field4

You will then need a corresponding ctl file containing the column names in a comma-separated format. Then you can have a shell script call sqlldr with the data and ctl files.
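For example, a small awk/shell sketch along these lines could produce the CSV, assuming the half-hourly readings in each row start at hour 0000 (the file names here are just placeholders):

```shell
# Sample input rows, as shown earlier in the thread (normally Bukom.txt
# would already exist).
cat > Bukom.txt <<'EOF'
BUKOM 20060101 2.5 2.6 2.7 2.8 2.9 2.3 2.1
BUKOM 20060102 2.4 2.5 2.6 2.7 2.7 2.6 2.4
EOF

# Convert each row into one locn,date,hour,strength record per reading.
# Fields 3..NF are half-hourly readings, so the hour is derived from the
# field index: field 3 -> 0000, field 4 -> 0030, field 5 -> 0100, ...
awk '{
  for (i = 3; i <= NF; i++) {
    hour = sprintf("%02d%02d", int((i - 3) / 2), ((i - 3) % 2) * 30)
    printf "%s,%s,%s,%s\n", $1, $2, hour, $i
  }
}' Bukom.txt > bukom.csv

head -2 bukom.csv
```

The matching ctl file would then just name bukom.csv as the INFILE, the target table, FIELDS TERMINATED BY ',', and the four column names in order.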

Regards,
Rahul.

And however large the amount of data may be, there are many options in sqlldr to load it into the DB in less than a minute. Just take care that the table you are inserting data into does not have any indexes created on it.

I have thousands of records and it's hard for me to format them all into the correct fields... I need a script to read them automatically, as there are more than 8 files to read from. Those are my problems... as you can see, the hour field needs to be generated; inserting it by hand would take me days because there are too many records...

mysqlimport --fields-terminated-by=" " your_database Bukom.txt

What database are you using (mySQL, informix, DB2)? What shell?
If your records are all fixed width, you may be able to 'prep' the file before loading; if not, you may need to rethink the problem.
Let me know.
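By 'prep' I mean, for example, normalising the whitespace so every field is single-space separated with no trailing blanks (just a rough sketch; file names are placeholders):

```shell
# Sample line with doubled separators and a trailing space
printf 'BUKOM  20060101 2.5 \n' > raw.txt

# Squeeze runs of whitespace to single spaces, then strip the trailing one
sed -e 's/[[:space:]]\{1,\}/ /g' -e 's/ $//' raw.txt > prepped.txt

cat prepped.txt
```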

Since you are considering using SQL Loader, I would guess that you are using Oracle. The usual method is to load the data into a set of temporary tables. Then write some PL/SQL code to read through the temporary tables to reformat and insert into your table. This method works well with very large data sets.

An alternative method is to bypass SQL Loader + PL/SQL by using awk to create an "insert" script. Like this...

$ cat Bukom.awk
BEGIN {
  # Template for each generated statement
  ins = "INSERT INTO bukom_table (locn, date, hour, strength) VALUES"
}
{
   # Fields 3..NF are half-hourly readings; derive the hour (0000, 0030,
   # 0100, ...) from the field index
   for (i=3; i<=NF; i++) {
      printf "%s ('%s', %s, %s, %s);\n", ins,\
             $1, $2, sprintf("%02d%02d", int((i-3)/2), int((i-3)%2)*30), $i
   }
}

$ awk -f Bukom.awk Bukom.txt
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060101, 0000, 2.5);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060101, 0030, 2.6);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060101, 0100, 2.7);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060101, 0130, 2.8);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060101, 0200, 2.9);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060101, 0230, 2.3);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060101, 0300, 2.1);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060102, 0000, 2.4);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060102, 0030, 2.5);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060102, 0100, 2.6);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060102, 0130, 2.7);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060102, 0200, 2.7);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060102, 0230, 2.6);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060102, 0300, 2.4);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060103, 0000, 2.1);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060103, 0030, 2.3);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060103, 0100, 2.5);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060103, 0130, 2.6);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060103, 0200, 2.7);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060103, 0230, 2.7);
INSERT INTO bukom_table (locn, date, hour, strength) VALUES ('BUKOM', 20060103, 0300, 2.6);

There is a significant performance trade-off with this method compared to SQL Loader.

Hi ygor :stuck_out_tongue: Yes, I'm using Oracle. Thanks for your effort in helping me with this script... you totally understood my problem. Thanks!