DB2 Export and Import to Oracle

Use and complete the template provided. The entire template must be completed. If you don't, your post may be deleted!

  1. The problem statement, all variables and given/known data:
    Is this enough to make the data export correctly into a delimited file? In some posts that I read, they included the table field names.

What is the difference between the two (with table field names and without)? (There is a sketch of the header-row approach after this template.)

Can anyone suggest an import script into Oracle?

  2. Relevant commands, code, scripts, algorithms:
db2 connect to DatabaseName user UserID using psswrd

db2 "EXPORT TO '/cardpro/brac/v5/dev/dat/AAAAA.DEL' OF DEL select * FROM AAAAA"
db2 "EXPORT TO '/cardpro/brac/v5/dev/dat/BBBBB.DEL' OF DEL select * FROM BBBBB"
db2 "EXPORT TO '/cardpro/brac/v5/dev/dat/CCCCC.DEL' OF DEL select * FROM CCCCC"
...
.. 

db2 terminate
  3. The attempts at a solution (include all code and scripts):
db2 connect to DatabaseName user UserID using psswrd

db2 "EXPORT TO '/cardpro/brac/v5/dev/dat/AAAAA.DEL' OF DEL select * FROM AAAAA"
db2 "EXPORT TO '/cardpro/brac/v5/dev/dat/BBBBB.DEL' OF DEL select * FROM BBBBB"
db2 "EXPORT TO '/cardpro/brac/v5/dev/dat/CCCCC.DEL' OF DEL select * FROM CCCCC"
...
.. 

db2 terminate
  4. Complete Name of School (University), City (State), Country, Name of Professor, and Course Number (Link to Course):
    Xavier University, Philippines, Lincaro, Computer Engineering

Note: Without school/professor/course information, you will be banned if you post here! You must complete the entire template (not just parts of it).
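
On the field-names question in the template above: the only difference between the two kinds of file is whether the first record carries the column names as data. As far as I know DB2's DEL export has no built-in header option, so posts that include field names usually union a literal row into the SELECT. A rough sketch, with made-up column names and casts (strictly, UNION ALL does not guarantee the header row comes first, so add an ORDER BY on a sort key if that matters):

Code:

db2 "EXPORT TO '/cardpro/brac/v5/dev/dat/AAAAA_hdr.DEL' OF DEL
  SELECT 'COL1', 'COL2', 'COL3' FROM SYSIBM.SYSDUMMY1
  UNION ALL
  SELECT CAST(COL1 AS VARCHAR(30)), CAST(COL2 AS VARCHAR(30)), CAST(COL3 AS VARCHAR(30)) FROM AAAAA"

Whatever loads the header-row version (sqlldr with SKIP=1, for example) then has to skip that first record; without the header row the file is pure data and loads as-is.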

You might be able to get the data into Oracle in a number of ways.

  • The Oracle command sqlldr to load the data from an input file
  • The Oracle command imp if the file format is compatible with Oracle exports (usually written with Oracle command exp)
  • Use the file as an external table directly from the database. You would use CREATE DIRECTORY; however, this requires elevated privileges and is a risk because the file is left in plain sight. You could mitigate this by subsequently running CREATE TABLE new_table AS SELECT * FROM my_temporary_table; and then removing the plain file from disk. There is a sketch of this after the list.
  • If the file is not too large, read the file with a shell script and run a set of commands (also sketched after the list):-
      • CREATE TABLE with the correct structure
      • INSERT statements for each record of the input file
      • CREATE INDEX if appropriate
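
A sketch of the external-table route; the directory path, table name and column definitions are made up and would need to match the real file:

Code:

-- needs the CREATE ANY DIRECTORY privilege (or ask a DBA); names are illustrative
CREATE DIRECTORY imp_dir AS '/cardpro/brac/v5/dev/dat';

CREATE TABLE aaaaa_ext (
  col1 NUMBER,
  col2 VARCHAR2(30),
  col3 VARCHAR2(10)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY imp_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('AAAAA.DEL')
);

-- copy into an ordinary table; the external table and the flat file can then be dropped
CREATE TABLE aaaaa AS SELECT * FROM aaaaa_ext;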

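A sketch of the shell-script route; the table, columns and file path are illustrative, and a real DEL file needs more careful handling of embedded commas and quotes:

Code:

#!/bin/sh
# turn a small comma-delimited file into CREATE TABLE + INSERTs + CREATE INDEX
{
  echo "CREATE TABLE aaaaa (col1 NUMBER, col2 VARCHAR2(8), col3 VARCHAR2(10));"
  while IFS=',' read -r f1 f2 f3
  do
    # strip the optional double quotes that DEL puts around character fields
    f2=`echo "$f2" | tr -d '"'`
    f3=`echo "$f3" | tr -d '"'`
    echo "INSERT INTO aaaaa VALUES ($f1, '$f2', '$f3');"
  done < /cardpro/brac/v5/dev/dat/AAAAA.DEL
  echo "CREATE INDEX aaaaa_i1 ON aaaaa (col1);"
  echo "COMMIT;"
} | sqlplus -s userid/psswrd
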
You might get away with writing to/reading from a pipe rather than a regular file. The command mknod /path/to/file p (or mkfifo /path/to/file) will create the pipe. It's a special type of file that acts as a buffer for data. If you start your importing process (one of the first two options) reading from it, and then in another session run the export writing to it, you not only save disk space but also do both in parallel, so you save on elapsed time, i.e. 30 minutes to export plus 40 minutes to import (harder work for the database) might become 42 minutes for both together.
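
A rough sketch of the pipe trick on a single server, assuming the DEL export shown earlier and a sqlldr control file (AAAAA.ctl is made up) whose INFILE points at the pipe:

Code:

# create the named pipe (mkfifo is equivalent to mknod ... p)
mkfifo /cardpro/brac/v5/dev/dat/AAAAA.DEL

# start the reader first, in the background, so it blocks waiting for data
sqlldr userid/psswrd control=/export/home/user/Import_Script/AAAAA.ctl &

# then run the writer; the export streams straight into the waiting import
db2 connect to DatabaseName user UserID using psswrd
db2 "EXPORT TO '/cardpro/brac/v5/dev/dat/AAAAA.DEL' OF DEL select * FROM AAAAA"
db2 terminate
wait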

Of course this is dependent on the volume of data, disk structure (contention) and, to a small extent, CPU load; however, if these databases are on two separate servers, you can extend this trick even further. Consider that the process actually requires three steps totalling 90 minutes:-

  1. Export - 30 minutes
  2. File transfer - 20 minutes
  3. Import - 40 minutes

You can:-

  1. Create pipe for export
  2. Create pipe for import
  3. Start import
  4. Start propagation process (I've used a dd & ssh pipeline, sketched below, but there are other options)
  5. Start export
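
A sketch of those five steps; host names, paths and the control file are illustrative:

Code:

# on the target (Oracle) server: create the import pipe and start the loader reading from it
mkfifo /import/AAAAA.DEL
sqlldr userid/psswrd control=/import/AAAAA.ctl &    # INFILE in AAAAA.ctl points at /import/AAAAA.DEL

# on the source (DB2) server: create the export pipe, start the propagation, then export
mkfifo /export/AAAAA.DEL
dd if=/export/AAAAA.DEL bs=1024k | ssh oracle-host 'dd of=/import/AAAAA.DEL bs=1024k' &
db2 connect to DatabaseName user UserID using psswrd
db2 "EXPORT TO '/export/AAAAA.DEL' OF DEL select * FROM AAAAA"
db2 terminate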

You might find that this all runs in 40 minutes because there is less disk contention on the import side. Of course, it's just theory and is entirely dependent on what hardware is in play and what contention there actually is at the time.

Robin


Thank you very much Robin (rbatte1), I appreciate your reply.

Another option would be to use Oracle heterogeneous connectivity to connect to DB2. Use an ERD tool like Erwin to reverse engineer the DB2 database and convert it to Oracle. Then create the new tables in Oracle without indexes, constraints or triggers, and run INSERT INTO ... SELECT statements to move your data directly into the new tables. Then generate the script for indexes and constraints and write any triggers and/or stored procedures that you need. Finally, automate the process and do many test runs.
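
A minimal sketch of that approach, assuming a Heterogeneous Services / Database Gateway entry for the DB2 database has already been configured; the link name, TNS alias, credentials and table are made up:

Code:

-- the TNS alias DB2GW would point at the configured gateway listener
CREATE DATABASE LINK db2_link CONNECT TO "userid" IDENTIFIED BY "psswrd" USING 'DB2GW';

-- create the empty target table (no indexes, constraints or triggers yet), then copy the rows
CREATE TABLE cp_table AS SELECT * FROM cp_table@db2_link WHERE 1 = 0;
INSERT INTO cp_table SELECT * FROM cp_table@db2_link;
COMMIT;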


Thanks gandolf989

---------- Post updated at 01:25 PM ---------- Previous update was at 01:17 PM ----------

Hi All,

Can anyone please help? I ran my import script on UNIX but I'm not sure why it was running so long, so I cancelled the job.

Can anyone tell me what the problem with my script is?

Code:

sqlldr userid/psswrd control=/export/home/user/Import_Script/Table.ctl

Table.ctl contains:

LOAD DATA
INFILE '/export/home/user/Export_Data/XTABLE.DEL'
APPEND
INTO TABLE CP_TABLE
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
( TABLE_NO, TABLE_NO_CARD, TABLE_OPEN_DATE, TABLE_AUTO_STOP, TABLE_MOD_DATE, TABLE_USER_ID, CHECKSUM )

My data (XTABLE.DEL):

+01.,+00000002.,"20130101","N","        ","root      "," "
+02.,+00000001.,"20130101","N","        ","root      "," "
+03.,+00000001.,"20130101","N","        ","root      "," "
+04.,+00000001.,"20130101","N","        ","root      "," "
+05.,+00000003.,"20130101","N","        ","root      ",""
+06.,+00000001.,"20130101","N","        ","root      "," "
+07.,+00000000.,"20130101","N","        ","root      "," "
+08.,+00000001.,"20130101","N","        ","root      "," "
+09.,+00000001.,"20130101","N","        ","root      "," "
+10.,+00000001.,"20130101","N","        ","root      "," "
+11.,+00000001.,"20130101","N","        ","root      "," "
+12.,+00000001.,"20130101","N","        ","root      "," "
+13.,+00000001.,"20130101","N","        ","root      "," "
+14.,+00000000.,"20130101","N","        ","root      "," "
+15.,+00000001.,"20130101","N","        ","root      ",""

Do I need to add a commit in my script? Is it required?

---------- Post updated at 03:33 PM ---------- Previous update was at 01:25 PM ----------

Thank you Don Cragun. That will guide me the next time I encounter a problem.

The import will probably take longer than a plain export because it is writing. Not only does this require the data to be committed (i.e. written to disk, not just a query commit) but it will also write to redo logs, perhaps extend tables, etc. Beware that cancelling your loader will do one of two things:-

  • Commit an incomplete load (which you might need to tidy up)
  • Roll back, which can also take a long time.

For clarity,

  • is this an extraordinarily long delay?
  • can you still connect to the database with another session?

It might be that the redo/undo areas are full, or if redo logs are archived to disk, the filesystem that holds them might be full. Have a look at the Oracle logs to see if that's the case.
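
Two quick checks along those lines; the filesystem path is an assumption and would need to match wherever your redo/archive logs actually live:

Code:

# is the filesystem holding the redo/archive logs full?
df -h /u01/oradata

# can another session still connect, and is anything still active?
sqlplus -s "/ as sysdba" <<'EOF'
SELECT status, COUNT(*) FROM v$session GROUP BY status;
EOF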

sqlldr will commit at the end anyway. I think you can use the parameter ROWS=1000 (or another suitable number) on sqlldr to force more regular commits, at which point you can use another session to count the rows imported into your table so far.
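
For example (same control file as in your post; ROWS=1000 is just an illustrative commit interval):

Code:

sqlldr userid/psswrd control=/export/home/user/Import_Script/Table.ctl rows=1000

# from a second session, watch the load progress
sqlplus -s userid/psswrd <<'EOF'
SELECT COUNT(*) FROM CP_TABLE;
EOF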

Does that help?
Robin

Sonny,

How long is too long to run? How many indexes do you have on the table? How many constraints, including foreign keys, do you have? How many row-level triggers do you have and what do they do? How many rows are you inserting? Can you use Grid Control to trace what is happening in the import?

I have not seen enough information to give you a meaningful answer. Also this seems like a question better served on an Oracle Forum than a Unix/Linux forum.


Thank you everyone for the advice and guidance you shared with me. It gives me a lot of ideas on how to build my script.

Actually, the script may have run too long because I didn't issue a commit after I deleted the data in the table, before I ran the import script. After I issued the commit and ran the script again, it completed very quickly.

But I cannot confirm whether that was really the cause of the script running so long.

---------- Post updated at 06:01 PM ---------- Previous update was at 05:40 PM ----------

I have encountered another problem; hopefully someone can advise me on the issue.

I successfully exported data from Oracle, but I put it into a TABLE.DEL file.

exp user/pwd@table file=TABLE.DEL log=xxxxxx.log tables=CP_TABLE rows=yes indexes=no

Then I used TABLE.DEL to import into DB2.
Not all of the data was successfully imported into DB2; the remaining rows were rejected.

db2 "import from '/home/user/TABLE.DEL' of del insert into CP_TABLE"

results below:

Number of rows read         = 41
Number of rows skipped      = 0
Number of rows inserted     = 16
Number of rows updated      = 0
Number of rows rejected     = 25
Number of rows committed    = 41

Is the issue with the export? Supposedly the export should produce a .dmp file, and I changed it into .DEL?

Please advise me.

Thanks
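
One thing worth noting here: exp writes Oracle's binary dump format, which is really only readable by imp, so a DEL-style import is unlikely to parse most of it whatever the file is called. If the goal is a comma-delimited file for db2 import, one option is to spool delimited output from sqlplus; a rough sketch, with an illustrative column list:

Code:

sqlplus -s user/pwd <<'EOF'
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF LINESIZE 1000 TRIMSPOOL ON
SPOOL /home/user/TABLE.DEL
SELECT table_no || ',' || table_no_card || ',"' || table_open_date || '","' || table_auto_stop || '"'
  FROM cp_table;
SPOOL OFF
EOF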

The file name you choose should not make any difference to your export or import commands. It is pretty much just a label to find the data. There is no inference from the 'bit after the dot' as there is with Windows or other GUIs; there is nothing really to it. If you go to DOS under Windows, they just become files. Perhaps a .bat or .exe is still special, but that's about it.

Would the rows have been rejected because they violated a unique (duplicate) constraint? It's difficult to know without seeing the data exported, the data in the target table and the constraints (unique indices, foreign keys, etc.).

Can you elaborate?

Robin


Hi rbatte1,

Thanks for your reply.

Actually, the table that I'm trying to import the data into is empty. You can also check the attached text file for the error messages I found during the import; most of the errors are because of truncation.

So you mean there is no problem with my export of the data from Oracle, which was written to *.del instead of using the default *.dmp?

Is my import script correct? Or is there something wrong/missing, which is why the data was truncated?

Please advise. Thanks