I need to develop a Unix shell script for the requirement below and would appreciate your assistance:
1) search a directory for file.log and file.bad, and read both files
2) pull "Load_Start_Time", "Data_File_Name", and "Error_Type" out of the log file (see the extraction sketch after the log sample)
3) concatenate each row of the bad file as "Error_Description" (see the pairing sketch after the bad-file sample)
4) write to an output flat file, record by record
5) make sure the number of bad records reported in the log file equals the number of records written to the output flat file (a full script sketch follows the output sample)
6) the formats of the log, bad, and output files are shown below:
Log File:
Load started at: 1-Oct-10 11:00:00 EDT
Database: Oracle
Tablename: abc_def_123
Datafile: /abc/def/ghi/inbound/files/file.dat
Error_Type:
-------------------------------------------------
1: 123(456) [CHAR(1)] value too long for column]
2: 123(456) [CHAR(1)] value too long for column]
3: 123(456) [CHAR(1)] value too long for column]
4: 123(456) [CHAR(1)] value too long for column]
number of bad records: 4
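
For step 2, here is a minimal extraction sketch, assuming the log sits at a known path (the LOG value is a placeholder) and the label text matches the sample above exactly:

#!/bin/sh
# Placeholder path -- point this at the real log file.
LOG=/abc/def/ghi/inbound/files/file.log

# "Load started at: 1-Oct-10 11:00:00 EDT" -> "1-Oct-10 11:00:00"
# (the trailing timezone token is stripped, matching the output sample)
load_start=$(grep '^Load started at:' "$LOG" | sed 's/^Load started at: //; s/ [A-Z]*$//')

# "Datafile: /abc/def/ghi/inbound/files/file.dat" -> "file.dat"
data_file=$(basename "$(grep '^Datafile:' "$LOG" | sed 's/^Datafile: //')")

# "number of bad records: 4" -> "4" (used later for the reconciliation check)
bad_count=$(sed -n 's/^number of bad records: //p' "$LOG")

echo "$load_start | $data_file | $bad_count"
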
Bad File:
111 456 abc 789 2010-10-1 manager US
222 456 abc 789 2010-10-1 employee US
333 456 abc 789 2010-10-1 contractor US
444 456 abc 789 2010-10-1 partner US
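
For steps 3 and 4, the numbered Error_Type lines sit between the dashed rule and the count line in the log, and they pair 1:1 by position with the rows of the bad file, so paste(1) can join them. A sketch, reusing $LOG from above and a hypothetical $BAD path:

BAD=/abc/def/ghi/inbound/files/file.bad    # placeholder path

# Keep only the numbered error lines, in their original order.
errs=/tmp/errors.$$
sed -n '/^----/,/^number of bad records/p' "$LOG" | grep '^[0-9][0-9]*:' > "$errs"

# paste joins line N of the error list with line N of the bad file,
# tab-separated by default: "1: ... column]<TAB>111 456 abc ... US"
paste "$errs" "$BAD"
rm -f "$errs"
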
Output File:
Load_Start_Time Data_File_Name Error_Type Error_Description
1-Oct-10 11:00:00 file.dat 1: 123(456) [CHAR(1)] value too long for column] 111 456 abc 789 2010-10-1 manager US
1-Oct-10 11:00:00 file.dat 2: 123(456) [CHAR(1)] value too long for column] 222 456 abc 789 2010-10-1 employee US
1-Oct-10 11:00:00 file.dat 3: 123(456) [CHAR(1)] value too long for column] 333 456 abc 789 2010-10-1 contractor US
1-Oct-10 11:00:00 file.dat 4: 123(456) [CHAR(1)] value too long for column] 444 456 abc 789 2010-10-1 partner US
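
Putting it together, here is a sketch of the whole flow under the same assumptions: output columns are tab-separated, $1 is the directory to search, and bad_errors.out is a made-up output name -- rename to suit.

#!/bin/sh
DIR=${1:-.}                                # directory to search
LOG=$(find "$DIR" -name file.log | head -1)
BAD=$(find "$DIR" -name file.bad | head -1)
OUT="$DIR/bad_errors.out"                  # hypothetical output name

if [ -z "$LOG" ] || [ -z "$BAD" ]; then
    echo "file.log or file.bad not found under $DIR" >&2
    exit 1
fi

# Header fields and the expected record count, as in the first sketch.
load_start=$(grep '^Load started at:' "$LOG" | sed 's/^Load started at: //; s/ [A-Z]*$//')
data_file=$(basename "$(grep '^Datafile:' "$LOG" | sed 's/^Datafile: //')")
bad_count=$(sed -n 's/^number of bad records: //p' "$LOG")

# Numbered Error_Type lines, paired 1:1 with the bad-file rows.
errs=/tmp/errors.$$
sed -n '/^----/,/^number of bad records/p' "$LOG" | grep '^[0-9][0-9]*:' > "$errs"

# Write the header, then one record per pair; awk prefixes the two
# constant columns onto each paste line (paste already tab-separates
# Error_Type from Error_Description).
{
    printf 'Load_Start_Time\tData_File_Name\tError_Type\tError_Description\n'
    paste "$errs" "$BAD" |
        awk -v s="$load_start" -v f="$data_file" 'BEGIN { OFS = "\t" } { print s, f, $0 }'
} > "$OUT"
rm -f "$errs"

# Step 5: records written (excluding the header) must equal the count
# reported by the log.
written=$(( $(wc -l < "$OUT") - 1 ))
if [ "$written" -ne "$bad_count" ]; then
    echo "count mismatch: log says $bad_count, output has $written" >&2
    exit 1
fi
echo "wrote $written records to $OUT"

Because paste keeps the pairing purely positional, the final check also catches a malformed log whose numbered error lines do not match its own reported count.
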