I have developed a bash shell script which works fine under bash on every platform except SunOS.
There are two commands that are not compatible.
#!/usr/bin/env bash
Problem
The awk command below does not work on SunOS.
Requirement
We extract Oracle EBS tables into text files, and we need to calculate the sum total of the columns listed in hashTableFields for the tables listed in hashTables.
For example, GL_BALANCES.txt has a column PERIOD_NET_CR; we sum it and store the result in a log file.
--------------------
hashTables=( GL_BALANCES GL_IMPORT_REFERENCES GL_JE_BATCHES GL_JE_HEADERS GL_JE_LINES )
hashTableFields=( PERIOD_NET_CR HASH_IDX HASH_IDX HASH_IDX ACCOUNTED_CR )
# ------------------------------------------------------------------------
# Adding awk command support for SunOS -- use nawk for SunOS
# ------------------------------------------------------------------------
case `uname` in
SunOS) awkcmd=nawk ;;
*) awkcmd=awk ;;
esac
for (( index=0; index < $cnt; index++ ))
do
RIGHT_NOW=$(date +"%x %r %Z")
printf "%-35s %-20s %-20s\n" "${hashTables[index]}" "Start Time" "${RIGHT_NOW}" >> ${LOG_FILE}
sum=0
"$awkcmd" -F"#|#" -v c="${hashTableFields[index]}" 'NR==1{
for (i=1; i<=NF; i++)
if ($i==c)
{
p=i;
break
};
next
}
{
if ( $p=="|" || $p=="#|" || $p=="#" || $p=="|#" )
print $(p+1)
else
print $p
}' "${hashTables[index]}.txt" >> temp.hash.txt
# ----------------------------------------------------------------
# Remove blank lines from temp.hash.txt. This should resolve the parser issue if it finds blank values
# ----------------------------------------------------------------
sed '/^$/d' temp.hash.txt > temp.hash.out
mv temp.hash.out temp.hash.txt
sum=`"$awkcmd" ' BEGIN {print "sum=0"} {print "sum += " $1; asum += $1} END {printf "sum\nscale=4\nsum=sum/1\n"}' temp.hash.txt | bc`
# ----------------------------------------------------------------
# Calling the function to convert hashtotals in correct precision
# ----------------------------------------------------------------
sum=`round ${sum} ${precision}`
if [ $? -eq 0 ]; then
# Print the hash total
echo "${hashTables[index]}#|#${hashTableFields[index]}#|#${sum}" >> ${HASHTOTAL_FILE}
fi
rm -f temp.hash.txt
RIGHT_NOW=$(date +"%x %r %Z")
printf "%-35s %-20s %-20s\n" "${hashTables[index]}" "End Time" "${RIGHT_NOW}" >> ${LOG_FILE}
done
Log file content with the generated hash total values is shown below. This is currently not produced on SunOS; on all other OSes it works as expected:
TABLENAME#|#FIELD#|#TOTAL
GL_BALANCES#|#PERIOD_NET_CR#|#26525675437.9900
GL_IMPORT_REFERENCES#|#HASH_IDX#|#3226845.0000
GL_JE_BATCHES#|#HASH_IDX#|#20623.0000
GL_JE_HEADERS#|#HASH_IDX#|#31556.0000
GL_JE_LINES#|#ACCOUNTED_CR#|#3653881572.0300
---------------------
The split command does not work on SunOS
Requirement
If the number of records exceeds 5000000, then
GL_BALANCES.txt will be split into
GL_BALANCES_0000.txt
GL_BALANCES_0001.txt
...
...
until all records are placed into multiple files. No file may contain more than 5000000 records, and each subsequent file continues with the next set of records.
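For illustration, the intended chunking (each piece capped at the row limit, with the header repeated in every piece) can be sketched in awk alone, which sidesteps split entirely. The sample data, file name, and the tiny max value below are made up:

```shell
#!/usr/bin/env bash
# Sketch of the required chunking without split(1): at most "max" data
# rows per output file, header line repeated in every piece.
# Sample file and the base/max values are illustrative only.
printf 'HEADER\nr1\nr2\nr3\nr4\nr5\n' > GL_BALANCES.txt

awk -v base=GL_BALANCES -v max=2 '
    NR == 1 { hdr = $0; next }           # remember the header line
    (NR - 2) % max == 0 {                # start the next numbered chunk
        if (out != "") close(out)
        out = sprintf("%s_%04d.txt", base, n++)
        print hdr > out
    }
    { print > out }
' GL_BALANCES.txt

ls GL_BALANCES_0*.txt
```

Everything used here (sprintf, close, print to a computed file name) is in 1988 awk, so it should behave the same under Solaris nawk.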
splitFiles=( GL_BALANCES GL_IMPORT_REFERENCES GL_JE_BATCHES GL_JE_HEADERS GL_JE_LINES )
for splitFileIdx in "${splitFiles[@]}"
do
if [[ -f ${splitFileIdx}.txt ]]; then
RIGHT_NOW=$(date +"%x %r %Z")
printf "%-35s %-20s %-20s\n" "${splitFileIdx}" "Start Time" "${RIGHT_NOW}" >> ${LOG_FILE}
SPLIT_COUNT=0
tail -n +2 ${splitFileIdx}.txt | split -d -a 4 -l ${row_num_split} - ${splitFileIdx}_
for file in ${splitFileIdx}_*
do
head -n 1 ${splitFileIdx}.txt > tmp_file
cat $file >> tmp_file
mv -f tmp_file $file
mv -f ${file} ${file}.txt
# ****************************************************
# Get the number of files and check for split success.
# if successful delete the original file
# ****************************************************
SPLIT_COUNT=$(find ${splitFileIdx}_* -type f | wc -l)
if [[ $SPLIT_COUNT -gt 1 ]]; then
rm -f ${splitFileIdx}.txt
else
mv -f ${file}.txt ${splitFileIdx}.txt
fi
done
RIGHT_NOW=$(date +"%x %r %Z")
printf "%-35s %-20s %-20s\n" "${splitFileIdx}" "End Time" "${RIGHT_NOW}" >> ${LOG_FILE}
fi
done
Please note: the code above works as expected on Linux and other operating systems, just not on SunOS.
We are happy to rewrite these commands for SunOS compatibility.
Any help is appreciated.
There is a lot of code there. What exactly does 'does not work' mean? Give an example of failure, please.
Failure means the code does not get executed.
Please look only at the awk command and the split command:
"$awkcmd" -F"#|#" -v c="${hashTableFields[index]}" 'NR==1{
for (i=1; i<=NF; i++)
if ($i==c)
{
p=i;
break
};
next
}
{
if ( $p=="|" || $p=="#|" || $p=="#" || $p=="|#" )
print $(p+1)
else
print $p
}' "${hashTables[index]}.txt" >> temp.hash.txt
tail -n +2 ${splitFileIdx}.txt | split -d -a 4 -l ${row_num_split} - ${splitFileIdx}_
As far as the awk part goes, I'd be tempted to have awk calculate the sum directly and get rid of the bc and round functions. How many decimal places of precision do you need in the final sum?
Something like this (4 digits of precision, change printf line for more/less):
"$awkcmd" -F"#|#" -v c="${hashTableFields[index]}" '
NR==1{
for (i=1; i<=NF; i++)
if ($i==c) {
p=i;
break
};
next
}
p{
if ( $p=="|" || $p=="#|" || $p=="#" || $p=="|#" )
sum+=$(p+1)
else
sum+=$p
}
END {
FN=FILENAME
gsub(/\.txt$/,"",FN)
printf "%s#|#%s#|#%0.4f\n",FN,c,sum
}
' "${hashTables[index]}.txt" >> ${HASHTOTAL_FILE}
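For example, with made-up data in the thread's format, this self-summing approach produces the hash-total line directly (the file content below is illustrative; sub() is used here to trim the .txt extension):

```shell
#!/usr/bin/env bash
# Quick check of the sum-in-awk idea with made-up data in the thread's
# #|# format; the real files would come from the Oracle extraction.
printf 'A#|#PERIOD_NET_CR#|#C\nx#|#1.5#|#y\nx#|#2.25#|#y\n' > GL_BALANCES.txt

awk -F'#[|]#' -v c=PERIOD_NET_CR '
    NR == 1 { for (i = 1; i <= NF; i++) if ($i == c) { p = i; break }; next }
    p       { sum += $p }
    END     {
        FN = FILENAME
        sub(/\.txt$/, "", FN)                    # trim the extension
        printf "%s#|#%s#|#%0.4f\n", FN, c, sum
    }
' GL_BALANCES.txt
# prints: GL_BALANCES#|#PERIOD_NET_CR#|#3.7500
```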
Check your split(1) manual on SunOS, but I believe the issue is -d
for numeric suffixes. You may have to forgo these on SunOS, or rename the files after split has done its bit.
Perhaps:
tail -n +2 ${splitFileIdx}.txt | split -a 4 -l ${row_num_split} - ${splitFileIdx}_
FNUM=1
for file in ${splitFileIdx}_*
do
head -n 1 ${splitFileIdx}.txt > tmp_file
cat $file >> tmp_file
mv -f tmp_file $(printf "%s_%04d.txt" ${splitFileIdx} $FNUM)
let FNUM=FNUM+1
...
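A fuller sketch of the same rename idea, with the header re-attachment mirroring the original loop. The sample data and the variable values are made up:

```shell
#!/usr/bin/env bash
# Sketch only: split without -d, then rename the alphabetic chunks to
# numbered *_NNNN.txt names. The sample file and the values of
# splitFileIdx/row_num_split are illustrative.
splitFileIdx=GL_BALANCES
row_num_split=2
printf 'HEADER\nr1\nr2\nr3\nr4\nr5\n' > "${splitFileIdx}.txt"

tail -n +2 "${splitFileIdx}.txt" | split -a 4 -l "$row_num_split" - "${splitFileIdx}_"
FNUM=0
for file in "${splitFileIdx}"_????; do   # split's alphabetic suffixes: aaaa, aaab, ...
    head -n 1 "${splitFileIdx}.txt" > tmp_file
    cat "$file" >> tmp_file
    mv -f tmp_file "$(printf '%s_%04d.txt' "$splitFileIdx" "$FNUM")"
    rm -f "$file"
    FNUM=$((FNUM + 1))
done
ls "${splitFileIdx}"_*.txt
```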
Step 1: verify that the $awkcmd variable has the right data in it. i.e.,
echo "$awkcmd"
should show nawk.
Then verify that nawk is in the PATH variable of the process running the code:
echo " I am running nawk which is `which nawk`"
Do that before playing with the awk code itself.
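A minimal preflight along those lines might look like this (a sketch; it only reports which awk would be used and whether it is on the PATH):

```shell
#!/usr/bin/env bash
# Preflight: confirm which awk will run before debugging the awk code.
case $(uname) in
    SunOS) awkcmd=nawk ;;
    *)     awkcmd=awk  ;;
esac
echo "awkcmd=$awkcmd"
if command -v "$awkcmd" >/dev/null 2>&1; then
    echo "I am running $awkcmd which is $(command -v "$awkcmd")"
else
    echo "$awkcmd is NOT in PATH" >&2
fi
```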
The awk code is clunky but looks more or less reasonable for nawk. If you get no output, you must rule out (or find) an error message about nawk not being found; and if $awkcmd is not assigned at all, you will get a shell error, because the shell will try to run the awk program text itself, which is gibberish to the shell.
You need to start looking from the very first line, IMO. Keep ruling out problems as you go forward. Do not simply jump ahead 30 lines.
It's been awhile since I dealt with Solaris' nawk, but it did have its quirks..
I'd start with: "$awkcmd" -F"#|#"
What do you think your field separator is? A string #|#? Maybe, or maybe not. It might be a regex: # OR #. I'd debug this first with a debug printf for each field. Or try "#[|]#" to force it into a literal string.
Also: if ( $p=="|" || $p=="#|" || $p=="#" || $p=="|#" ) doesn't make too much sense given your definition of the field separator.
Post sample input and the desired output.
To @vgersh99:
Our field separator is '#|#'. For example:
Col1#|#Col2#|#Col3
1#|#11#|#123
2#|#12#|#241
----------------------------------------
To @jim mcnamara:
We have this piece of code, which assigns the value 'nawk' to the variable $awkcmd:
# ------------------------------------------------------------------------
# Adding awk command support for SunOS -- use nawk for SunOS
# ------------------------------------------------------------------------
case `uname` in
SunOS) awkcmd=nawk ;;
*) awkcmd=awk ;;
esac
---------------------------------------
To @Chubler_XL:
We need 4 decimal places of precision in the final sum.
This piece of code works fine on all other OSes.
Pasting my entire shell script here in case it helps:
#!/usr/bin/env bash
erp="R12"
row_num_split=5000000
# ------------------------------------------------------------------------
# Adding awk command support for SunOS -- use nawk for SunOS
# ------------------------------------------------------------------------
case `uname` in
SunOS) awkcmd=nawk ;;
*) awkcmd=awk ;;
esac
# ------------------------------------------------------------------------
# Adding math function
# ------------------------------------------------------------------------
round() {
printf "%.${2}f" "${1}"
}
precision=4
# ------------------------------------------------------------------------
# Changing the case in case lower case entered for variable erp.
# Do not modify this section
# ------------------------------------------------------------------------
erp=`echo $erp | "$awkcmd" '{print toupper($0)}'`
erp=$(echo $erp | tr -d ' ')
# ************************************************************************
# Application substitution Variables
# ************************************************************************
tblFile="ORACLE_${erp}_OTC_INSTALL.sql"
pkgFile="ORACLE_${erp}_OTC_PKG.sql"
sqlFile="ORACLE_${erp}_EXTRACTION.sql"
drpFile="ORACLE_${erp}_OTC_UNINSTALL.sql"
tableList=( FND_APPLICATION_VL FND_FLEX_VALUE_SETS FND_FLEX_VALUES_TL FND_FLEX_VALUES_VL FND_ID_FLEX_SEGMENTS FND_ID_FLEX_STRUCTURES_VL FND_LANGUAGES_VL FND_LOOKUP_VALUES FND_SEGMENT_ATTRIBUTE_VALUES FND_TERRITORIES_TL FND_TERRITORIES_VL FND_USER GL_BALANCES GL_CODE_COMBINATIONS GL_IMPORT_REFERENCES GL_JE_BATCHES GL_JE_CATEGORIES_TL GL_JE_HEADERS GL_JE_LINES GL_JE_SOURCES_TL GL_LEDGER_NORM_SEG_VALS GL_LEDGERS GL_PERIOD_STATUSES GL_PERIODS HR_LEGAL_ENTITIES HR_ORGANIZATION_UNITS ORA_CLIENT_BUSINESS_STRUCTURE ORA_CLIENT_COA ORA_USER_TIMEZONES PER_ALL_ASSIGNMENTS_F PER_ALL_PEOPLE_F ORA_GL_SL_MANUAL_ENTRIES FND_PROFILE_OPTION_VALUES FND_PROFILE_OPTIONS FUN_CUSTOMER_MAPS HR_ALL_ORGANIZATION_UNITS HR_OPERATING_UNITS HZ_CUST_ACCOUNTS HZ_CUST_ACCT_SITES_ALL HZ_CUST_PROFILE_AMTS HZ_CUST_SITE_USES_ALL HZ_CUSTOMER_PROFILES HZ_PARTIES MTL_CROSS_REFERENCES_B MTL_ITEM_CATALOG_GROUPS_B MTL_ITEM_CATALOG_GROUPS_TL MTL_SYSTEM_ITEMS_B MTL_SYSTEM_ITEMS_TL MTL_UNITS_OF_MEASURE_TL OE_ORDER_HEADERS_ALL OE_ORDER_LINES_ALL OE_ORDER_SOURCES OE_TRANSACTION_TYPES_ALL OE_TRANSACTION_TYPES_TL ORG_ORGANIZATION_DEFINITIONS PER_BUSINESS_GROUPS RA_CUST_TRX_LINE_GL_DIST_ALL RA_CUST_TRX_TYPES_ALL RA_CUSTOMER_TRX_ALL RA_CUSTOMER_TRX_LINES_ALL WSH_DELIVERY_ASSIGNMENTS WSH_DELIVERY_DETAILS WSH_DELIVERY_LEGS WSH_NEW_DELIVERIES OE_DROP_SHIP_SOURCES PO_DISTRIBUTIONS_DROP_SHIP_ALL PO_HEADERS_DROP_SHIP_ALL PO_LINE_LOCATIONS_DROP_SHIP_ALL PO_LINES_DROP_SHIP_ALL RA_TERMS_LINES RA_TERMS_LINES_DISCOUNTS RA_TERMS_TL RA_TERMS_B FND_DESCR_FLEX_CONTEXTS_TL FND_DESCR_FLEX_COL_USAGE_TL ORA_TRX_TBL )
splitFiles=( FND_APPLICATION_VL FND_FLEX_VALUE_SETS FND_FLEX_VALUES_TL FND_FLEX_VALUES_VL FND_ID_FLEX_SEGMENTS FND_ID_FLEX_STRUCTURES_VL FND_LANGUAGES_VL FND_LOOKUP_VALUES FND_SEGMENT_ATTRIBUTE_VALUES FND_TERRITORIES_TL FND_TERRITORIES_VL FND_USER GL_BALANCES GL_CODE_COMBINATIONS GL_IMPORT_REFERENCES GL_JE_BATCHES GL_JE_CATEGORIES_TL GL_JE_HEADERS GL_JE_LINES GL_JE_SOURCES_TL GL_LEDGER_NORM_SEG_VALS GL_LEDGERS GL_PERIOD_STATUSES GL_PERIODS HR_LEGAL_ENTITIES HR_ORGANIZATION_UNITS ORA_CLIENT_BUSINESS_STRUCTURE ORA_CLIENT_COA ORA_USER_TIMEZONES PER_ALL_ASSIGNMENTS_F PER_ALL_PEOPLE_F ORA_GL_SL_MANUAL_ENTRIES FND_PROFILE_OPTION_VALUES FND_PROFILE_OPTIONS FUN_CUSTOMER_MAPS HR_ALL_ORGANIZATION_UNITS HR_OPERATING_UNITS HZ_CUST_ACCOUNTS HZ_CUST_ACCT_SITES_ALL HZ_CUST_PROFILE_AMTS HZ_CUST_SITE_USES_ALL HZ_CUSTOMER_PROFILES HZ_PARTIES MTL_CROSS_REFERENCES_B MTL_ITEM_CATALOG_GROUPS_B MTL_ITEM_CATALOG_GROUPS_TL MTL_SYSTEM_ITEMS_B MTL_SYSTEM_ITEMS_TL MTL_UNITS_OF_MEASURE_TL OE_ORDER_HEADERS_ALL OE_ORDER_LINES_ALL OE_ORDER_SOURCES OE_TRANSACTION_TYPES_ALL OE_TRANSACTION_TYPES_TL ORG_ORGANIZATION_DEFINITIONS PER_BUSINESS_GROUPS RA_CUST_TRX_LINE_GL_DIST_ALL RA_CUST_TRX_TYPES_ALL RA_CUSTOMER_TRX_ALL RA_CUSTOMER_TRX_LINES_ALL WSH_DELIVERY_ASSIGNMENTS WSH_DELIVERY_DETAILS WSH_DELIVERY_LEGS WSH_NEW_DELIVERIES OE_DROP_SHIP_SOURCES PO_DISTRIBUTIONS_DROP_SHIP_ALL PO_HEADERS_DROP_SHIP_ALL PO_LINE_LOCATIONS_DROP_SHIP_ALL PO_LINES_DROP_SHIP_ALL RA_TERMS_TL RA_TERMS_B FND_DESCR_FLEX_CONTEXTS_TL FND_DESCR_FLEX_COL_USAGE_TL ORA_TRX_TBL )
hashTables=( GL_BALANCES GL_IMPORT_REFERENCES GL_JE_BATCHES GL_JE_HEADERS GL_JE_LINES )
hashTableFields=( PERIOD_NET_CR HASH_IDX HASH_IDX HASH_IDX ACCOUNTED_CR )
# ************************************************************************
# Program logic begins
# ************************************************************************
# ************************************************************************
# Get APPS login credentials
# ************************************************************************
echo -n "APPS username : "
read apps_user
read -s -p "APPS password : " apps_pwd
echo " "
echo "*******************************************************************"
echo " Enter Extraction Program Parameters "
echo "*******************************************************************"
echo " "
echo "-------------------------------------------------------------------"
echo "Please note we need to install custom table and package in APPS "
echo "which will be dropped at the end of the extraction program          "
echo "-------------------------------------------------------------------"
echo " "
#------------------------- Enter Parameters ------------------------------
echo -n "Ledger ID(s)/SOB ID(s) separated by commas : "
read v_ledger_ids
echo -n " Extraction Start Date (YYYY/MM/DD) : "
read v_analysis_start_date
echo -n " Extraction End Date (YYYY/MM/DD) : "
read v_analysis_end_date
#-------------------------------------------------------------------------
echo "*******************************************************************"
echo "Printing current path : "`pwd`
echo "Please do not use leading slashes while entering the path "
echo -n "Absolute path where you kept SQL Files : "
read exec_file_path
echo -n "Absolute path where you want to generate the o/p files : "
read out_file_path
echo "*******************************************************************"
# ************************************************************************
# Clean the paths for leading slashes if entered
# ************************************************************************
if [[ "$exec_file_path" == */ ]]; then
exec_file_path=`echo $exec_file_path | rev | cut -c 2- | rev`
fi
if [[ "$out_file_path" == */ ]]; then
out_file_path=`echo $out_file_path | rev | cut -c 2- | rev`
fi
# ************************************************************************
# Verify file paths provided
# ************************************************************************
if [[ -z $exec_file_path || -z $out_file_path ]]; then
echo "File paths entered are not valid. Please try again."
exit 2
fi
echo "********************************************************************"
echo " Verifying the paths entered are valid directories "
echo "********************************************************************"
if [[ -d $exec_file_path ]]; then
echo "$exec_file_path status : Validated"
else
echo "$exec_file_path status : Invalid"
exit 2
fi
if [[ -d $out_file_path ]]; then
echo "$out_file_path status : Validated"
else
echo "$out_file_path status : Invalid"
exit 2
fi
RIGHT_NOW=$(date +"%x %r %Z")
FILE_TIMESTAMP=$(date "+%Y.%m.%d_%H.%M.%S")
cd $out_file_path
# Check for files or directories
files=`ls -1 | wc -l`
if [[ $files -gt 0 ]]; then
echo "$out_file_path location is not empty. Please ensure it is free of any files or directories to proceed"
exit 2
fi
LOG_FILE=LOG_EXTRACTION_${erp}_${FILE_TIMESTAMP}.log
RECORD_COUNT_FILE=LOG_RECORDCOUNT_${erp}_${FILE_TIMESTAMP}.log
HASHTOTAL_FILE=LOG_HASHTOTALS_${erp}_${FILE_TIMESTAMP}.log
# ************************************************************************
# Logging the parameters entered by ORACLE
# ************************************************************************
echo " *************************************************** " >> ${LOG_FILE}
echo " Master Oracle Extraction " >> ${LOG_FILE}
echo " Date : ${RIGHT_NOW} " >> ${LOG_FILE}
echo " *************************************************** " >> ${LOG_FILE}
echo " erp : ${erp} " >> ${LOG_FILE}
echo " OS : `uname -a` " >> ${LOG_FILE}
echo " v_ledger_ids : ${v_ledger_ids} " >> ${LOG_FILE}
echo " v_analysis_start_date : ${v_analysis_start_date} " >> ${LOG_FILE}
echo " v_analysis_end_date : ${v_analysis_end_date} " >> ${LOG_FILE}
echo " row_num_split : ${row_num_split} " >> ${LOG_FILE}
echo " --------------------------------------------------- " >> ${LOG_FILE}
echo " Execution file path : ${exec_file_path} " >> ${LOG_FILE}
echo " output file path : ${out_file_path} " >> ${LOG_FILE}
echo " --------------------------------------------------- " >> ${LOG_FILE}
# ************************************************************************
# Removing the spaces from variables if entered
# ************************************************************************
v_ledger_ids=$(echo $v_ledger_ids | tr -d ' ')
v_analysis_start_date=$(echo $v_analysis_start_date | tr -d ' ')
v_analysis_end_date=$(echo $v_analysis_end_date | tr -d ' ')
# ************************************************************************
echo " --------------------------------------------------- " >> ${LOG_FILE}
echo " Parameters after removing spaces " >> ${LOG_FILE}
echo " --------------------------------------------------- " >> ${LOG_FILE}
echo " v_ledger_ids : ${v_ledger_ids} " >> ${LOG_FILE}
echo " v_analysis_start_date : ${v_analysis_start_date} " >> ${LOG_FILE}
echo " v_analysis_end_date : ${v_analysis_end_date} " >> ${LOG_FILE}
echo " --------------------------------------------------- " >> ${LOG_FILE}
# ************************************************************************
# Verify execution files existence
# ************************************************************************
sqlFile_exists="0"
tblFile_exists="0"
pkgFile_exists="0"
drpFile_exists="0"
echo "Verifying files existence in the execution file location...." >> ${LOG_FILE}
if [[ ! -z $sqlFile ]]; then
if [[ -f ${exec_file_path}/$sqlFile ]]; then
sqlFile_exists="1"
echo "$sqlFile file exists...." >> ${LOG_FILE}
else
echo "$sqlFile file does not exist...." >> ${LOG_FILE}
exit 2
fi
fi
if [[ ! -z $tblFile ]]; then
if [[ -f ${exec_file_path}/$tblFile ]]; then
tblFile_exists="1"
echo "$tblFile file exists...." >> ${LOG_FILE}
else
echo "$tblFile file does not exist...." >> ${LOG_FILE}
fi
fi
if [[ ! -z $pkgFile ]]; then
if [[ -f ${exec_file_path}/$pkgFile ]]; then
pkgFile_exists="1"
echo "$pkgFile file exists...." >> ${LOG_FILE}
else
echo "$pkgFile file does not exist...." >> ${LOG_FILE}
fi
fi
if [[ ! -z $drpFile ]]; then
if [[ -f ${exec_file_path}/$drpFile ]]; then
drpFile_exists="1"
echo "$drpFile file exists...." >> ${LOG_FILE}
else
echo "$drpFile file does not exist...." >> ${LOG_FILE}
fi
fi
echo " "
echo " "
echo " 1. Verification of paths provided and sql files existence : Complete. $(date +"%x %r %Z")"
# ************************************************************************
# Create the Record Count and Hashtotal file
# ************************************************************************
echo "FILENAME#|#TABLENAME#|#RECORD_COUNT#|#DATE#|#TIME#|#EXECUTION_TIME" >> ${RECORD_COUNT_FILE}
echo "TABLENAME#|#FIELD#|#TOTAL" >> ${HASHTOTAL_FILE}
# ************************************************************************
# Execute SQL
# ************************************************************************
# ---------------------------------------------
# Install Table
# ---------------------------------------------
if [ ${tblFile_exists} = "1" ]; then
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " ${tblFile} execution started at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
sqlplus ${apps_user}/${apps_pwd} @${exec_file_path}/${tblFile} <<EOF >> ${LOG_FILE}
WHENEVER OSERROR EXIT 9;
WHENEVER SQLERROR EXIT 2;
EOF
if [[ $? -eq 0 ]]; then
echo "SQL Call Successful for ${tblFile}" >> ${LOG_FILE}
else
echo "SQL Error in ${tblFile}" >> ${LOG_FILE}
exit 2
fi
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " ${tblFile} execution completed at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
fi
#*************************************************************************
# ---------------------------------------------
# Install Package
# ---------------------------------------------
if [ ${pkgFile_exists} = "1" ]; then
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " ${pkgFile} install started at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
sqlplus ${apps_user}/${apps_pwd} @${exec_file_path}/${pkgFile} <<EOF >> ${LOG_FILE}
WHENEVER OSERROR EXIT 9;
WHENEVER SQLERROR EXIT 2;
EOF
if [[ $? -eq 0 ]]; then
echo "SQL Call Successful for ${pkgFile}" >> ${LOG_FILE}
else
echo "SQL Error in ${pkgFile}" >> ${LOG_FILE}
exit 2
fi
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " ${pkgFile} install completed at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
fi
#*************************************************************************
# ---------------------------------------------
# Execute Package
# ---------------------------------------------
if [ ${pkgFile_exists} = "1" ]; then
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " ORA_OTC_EXTRACTION_PKG execution started at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
sqlplus ${apps_user}/${apps_pwd} << EOF >> ${LOG_FILE}
SET TIMING ON
SET SERVEROUTPUT ON
WHENEVER SQLERROR EXIT FAILURE;
WHENEVER OSERROR EXIT FAILURE;
DECLARE
v_msg VARCHAR2 (500);
v_status NUMBER;
BEGIN
ORA_OTC_EXTRACTION_PKG.ORA_TRX_TBL_POPULATE('${v_analysis_start_date}','${v_analysis_end_date}','${v_ledger_ids}',v_msg,v_status);
END;
/
EOF
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " ORA_OTC_EXTRACTION_PKG execution completed at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
fi
#*************************************************************************
# ---------------------------------------------
# Data extraction SQL execution
# ---------------------------------------------
if [ ${sqlFile_exists} = "1" ]; then
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " Extraction started at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
sqlplus ${apps_user}/${apps_pwd} @${exec_file_path}/${sqlFile} ${v_ledger_ids} ${v_analysis_start_date} ${v_analysis_end_date} <<EOF >> ${LOG_FILE}
WHENEVER OSERROR EXIT 9;
WHENEVER SQLERROR EXIT 2;
EOF
if [[ $? -eq 0 ]]; then
echo "SQL Call Successful for ${sqlFile}" >> ${LOG_FILE}
else
echo "SQL Error in ${sqlFile}" >> ${LOG_FILE}
exit 2
fi
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " Extraction completed at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
fi
echo " 3. Data extraction : Complete. $(date +"%x %r %Z")"
#*************************************************************************
# ---------------------------------------------
# Drop table and package
# ---------------------------------------------
if [ ${drpFile_exists} = "1" ]; then
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " ${drpFile} execution started at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
sqlplus ${apps_user}/${apps_pwd} @${exec_file_path}/${drpFile} <<EOF >> ${LOG_FILE}
WHENEVER OSERROR EXIT 9;
WHENEVER SQLERROR EXIT 2;
EOF
if [[ $? -eq 0 ]]; then
echo "SQL Call Successful for ${drpFile}" >> ${LOG_FILE}
else
echo "SQL Error in ${drpFile}" >> ${LOG_FILE}
exit 2
fi
RIGHT_NOW=$(date +"%x %r %Z")
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " ${drpFile} execution completed at : ${RIGHT_NOW} " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
fi
#*************************************************************************
# ************************************************************************
# Generate Report
# ************************************************************************
echo "-------------------------------------------------------------------" >> ${LOG_FILE}
echo " Evaluation Report " >> ${LOG_FILE}
echo "-------------------------------------------------------------------" >> ${LOG_FILE}
cnt=${#tableList[@]}
printf "%-10s %-40s %-10s\n" "No" "Table" "File Exists" >> ${LOG_FILE}
for (( index=0; index < $cnt; index++ ))
do
fileExists=0
slNo=0
slNo=$(($index+1))
fileExists=`ls ${tableList[index]}.txt |wc -l`
if [[ fileExists -gt 0 ]]; then
printf "%-10s %-40s %-10s\n" "${slNo}" "${tableList[index]}" "Yes" >> ${LOG_FILE}
else
printf "%-10s %-40s %-10s\n" "${slNo}" "${tableList[index]}" "No" >> ${LOG_FILE}
fi
done
echo " 4. Generate extraction evaluation report : Complete. $(date +"%x %r %Z")"
# ************************************************************************
# Hash Total Calculation
# ************************************************************************
# -- Register hash total only when the OS is not SunOS
if [[ ${awkcmd} = "awk" ]]; then
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " Register hash total log " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
if [ "${#hashTables[@]}" -ne "${#hashTableFields[@]}" ]; then
echo "There is problem in hash variables substitution .... " >> ${LOG_FILE}
exit 2;
fi
cnt=${#hashTables[@]}
if [ $cnt -gt 0 ]
then
printf "%-35s %-20s %-20s\n" "Hash Table" "Event" "Time" >> ${LOG_FILE}
fi
for (( index=0; index < $cnt; index++ ))
do
RIGHT_NOW=$(date +"%x %r %Z")
printf "%-35s %-20s %-20s\n" "${hashTables[index]}" "Start Time" "${RIGHT_NOW}" >> ${LOG_FILE}
sum=0
"$awkcmd" -F"#|#" -v c="${hashTableFields[index]}" 'NR==1{
for (i=1; i<=NF; i++)
if ($i==c)
{
p=i;
break
};
next
}
{
if ( $p=="|" || $p=="#|" || $p=="#" || $p=="|#" )
print $(p+1)
else
print $p
}' "${hashTables[index]}.txt" >> temp.hash.txt
# ----------------------------------------------------------------
# Remove blank lines from temp.hash.txt. This should resolve the parser issue if it finds blank values
# ----------------------------------------------------------------
sed '/^$/d' temp.hash.txt > temp.hash.out
mv -f temp.hash.out temp.hash.txt
sum=`"$awkcmd" ' BEGIN {print "sum=0"} {print "sum += " $1; asum += $1} END {printf "sum\nscale=4\nsum=sum/1\n"}' temp.hash.txt | bc`
# ----------------------------------------------------------------
# Calling the function to convert hashtotals in correct precision
# ----------------------------------------------------------------
sum=`round ${sum} ${precision}`
if [ $? -eq 0 ]; then
# Print the hash total
echo "${hashTables[index]}#|#${hashTableFields[index]}#|#${sum}" >> ${HASHTOTAL_FILE}
fi
rm -f temp.hash.txt
RIGHT_NOW=$(date +"%x %r %Z")
printf "%-35s %-20s %-20s\n" "${hashTables[index]}" "End Time" "${RIGHT_NOW}" >> ${LOG_FILE}
done
echo " 5. Register hash total log : Complete. $(date +"%x %r %Z")"
fi
# ************************************************************************
# Split Files
# ************************************************************************
if [[ ${awkcmd} = "awk" ]]; then
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
echo " Split Files " >> ${LOG_FILE}
echo "------------------------------------------------------------------- " >> ${LOG_FILE}
cnt=${#splitFiles[@]}
if [ $cnt -gt 0 ]
then
printf "%-35s %-20s %-20s\n" "Split Table" "Event" "Time" >> ${LOG_FILE}
fi
for splitFileIdx in "${splitFiles[@]}"
do
if [[ -f ${splitFileIdx}.txt ]]; then
RIGHT_NOW=$(date +"%x %r %Z")
printf "%-35s %-20s %-20s\n" "${splitFileIdx}" "Start Time" "${RIGHT_NOW}" >> ${LOG_FILE}
SPLIT_COUNT=0
tail -n +2 ${splitFileIdx}.txt | split -d -a 4 -l ${row_num_split} - ${splitFileIdx}_
for file in ${splitFileIdx}_*
do
head -n 1 ${splitFileIdx}.txt > tmp_file
cat $file >> tmp_file
mv -f tmp_file $file
mv -f ${file} ${file}.txt
done
# ****************************************************
# Get the number of files and check for split success.
# if successful delete the original file
# ****************************************************
SPLIT_COUNT=$(find ${splitFileIdx}_* -type f | wc -l)
if [[ $SPLIT_COUNT -gt 1 ]]; then
rm -f ${splitFileIdx}.txt
else
mv -f ${file}.txt ${splitFileIdx}.txt
fi
RIGHT_NOW=$(date +"%x %r %Z")
printf "%-35s %-20s %-20s\n" "${splitFileIdx}" "End Time" "${RIGHT_NOW}" >> ${LOG_FILE}
fi
done
echo " 6. Split files if applicable : Complete. $(date +"%x %r %Z")"
fi
# ************************************************************************
# Record Count Calculation
# ************************************************************************
echo "-------------------------------------------------------------------" >> ${LOG_FILE}
echo " Register record count log " >> ${LOG_FILE}
echo "-------------------------------------------------------------------" >> ${LOG_FILE}
printf "%-35s %-20s %-20s\n" "File" "Event" "Time" >> ${LOG_FILE}
for file in *.txt; do
if [[ $file != *"CD_ORA"* ]]; then
printf "%-35s %-20s %-20s\n" "$file" "Start Time" "${RIGHT_NOW}" >> ${LOG_FILE}
today=$(date +"%D")
ORIG_NUM_LINES=$(wc -l < "$file")
ACTUAL_NUM_LINES=$(($ORIG_NUM_LINES-1))
time_now="$(date +%T)"
if [[ "$ORIG_NUM_LINES" -gt "$row_num_split" ]]; then
echo "${file}#|#${file%_0*}#|#${ACTUAL_NUM_LINES}#|#${today}#|#${time_now}#|#0" >> ${RECORD_COUNT_FILE}
else
echo "${file}#|#${file%.*}#|#${ACTUAL_NUM_LINES}#|#${today}#|#${time_now}#|#0" >> ${RECORD_COUNT_FILE}
fi
RIGHT_NOW=$(date +"%x %r %Z")
printf "%-35s %-20s %-20s\n" "$file" "End Time" "${RIGHT_NOW}" >> ${LOG_FILE}
fi
done
echo " 7. Record count calculation for each file : Complete. $(date +"%x %r %Z")"
# ************************************************************************
# Identify Possible Errors
# ************************************************************************
echo "-------------------------------------------------------------------" >> ${LOG_FILE}
echo " Possible errors in files " >> ${LOG_FILE}
echo "-------------------------------------------------------------------" >> ${LOG_FILE}
grep -il "ORA ERROR" *.txt >> ${LOG_FILE}
echo " 8. Identify possible errors for each file : Complete. $(date +"%x %r %Z")"
echo " "
echo "******************************************************************** "
echo " Thank you for choosing Automated Oracle extraction. "
echo " Extraction process completed at $(date +"%x %r %Z") "
echo " Output file path ${out_file_path} "
echo "******************************************************************** "
echo " " >> ${LOG_FILE}
echo "******************************************************************** " >> ${LOG_FILE}
echo " Thank you for choosing Automated Oracle extraction. " >> ${LOG_FILE}
echo " Extraction process completed at $(date +"%x %r %Z") " >> ${LOG_FILE}
echo " Output file path ${out_file_path} " >> ${LOG_FILE}
echo "******************************************************************** " >> ${LOG_FILE}
# Rename log files
for file in *.log
do
mv -f "$file" "${file%.log}.txt"
done
vgersh99:
I'd start with: "$awkcmd" -F"#|#". What do you think your field separator is? A string #|#? Maybe, or maybe not. Or try "#[|]#" to force it into a string.
Setting FS to #|# (via direct assignment or by using the -F option) should effectively set the field separator to an octothorpe (i.e. #) on any awk based on the version of awk that Aho, Weinberger, and Kernighan released in 1988. The /usr/bin/awk or /bin/awk provided on SunOS and System V was an earlier version of awk.
So when the real field separator is #|# instead of the # matched by FS, the code will frequently find a field containing just |. But it will never find a field containing #|, #, or |#, since # is the field separator.
As you said, using -F"#[|]#" instead of -F"#|#" should get rid of that problem. But, since there is no | in the hashTables[] array, I don't see why that check would ever be needed, as long as the field separators in the 1st line of an input file are the same as the field separators on the other lines in that file.
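To illustrate the point with a made-up line in the thread's format: even a modern awk treats the unbracketed separator as the alternation "#" OR "#", i.e. just "#", so the "|" becomes a field of its own.

```shell
#!/usr/bin/env bash
# Demonstrates why -F"#|#" misbehaves: as an ERE it matches only "#",
# so "|" is left as a separate field. The sample line is illustrative.
line='1#|#11#|#123'

echo "$line" | awk -F'#|#'   '{ print NF }'   # 5 fields: 1, |, 11, |, 123
echo "$line" | awk -F'#[|]#' '{ print NF }'   # 3 fields: 1, 11, 123
```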
The field separator is #|#
Did my proposal from post # for the new split(1) command fix the issues?
If -F"#[|]#" is still not correcting the issue, can you give us more details on what output / error messages you get when it fails (as requested in post #2)?