My requirement is to put all the files from the output directory (ATRPU) into the archive directory (archive/), creating a new folder with a date-time stamp (e.g. 20100212_120014) on every run, where ${IMF_TARGET_DIR} is my base directory and ${IMF_ARCHIVE_DIR} is my archive directory:
IMF_TARGET_DIR=/grid/PowerCenter/stage/r3_dev/outbound/ATRPU # local target file directory
IMF_ARCHIVE_DIR=${IMF_TARGET_DIR}/archive # archive directory
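A stamp in that 20100212_120014 format can be produced with the date command; a minimal sketch (assuming a POSIX-compatible date, and that DATE_TIME is set once per run rather than per file):

```shell
#!/bin/sh
# Build a YYYYMMDD_HHMMSS stamp, e.g. 20100212_120014
DATE_TIME=$(date +%Y%m%d_%H%M%S)
echo "${DATE_TIME}"
```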
I have 3 files and a folder called archive/ in my IMF_TARGET_DIR, and I need to keep these 3 files in the new timestamped archive folder. For that, my shell script is as below, and I am getting the following error:
2010/02/12 12:01:47 : Running function archive_file
/grid/PowerCenter/stage/velocity_r3_dev/outbound/ATRPU/ff_pre_sanity_check.bad
2010/02/12 12:01:47 : Running function archive_file
/grid/PowerCenter/stage/velocity_r3_dev/outbound/ATRPU/ff_pre_sanity_check.out
2010/02/12 12:01:47 : Running function archive_file
/grid/PowerCenter/stage/velocity_r3_dev/outbound/ATRPU/tgt_atu_rp.bad
2010/02/12 12:01:47 : Running function archive_file
/grid/PowerCenter/stage/velocity_r3_dev/outbound/ATRPU/IMF_ATRPU_12022010114110.csv
2010/02/12 12:01:47 : ERROR : Failed to archive file
/grid/PowerCenter/stage/velocity_r3_dev/outbound/ATRPU/IMF_ATRPU_12022010114110.csv
My shell script:
for file in `find ${IMF_TARGET_DIR} -type f`
do
    FN=`basename ${file}`
    echo "${FN}"
    # archive the input file after the workflow has completed
    co_call_function "archive_file ${IMF_TARGET_DIR}/${FN}"
done
archive_file()
{
    ${DEBUG}

    # local parameters
    local FILE_NAME=$1

    # create the target-system-specific archive directory if it does not exist
    co_create_directory ${IMF_ARCHIVE_DIR}/${DATE_TIME} ${OSLOG}

    # archive the delivered data file
    cp ${FILE_NAME} ${IMF_ARCHIVE_DIR}/${DATE_TIME} 1>> ${OSLOG} 2>&1
    co_error_check $? "Failed to archive file ${FILE_NAME}"
}
Please help me, this is very urgent for me.
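One thing I suspect: the recursive find also descends into archive/, and basename then strips the subdirectory path, so the script ends up looking for files under ${IMF_TARGET_DIR} that only exist inside the archive folder. Below is a minimal self-contained sketch of the behaviour I am after, using a throwaway sandbox directory and made-up file names instead of my real environment; it iterates only over the top-level files so archive/ is never picked up:

```shell
#!/bin/sh
# Standalone sketch only: mktemp gives a throwaway sandbox standing in
# for the real IMF_TARGET_DIR; the file names here are invented.
IMF_TARGET_DIR=$(mktemp -d)
IMF_ARCHIVE_DIR="${IMF_TARGET_DIR}/archive"
mkdir -p "${IMF_ARCHIVE_DIR}"
touch "${IMF_TARGET_DIR}/a.bad" "${IMF_TARGET_DIR}/b.out" "${IMF_TARGET_DIR}/c.csv"

# One timestamped folder per run, e.g. archive/20100212_120014
DATE_TIME=$(date +%Y%m%d_%H%M%S)
mkdir -p "${IMF_ARCHIVE_DIR}/${DATE_TIME}"

# A top-level glob instead of a recursive find: the archive/ directory
# fails the -f test and is skipped, so already-archived files are never
# picked up again.
for file in "${IMF_TARGET_DIR}"/*
do
    [ -f "${file}" ] || continue
    cp "${file}" "${IMF_ARCHIVE_DIR}/${DATE_TIME}/" || {
        echo "ERROR: failed to archive ${file}" >&2
        exit 1
    }
done
```

An equivalent fix that keeps find would be to add -maxdepth 1 (or a -path filter excluding archive/) so it stays at the top level.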