Help rewriting my KSH-script...

Hi!

I'm a beginner at scripting, but I have managed to complete a working KSH script. :smiley:

But since this is going to production in a few weeks, I would like to optimize it and make it better structured!

The idea of the script is to collect data from the database, put it in a file, and then ftp it to another system.

The file has to look like this, $FILE:
0705959226:0616319971
0705989262:0686317272
0705919275:0656319473
0730907151:066616145

But when I spool the result from sqlplus I get this (and I have to get rid of empty spaces, blank lines, and the non-numeric lines)
Numbers.lst:
0705959226 :0616319971

0705989262 :0686317272
0705919275 :0656319473

185 rows selected.

So the file needs a fair bit of massaging to look right.

The script I made looks like this, and it is the awk cleanup part that I'm frustrated over! It has to be done in a better way:

#!/usr/bin/ksh
#

# Source the profile to set environment variables
. /.profile

DATA_FILES_DIR=/data_files/Numbers
FILE=Numbers_`date '+%Y%m%d'`
FTPSERVER=
FTPLOGIN=
FTPPASS=
FTPDIR=

export DATA_FILES_DIR FILE FTPSERVER FTPLOGIN FTPPASS FTPDIR

sqlplus /nolog @/scripts/Numbers.sql

cd $DATA_FILES_DIR

awk 'NF >= 2' Numbers.lst > Numbers.tmp    # keep lines with at least two fields (drops blank lines)
awk 'NF <= 2' Numbers.tmp > Numbers.tmp2   # keep lines with at most two fields (drops the "rows selected" line)
awk '{ print $1 $2 }' Numbers.tmp2 > $FILE # join the two fields without the space
rm Numbers.tmp*
rm Numbers.lst

#FTP file to another system

ftp -n -i $FTPSERVER <<EOF
user $FTPLOGIN $FTPPASS
cd $FTPDIR
put $FILE
quit
EOF

mv $FILE processed
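
The only simplification I have come up with myself is collapsing the three awk passes into one, since together they just keep the lines that have exactly two fields (a sketch, not tested against the real spool file):

awk 'NF == 2 { print $1 $2 }' Numbers.lst > $FILE   # keep two-field lines and join them
rm Numbers.lst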

Any ideas or inputs?
Please...

/Linda



Instead of doing the changes in unix, you can add some more lines to your Numbers.sql.

Use trim(column) on the columns you select in the SQL.

set pagesize 0;
set feedback off;
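
I haven't seen your Numbers.sql, but something along these lines should give a clean spool file straight away. The table and column names (phone_numbers, a_number, b_number) are just placeholders; keep your own connect line, spool path, and query:

-- Numbers.sql (sketch with placeholder names)
set pagesize 0
set feedback off
set trimspool on
-- keep your existing spool path here
spool Numbers.lst

select trim(a_number) || ':' || trim(b_number)
from phone_numbers;

spool off
exit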

Great thinking, palsevlohit_123...
:smiley:
It worked just fine to make the file ok in sqlplus!
So much easier than in unix.
I'm so thankful!
/Linda