Any way to improve the performance of this script?

I have a 2 GB data file.

I need to do all of the following, but it is taking hours. Is there anywhere I can improve performance? Thanks a lot.

#!/usr/bin/ksh
echo TIMESTAMP="$(date +'%y-%m-%d.%H-%M-%S')"
function showHelp {
cat << EOF >&2
syntax extreme.sh FILENAME
Specify filename to parse
EOF
echo "$" >&2
exit 1;
}
if [ $# -gt 1 -o $# -eq 0 ]; then showHelp "Please Specify Only One File\n";fi
#if [ $# -eq 0 ]; then showHelp "Please Specify Only One File\n";fi
sed -e "/$UNKNOWN SEGMENT/d" $1 > $1.tmp
mv $1.tmp $1
# NOTE: uniq only collapses adjacent duplicates, so unless the file is already
# grouped on columns 216-231 this loop runs once per run of identical keys,
# not once per distinct key; sort -u may be what is wanted here
for i in `cut -c216-231 "$1" | uniq`
do
awk "/$i/" "$1" > "$i.file"      # rescans the entire file for every key
split -a 4 -l 20000 $i.file $i.file
rm $i.file
typeset -Z2 FILES                               # keep the chunk counter zero-padded to two digits
FILES=01
NUM_OF_FILES="$(ls ${i}* | wc -l | tr -d ' ')"  # chunk count for this key; it does not change inside the loop
if [ $NUM_OF_FILES -lt 10 ]; then
NUM_OF_FILES=0$NUM_OF_FILES                     # zero-pad the count to two digits
fi
for y in `ls ${i}*`                             # the 20000-line chunks written by split
do
# overwrite fixed columns with the chunk number ($FILES) and chunk count ($NUM_OF_FILES),
# then prefix a 6-digit, zero-padded line number
sed -e "s/./$NUM_OF_FILES/230" -e "s/.//232" -e "s/../$FILES/110" -e "s/../$FILES/225" -e "s/./$NUM_OF_FILES/468" -e "s/.//470" "$y" | nl -nrz -ba -w6 > "$y.tmp"
mv "$y.tmp" "$y"
FILES=`expr $FILES + 1`
done
cat ${i}.file???? | nl -nrz -ba -w7 > "$i.file"   # re-merge this key's chunks, adding a 7-digit line number
rm ${i}.file????                                  # remove only the four-letter split chunks
done
cat *.file > $1.new
# NOTE: cut prints fields in their original order, so "-f2,1-" does not actually
# swap anything; an awk swap of the first two fields would be needed if that is the intent
cut -f2,1- "$1.new" | sed 's/ //1' > "$1.new.tmp"
mv $1.new.tmp $1.new
# replace every occurrence of the 13-character field starting at column 525 with the
# 13-character field at columns 1-13 (gsub treats the old value as a regular expression),
# then drop the leading tab-separated field
awk '{one=substr($0,1,13); two=substr($0,525,13); gsub(two,one); print}' "$1.new" | cut -f2- > "$1.new.tmp"
mv $1.new.tmp $1.new
rm -f *.file
echo "Please Check the Modified \"$1.new\" in the current directory"
echo TIMESTAMP="$(date +'%y-%m-%d.%H-%M-%S')"
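
In case it helps narrow down where the hours go, here is a minimal sketch of how I could time the big phases separately (ksh has a built-in time keyword, so the existing pipelines can just be prefixed with it):

# minimal timing sketch: prefix the expensive pipelines with ksh's "time"
time sed -e "/$UNKNOWN SEGMENT/d" "$1" > "$1.tmp"
time cut -c216-231 "$1" | uniq > /dev/null       # roughly how long one full read of the file takes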

run it on a faster box?

What exactly is the script doing?

Too many awk, sed, and cat calls lying around. What exactly are you trying to do?
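
For a start, the outer loop reruns awk over the whole 2 GB file once for every key it finds, so the file gets read end to end many times. A single awk pass can split it by the fixed-width key in columns 216-231 in one read. This is only a rough sketch of the idea, not a drop-in replacement for the rest of the script:

# hedged sketch: split by the key at columns 216-231 in one pass over the file
awk '{
    key = substr($0, 216, 16)      # the same field the script cuts out
    print > (key ".file")          # append the record to that key's file
}' "$1"
# (older awks may need close() calls if there are very many distinct keys)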

I have a big data file. These are the things I am doing:

1. Split the file by a common pattern, then split each group into 20000-line chunks.
2. Stamp the count of split files into each chunk with sed, and add line numbers.
3. Merge the same-pattern files back together and add line numbers again.
4. Swap the first two columns and make a substitution at another position in the file.
5. Finally, merge all the files back into one.
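
If the 20000-line split is only there so each chunk can be stamped with a chunk number, a chunk count and a line number, that whole inner loop could probably be a single awk pass per key group, with no intermediate chunk files on disk at all. A rough sketch; col=110 is just a placeholder, since the real positions are whatever those sed expressions encode and would need to be mapped carefully:

# hedged sketch: number the lines and stamp a 2-digit chunk number in one pass,
# instead of split + sed + nl + cat per group; "col" is a placeholder offset
awk -v col=110 '{
    chunk  = sprintf("%02d", int((NR - 1) / 20000) + 1)     # which 20000-line chunk this record falls in
    lineno = sprintf("%06d", (NR - 1) % 20000 + 1)          # line number within the chunk (what nl -nrz -w6 did)
    $0 = substr($0, 1, col - 1) chunk substr($0, col + 2)   # overwrite 2 characters with the chunk number
    printf "%s\t%s\n", lineno, $0
}' "$i".file > "$i".file.tmp

The chunk count and the other offsets could be handled the same way; the point is that one read and one write per group replaces thousands of small temporary files.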