KSH script - text file processing NULL issues

I'm trying to strip any garbage that may be at the end of my text file, and that part is working. The problem only shows up on the really long lines in the file. When the head command executes, I direct its output to a new file. The new file always gets a NULL at position 4096, but only on the long lines. The long lines are 4405 characters long, and the rest of each line does make it into the new file, but with a NULL at position 4096. I'm not sure what the issue might be. Any thoughts?

Here's my script:

#! /bin/ksh

# Strip off extraneous characters from text file
# To invoke, type ./s.fixfile <filename> <tempfilename>

if [ $# -ne 2 ]
then
    echo "Usage: $0 filename tempfilename"
    exit 1
elif [ ! -e "$1" ]
then
    echo "$1 does not exist"
    exit 1
elif [ ! -f "$1" ]
then
    echo "$1 is not an ordinary file"
    exit 1
elif [ ! -r "$1" -o ! -w "$1" ]
then
    echo "$1 is not readable or not writable"
    exit 1
fi
integer WCSIZE

WCSIZE=$(wc -l < "$1") # Get line count; redirecting keeps the filename out of wc's output

head -n "$WCSIZE" "$1" > "$2" # Keep only the complete lines; anything after the last newline is dropped

cp "$2" "$1" # Copy temp file back over the original

rm "$2"

Try

getconf LINE_MAX

Most UNIX text utilities will not read a line longer than LINE_MAX. C, perl, python, dd, and some others do not have that problem. I'm guessing your LINE_MAX is 4096.
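
If the long lines have to come through intact, you can swap the head line for a perl one-liner, since perl (named above) reads lines of arbitrary length. A minimal sketch, reusing the $WCSIZE, $1, and $2 from the script above:

# Print the first WCSIZE lines verbatim, then stop.
# perl imposes no LINE_MAX, so the 4405-character lines survive untruncated.
perl -ne 'exit if $. > '"$WCSIZE"'; print' "$1" > "$2"

As with head, any garbage after the final newline never gets printed, because perl sees it as line WCSIZE+1 and exits first.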

Bingo!

getconf LINE_MAX gives 2048 here, and 2048 (2-byte) words = 4096 bytes, exactly where the NULL shows up.

Thanks, that's just the ticket I was hunting for!