/usr/bin/cut not working with largefiles on Solaris 10

I have a user running a Perl script that parses > 2 GB log files and pipes its output to cut -d " " -f 1,6,7,8...
The script itself lives in an NFS-mounted home directory. It runs fine when started from a Solaris 8 box but fails after about 400 lines when started from the Solaris 10 box. The Solaris 8 box is old and slow, so we would prefer to use the Solaris 10 box.
Additional info: if the script is run without the pipe to cut, thousands of lines scroll by. If it is piped to head or wc instead, it works as expected. Only /usr/bin/cut truncates the output. The user has tried various cut options, always with the same result.
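For illustration, the pipelines look roughly like this (parse.pl is a placeholder for the real script name, and the field list is abbreviated):

./parse.pl                                    # no pipe: thousands of lines scroll by
./parse.pl | head                             # works
./parse.pl | wc -l                            # works
./parse.pl | /usr/bin/cut -d " " -f 1,6,7,8   # output stops after about 400 lines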
The man pages indicate that cut should be largefile aware.
Any suggestions/ideas are greatly appreciated.

These must be very long lines if you suspect 400 of them blew a 2 GB limit; that works out to roughly 5 MB per line. More likely, cut simply may not cope with extremely long input lines. What does "fail" mean, exactly? Is there an error message? An exit code? Does "cut -c 2-" also fail? (By the way, post your actual cut command, without the ellipsis.) Send the output of the script into a file, stop it after you have a few hundred lines, then try your cut command on that data file.
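A minimal way to build the test file, assuming your script is named parse.pl (head is safe here, since you said it works):

./parse.pl > datafile               # interrupt with Ctrl-C after a few hundred lines

./parse.pl | head -500 > datafile   # or let head do the truncating

Then run cut on the captured file: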

cut -d " " -f 1,6,7,8... < datafile

Does this fail? After 400 lines? Remove the first 399 lines from the file. Does it now fail at line 2? Isolate that line and post it.
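A sketch of those steps, using hypothetical file names trimmed and suspect (note that Solaris tail takes +N, where GNU tail wants -n +N):

tail +400 datafile > trimmed       # drop the first 399 lines
cut -d " " -f 1,6,7,8 < trimmed    # field list abbreviated; does it die at line 1 or 2 now?

sed -n '400p' datafile > suspect   # pull out the original line 400 on its own
wc -c suspect                      # its length in bytes; POSIX only guarantees text utilities handle lines up to LINE_MAX (often 2048)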