I am reading a file that contains over 5000 lines, and I want to assign it to a shell variable array (which is limited to 1024 rows). My idea was that if I could grab the file in 1000-record chunks and pipe the records out, I could loop until I reached the end, processing 1000 records at a time.
Does anyone know of a unix command that will return a range of lines from a file to standard output?
The command I was trying was:
set -A CustNo `cut -f1-19 -d',' -s RAGEFF.lst|sed 's/,/ /g'|sed 's/\L//g'|nawk '{ if (NF == 19) {print $1}}' `
Never mind; with different search parameters, I found the answer.
Using a combination of head and tail will return any range you want. Sorry to bug the forum; I should have looked harder first.
For example, to get lines 1001-2000 you would issue the following command:
set -A CustNo `cat RAGEFF.lst|head -2000|tail -1000|cut -f1-19 -d',' -s |sed 's/,/ /g'|sed 's/\L//g'|nawk '{ if (NF == 19) {print $1}}' `
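To tie it back to the original goal (walking the whole file 1000 records at a time), the head/tail pair can be wrapped in a loop. This is only a sketch under a few assumptions: POSIX sh arithmetic, the file name RAGEFF.lst from the post, and a chunk size of 1000 to stay under the 1024-row array limit; the sample-file setup at the top is just so the sketch runs stand-alone, and the real per-chunk processing (the cut/sed/nawk pipeline above) would go where the placeholder comment is.

```shell
#!/bin/sh
# Demo data so this sketch is self-contained: 2500 comma-delimited lines.
# In the real case RAGEFF.lst already exists, so this step would be dropped.
file=RAGEFF.lst
awk 'BEGIN { for (i = 1; i <= 2500; i++) print i ",junk" }' > "$file"

chunk=1000
total=$(wc -l < "$file")
start=1
while [ "$start" -le "$total" ]; do
    end=$((start + chunk - 1))
    # head -n $end keeps lines 1..end; tail -n +$start then drops everything
    # before line $start, leaving exactly the range start..end.  The last
    # chunk may be shorter than $chunk, which is fine.
    head -n "$end" "$file" | tail -n "+$start" > chunk.$$
    lines=$(wc -l < chunk.$$)
    echo "processing lines $start-$end ($lines records)"
    # ...per-chunk work here, e.g.:
    # set -A CustNo `cut -f1-19 -d',' -s chunk.$$ | sed 's/,/ /g' | nawk '{ if (NF == 19) {print $1} }'`
    start=$((end + 1))
done
rm -f chunk.$$
```

Note that tail -n "+$start" (skip to an absolute line) is used rather than tail -1000 (keep the last 1000 lines); the two agree on full chunks, but on the final short chunk tail -1000 would reach back into records already processed.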