Hi,
I'm new to shell scripting. I have a requirement where I have to split an incoming file into separate files, each containing a maximum of 3 million rows.
For example, if my incoming file, say In.txt, has 8 million rows, then I need to create 3 files: two containing 3 million rows each and one containing 2 million rows. The number of rows in In.txt will vary.
My Algorithm is as follows:
First, run wc -l on In.txt to get the number of rows.
Divide the count by 3 million and round up to get the number of files to be created (8/3 = 2.66, rounded up to 3).
Then, in a loop, use head and tail to copy each block of rows into a new file.
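The steps above can be sketched roughly as follows. This is just a scaled-down illustration under assumed names (In.txt, Out_N.txt, and a chunk size of 3 rows standing in for 3 million); for the real job, change MAX to 3000000 and point it at the actual file:

```shell
#!/bin/sh
# Scaled-down sketch: an 8-row file split into chunks of at most 3 rows,
# mirroring the 8 mn -> 3/3/2 mn example. MAX and the file names are
# placeholders, not the real values.
seq 1 8 > In.txt            # stand-in for the incoming file

MAX=3                       # change to 3000000 for the real file
total=$(wc -l < In.txt)
# Number of output files, rounded up: (total + MAX - 1) / MAX
files=$(( (total + MAX - 1) / MAX ))

i=1
while [ "$i" -le "$files" ]; do
    start=$(( (i - 1) * MAX + 1 ))
    # tail -n +start skips to the first row of this chunk;
    # head -n MAX then keeps at most MAX rows of it
    tail -n "+$start" In.txt | head -n "$MAX" > "Out_$i.txt"
    i=$(( i + 1 ))
done

wc -l Out_*.txt
```

Note that this produces Out_1.txt and Out_2.txt with 3 rows each and Out_3.txt with the remaining 2. As a side note, coreutils already provides `split -l 3000000 In.txt`, which does the whole split in one command.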
The above algorithm will work, but I think there may be an easier way using awk.
Please let me know if there is a faster and easier way of doing this.
Regards
Wahi