Segment a big file into smaller ones

Greetings to all.

I have a big text file that I would like to segment into many smaller files. Each file should contain a maximum of 10 000 lines.

The file is called time.txt. After the execution I would like to have:

time_01.txt, time_02.txt, ..., time_n.txt

Can anybody help?

Br.

Use the split command in UNIX.

You can read about it using the man command:

bash3.0$ man split

See also csplit.
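
If your split comes from GNU coreutils, the whole job can be done in one command. The long options below are a GNU-only sketch (an older or BSD split typically offers just -l, -a and -d), and time_ is simply the output prefix matching the requested names:

bash3.0$ split -l 10000 --numeric-suffixes=1 -a 2 --additional-suffix=.txt time.txt time_

This writes time_01.txt, time_02.txt, ... with at most 10 000 lines each, assuming fewer than 100 output files (otherwise increase the -a suffix width).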
 
typeset -i total_lines
typeset -i no_of_files
typeset -i file_no
typeset -i start
typeset -i end

# count the lines in the input file
total_lines=$(sed -n '$=' time.txt)

# ceiling division: number of 10 000-line chunks needed
no_of_files=$(( (total_lines + 9999) / 10000 ))

file_no=1
end=0
while [ "$file_no" -le "$no_of_files" ]
do
   start=$(( end + 1 ))
   end=$(( end + 10000 ))
   # copy lines start..end into the next output file
   sed -n "${start},${end}p" time.txt > "time_${file_no}.txt"
   file_no=$(( file_no + 1 ))
done