Environment: Bash on RHEL 8.9
I have a 'master file' like the one below with hundreds of lines. In this file, the file names follow the syntax file<n>.prm.code. The line immediately below each file name (like file1.prm.code) should become the content of that file.
$ cat master_file.txt
file1.prm.code
Hello world
file2.prm.code
Hello Universe
file3.prm.code
Some random content for file3
file4.prm.code
Hello galaxy
So, using the above 'master file', is there a way I can generate files like the ones below?
$ cat file1.prm.code
Hello world
$
$ cat file2.prm.code
Hello Universe
$
$ cat file3.prm.code
Some random content for file3
$
$ cat file4.prm.code
Hello galaxy
More likely than not.
Where exactly are you stuck?
Always a single line to go to the file? It would not be hard to deal with multiple lines for each output file, terminated by either the first blank line, or the next appropriate filename.
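For the layout actually shown in the question (no blank separator lines), one sketch is to key on the file-name pattern itself with awk and switch output files whenever a line matches it. The regex below is an assumption based on the sample data (names of the form file<n>.prm.code); adjust it if your real names differ.

```shell
#!/bin/bash
# Sketch: any line matching file<n>.prm.code starts a new output file;
# every other line is content for the most recent name.
awk '
  /^file[0-9]+\.prm\.code$/ { if (fn != "") close(fn); fn = $0; next }
  fn != "" { print > fn }     # write content line to the current file
' master_file.txt
```

Because the next file name terminates the previous block, this version also copes with multi-line contents and needs no empty lines in the master file.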
The following relies on a closing empty line (and would take multi-line contents).
#!/bin/bash
# Read a line from stdin (redirected); it is the filename
while IFS= read -r fn
do
  # Read further lines and print to stdout (redirected)
  # until the line is empty
  until
    IFS= read -r line; [ -z "$line" ]
  do
    printf "%s\n" "$line"
  done > "$fn"
  # Simply continue; the next line will be a filename
done < master_file.txt
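A usage sketch, under the stated assumption that each content block in master_file.txt is followed by an empty line (the sample data here is made up to match the thread; a scratch directory keeps your real files untouched):

```shell
#!/bin/bash
# Demo of the loop above; it needs an empty line after each block.
cd "$(mktemp -d)" || exit       # scratch directory for the demo files

printf '%s\n' 'file1.prm.code' 'Hello world' '' \
              'file2.prm.code' 'Hello Universe' '' > master_file.txt

while IFS= read -r fn
do
  until
    IFS= read -r line; [ -z "$line" ]
  do
    printf '%s\n' "$line"
  done > "$fn"
done < master_file.txt

cat file1.prm.code              # prints: Hello world
```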
Please explain this:
IFS= read -r line; [ -z "$line" ]
and why the first line itself (i.e. the name of the file to be created) is not added as data to the file.
Thanks
You are an angel, MadeInGermany. Thank you!
I was trying some stuff with grep but had no idea how to loop through it line by line. Plus, I didn't think of the simple solution of output redirection to generate the files. Very clever.
One more question: how does the [ -z "$line" ] test work with the until clause?
[ -z "$line" ] tests if $line is empty.
The loop repeats until $line is empty.
The read command is stuffed before it; it is just executed; the exit status is taken from the [ ] command (the last command before the do).
until is like while not, so you can also write

while
  IFS= read -r line; [ -n "$line" ]
do

to loop while $line is not empty.
The outer loop has the input redirected; everything up to the closing done is redirected.
The first read (the first line) goes to the fn variable. Then the inner loop does the next reads to the line variable, until an empty line has been read.
The file name is not written to its output file because that line is safely held in fn and is not printed at all. Your original example data does not show it in the output either.
Because the redirection is owned by the until loop, you would need to wrap another block around both a single printf "%s\n" "$fn" before the until and the until ... done loop itself, to reflect the fact that you are sending one header line and several content lines to the same redirection.
{
  printf "%s\n" "$fn"
  until IFS= read -r line; [ -z "$line" ]; do
    printf "%s\n" "$line"
  done
} > "$fn"
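A usage sketch of that grouping: the file name line and the content lines now go through one redirection, so each file starts with its own name (sample data and the scratch directory are assumptions for the demo):

```shell
#!/bin/bash
# Demo: the { ...; } group sends the header line AND the content
# lines to the same redirected output file.
cd "$(mktemp -d)" || exit       # scratch directory for the demo files

printf '%s\n' 'file1.prm.code' 'Hello world' '' > master_file.txt

while IFS= read -r fn
do
  {
    printf '%s\n' "$fn"
    until IFS= read -r line; [ -z "$line" ]; do
      printf '%s\n' "$line"
    done
  } > "$fn"
done < master_file.txt

cat file1.prm.code
# prints:
# file1.prm.code
# Hello world
```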
Still, one point is not understood.
The until loop is a different, inner loop and has a different variable (line). When this inner loop runs and advances to the blank line, how is that communicated to the outer loop, so that control of the outer while loop also shifts to the line after this blank line?
This means the outer loop is not executing independently, and the position reached by the inner loop is somehow communicated to the outer parent loop (but how?).
I hope I have communicated my question with clarity.
Thanks.
Each read reads one line (the next line from the stdin stream), regardless of whether the read is located in the outer or the inner loop.
The "communication" is the shared file pointer: stdin is redirected by the outer loop, and everything in it, including the inner loop, gets this redirected stdin (unless it is redirected again).
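The shared file pointer can be seen with two reads inside one redirected block (a minimal illustration, not part of the original script):

```shell
#!/bin/bash
# Both reads consume from the SAME redirected stdin, so the second
# read simply gets the second line -- no other communication needed.
{
  IFS= read -r a   # takes line 1
  IFS= read -r b   # takes line 2
  echo "a=$a b=$b"
} <<'EOF'
first
second
EOF
# prints: a=first b=second
```

The outer while's read and the inner until's reads in the script relate exactly like `a` and `b` here: each read simply advances the one shared stream.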
Thanks, buddy, for increasing my knowledge.
Sincerely,