'Dynamic' setting of variables in bash script

Hi all,

I want to dynamically set variables in a bash script. Here is a naive attempt in a while loop that will hopefully clarify the idea:

n=0; echo "$lst" | while read p; do n=$(($n+1)); p"$n"="$p"; done 

The error message is:

bash: p1=line1: command not found
bash: p2=line2: command not found
bash: p3=line3: command not found
bash: p4=line4: command not found
bash: p5=line5: command not found

I want each line of a list of arbitrary length and content stored in variables, so the content of a specific line can be identified by the variable p[1-5] (in this example).

Any suggestions welcome! If possible, I'd prefer a while loop solution.

Thank you for reading!

Try using eval :

eval p"$n"="$p"
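Applied to the original loop, it could look like this (a sketch with sample data; note that piping into `while` runs the loop in a subshell, so the variables would vanish afterwards, which is why this version reads from a here-string instead):

```shell
# Sample data standing in for $lst from the original post.
lst=$'line1\nline2\nline3'
n=0
while read p; do
    n=$((n+1))
    eval "p$n=\$p"    # eval sees: p1=$p -- an assignment, so no word splitting of $p
done <<< "$lst"       # here-string keeps the loop in the current shell
echo "$p2"            # prints line2
```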

It's not very good practice to set variables like this, though. A much better way is to use arrays:

$ a[1]=test
$ echo ${a[1]}
test

Thank you! eval works well for this particular purpose!

Can you elaborate on why it isn't good practice this way?

I'm not familiar with arrays, so I can't figure out how to implement it on the basis of your example.

You could implement an array like this for example:

n=0; echo "$lst" | while read p; do n=$(($n+1)); a[$n]="$p"; done

Or this:

echo "$lst" | while read p ; do a[${#a[@]}]="$p" ; done
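One caveat with both loops above (worth checking on your system): the pipe runs the `while` in a subshell, so `a` is empty once the loop ends. Feeding the loop from a here-string avoids that; a sketch with sample data:

```shell
# Sample data standing in for $lst.
lst=$'line1\nline2\nline3'
a=()
while read p; do
    a[${#a[@]}]="$p"    # append: the current length is the next free index
done <<< "$lst"          # here-string, so the loop runs in the current shell
echo "${a[1]}"           # prints line2 (indices start at 0)
echo "${#a[@]}"          # prints 3
```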

The use of eval should be limited as much as possible: it can be tricked into executing malicious code, and it can be very confusing...

Thank you for clarifying and for your example!

This is a poor example because, if one of those lines happened to contain `; rm -Rf ~/`, eval would happily execute that and erase your home directory.
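A harmless stand-in makes the injection visible: if the data contains shell metacharacters, eval re-parses and runs them as code.

```shell
p='x; echo INJECTED'   # imagine this line came from untrusted input
n=1
eval p"$n"="$p"        # eval re-parses the result as: p1=x; echo INJECTED
# The echo actually runs -- with `rm -Rf ~/` in its place, that would run too.
```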

We get dozens and dozens of requests for dynamic variable names, but they don't generally make much sense. How would you even use those variables afterwards? If you take a good look at what you're trying to do, there's usually ways to do it in a far more straightforward way.

One way is arrays. Another way is just not storing the data at all -- you don't have to keep absolutely everything stored in memory at all times. If you only intend to use the lines one at a time, there's no point -- just use them as you read them, then throw them away.

I agree with the use of eval, but I'm confused with your response to my example where I'm suggesting arrays instead of eval. In what way is this a poor example?

It's a good example of arrays. Whether arrays are actually needed here is the question.

Novice programmers tend to store entire documents:

# Store every line
L=0
while read LINE
do
        ARR[L]="$LINE"
        let L=L+1
done < inputfile

# Modify and print each line
for((M=0; M<$L; M++))
do
        V="${ARR[M]/A/B}"
        echo "$V"
done > outputfile

...when there was no need:

# Read, modify, and print
while read LINE
do
        echo "${LINE/A/B}"
done < inputfile > outputfile

For instance, in classic batch processing, you sort all the input files so you can evaluate the current lines and then discard one or the other. In the old days, machines were sold with as little as 2K of memory, so programs had to line up their ducks before they shot them. Sometimes there were multiple sorts, to join input transactions to many differently keyed masters. The merge process may also produce output transactions that need sorting before going to the next, differently keyed master.

Note that shell arrays come in two flavors: the simple integer-indexed kind, and the string-indexed, hash-map-style associative kind. Sometimes the total environment is system-limited to about a megabyte, so robust solutions do not put into 'memory' any data set that can be unpredictably large.
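In bash the two flavors look like this (the associative kind needs bash 4+ and an explicit `declare -A`; the names here are just for illustration):

```shell
# Integer-indexed: no declaration needed
arr=(alpha beta gamma)
echo "${arr[1]}"            # prints beta

# String-indexed (associative): must be declared first
declare -A map
map[user]=alice
map["full name"]="Alice Liddell"
echo "${map[user]}"         # prints alice
echo "${#map[@]}"           # prints 2 (number of keys)
```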

$ (s=1 x=x
while [ 1 ]
do
  echo $s >/dev/tty
  (( s += s ))
 x="$x$x"
done)
1
2
4
8
16
32
64
128
256
512
1024
2048
4096
8192
16384
32768
65536
131072
262144
524288
1048576
2097152
4194304
8388608
16777216
33554432
67108864
134217728
ksh[6]: no space
$

Somewhere around 100M here.
