Grep Command

I have a text file that contains values like <Q1:Name>1000000</Q1:Name>. I want to read the values into an array and then search for each value in the files under a location.
If I use this

find location |xargs grep -l "<Q1:Location>100000055042</Q1:Location>"

I get the files that contain that value, but if I store the values into an array and use the same command

find location |xargs grep -l "${array[$i]}"

it can't find the files, and it shows find: bad status-- location
Can you please tell me what the problem is?

Which shell are you using? sh, csh, bash?

bash shell.

Can you paste the relevant part of the script, i.e. the array population, the actual file contents being used, etc.?
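
Also, echo can hide stray characters such as a carriage return picked up from a Windows-edited file. Printing the element with bash's printf and the %q format makes them visible, e.g. (the array name here is only an example):

printf '%q\n' "${array[$i]}"

If that prints something ending in $'\r', the pattern grep receives is not the one you see on screen.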

--ahamed

How are you doing this?

What command(s) are you using to do this?

Please find the file and script attached. I want to delete the files that have the matching values from that location.

You may want to paste the commands into a message, as well as a small section of the data.
Many, myself included, do not open zipped files.

The file contains values like this:

LOC
<Q1:Location>100000035647</Q1:Location>
<Q1:Location>012010087120</Q1:Location>
<Q1:Location>100000057069</Q1:Location>
<Q1:Location>9500023238</Q1:Location>
<Q1:Location>9001013233</Q1:Location>
<Q1:Location>9001017224</Q1:Location>
<Q1:Location>100000055042</Q1:Location>
<Q1:Location>100000193712</Q1:Location>

Code in the script:

index=0
while read LOC
do
    # skip the header line
    [ "$LOC" == "LOC" ] && continue
    # store the value and echo it for tracing
    Service[$index]=$LOC
    echo ${Service[$index]}
    # remove every file under location that contains the value
    find location | xargs grep -l "${Service[$index]}" | xargs rm -r
    index=`expr $index + 1`
done < location/Target.txt
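
For comparison, here is a minimal sketch of the same loop with the value trimmed of any trailing carriage return and the expansion quoted; the location path, the Service array name, and Target.txt are taken from the snippet above, and whether the trimming matters depends on how Target.txt was produced:

index=0
while read -r LOC
do
    LOC=${LOC%$'\r'}                  # drop a trailing carriage return, if any
    [ "$LOC" = "LOC" ] && continue    # skip the header line
    Service[$index]=$LOC
    echo "${Service[$index]}"
    # delete every regular file under location that contains the value
    find location -type f | xargs grep -l "${Service[$index]}" | xargs rm -f
    index=$((index + 1))
done < location/Target.txt

Using -type f limits the deletion to regular files, and rm -f keeps the last xargs step quiet on iterations where grep matches nothing.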