Creating an array that stores files to be called on.

If the user wants to pick multiple files of their choosing, from different directories, and then have all of those files placed into one command (e.g. chmod * * file1 file2), how would you go about that?

I was thinking of starting with something like this, and then wondering whether I would loop it.

ARRAY=($(find . -name file))
for file in "${ARRAY[@]}"; do echo "$file"; done

This is going to be used so that the user can pass those files to a command, for example chmod or rm.
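
For example (a sketch; the u+x mode is just a placeholder), once the array holds the right names, the whole list can be handed to one command in a single call:

chmod u+x "${ARRAY[@]}"

Quoting the expansion keeps each element intact as a single argument, though that only helps if the elements were stored correctly in the first place.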

What keeps you from trying it (mayhap with the -x (--xtrace) option set)? Commands accepting multiple arguments don't even need the for loop or the array. Try

chmod ... $(find ...)
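
For instance (a sketch with made-up paths and mode; set -x makes the shell print each command after expansion, so you can see exactly which files the substitution produced):

set -x
chmod u+x $(find ./scripts -name '*.sh')
set +x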
Be aware that running a command blindfolded, with a resulting argument list that you don't know upfront, is dangerous; applied to the wrong directories with the wrong options it might render your system unusable!
And files with whitespace characters in their names might irritate / confuse the command.
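
A cautious habit (again a sketch with hypothetical paths) is to preview the argument list before running the real command:

find ./scripts -name '*.sh'                      # inspect the matches first
echo chmod u+x $(find ./scripts -name '*.sh')    # dry run: prints the command instead of running it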

Thanks for the reply.

What keeps me from trying it is that the user needs to be able to pick multiple files, from different directories and with different file names. What ended up happening when I used

find

is that if more than one file has that name, all of those files get taken. I am also having difficulties setting it up, since I am still learning bash.

The problem is also that the result of $(find . -name file) will be field-split by the shell, governed by the content of IFS, which by default is space, TAB and newline; this means it will not work for filenames that contain any of those characters.
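
A quick demonstration with a hypothetical file name containing a space:

touch 'my file.txt'
ARRAY=($(find . -name 'my file.txt'))
echo "${#ARRAY[@]}"    # prints 2: './my' and 'file.txt' became separate elements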

IMO a better approach would be to use find's -exec action, which does not have this problem.
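
Something like this (the pattern and mode are placeholders; '{} +' passes as many file names as fit onto one command line, while '{} \;' would run the command once per file):

find . -name '*.sh' -exec chmod u+x {} +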

Or, alternatively, use a while loop:

find ..... |
while IFS= read -r file    # -r keeps backslashes literal; IFS= preserves leading/trailing blanks
do
  echo "Do something with ${file}"
done

Though the latter construct would still fail with files that contain newline characters.
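
If your find supports -print0 (GNU and BSD versions do), delimiting the names with NUL bytes and reading them with bash's read -d '' sidesteps even that (a sketch):

find . -name '*.sh' -print0 |
while IFS= read -r -d '' file
do
  echo "Do something with ${file}"
done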