Execute script in series of folders

I'm trying to get a script to iterate over an array in bash and run a series of commands in those folders.

#Folder names
folders=(folder1 folder2 folder3)

#Common path for folders
fold_path="/Users/Rich/Projects/"

# For each element array copy script to folder and execute
for f in "${folders[@]}"
do
    cp $desktop_script $fold_path/$f
    chmod +x $fold_path/$f/$desktop_script
    exec $fold_path/$f/$desktop_script
done

This part of the code seems to be working OK: the script is being copied across and made executable. I think the issue is with the secondary script.

This script gets the name of the directory it is in and adds that name to the commands it executes, so each command is specific to the folder it runs in:

path=${PWD##*/}
rm -rf .git
git remote rm origin
git remote add origin https://github.com/myusername/"${PWD##*/}".git
git push --set-upstream origin master
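For what it's worth, the `${PWD##*/}` expansion does what's intended here: it removes the longest prefix matching `*/`, leaving only the last path component. A minimal check (the `/tmp/pwd_demo` path is just an illustrative scratch directory, not part of the original setup):

```shell
# ${PWD##*/} strips everything up to and including the last slash,
# leaving just the current folder's name.
mkdir -p /tmp/pwd_demo/folder1
cd /tmp/pwd_demo/folder1
echo "${PWD##*/}"    # prints: folder1
```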

The error I keep getting shows the folder name, which is being evaluated fine, treated as a command:

./gitter.sh: line 2: folder1: command not found

That for-loop will never loop. The exec will replace the shell with its command argument before the first iteration completes.
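You can see the effect in a throwaway shell (run under `bash -c` so your interactive shell isn't the one being replaced):

```shell
# exec replaces the shell process with its command, so this loop
# never reaches a second iteration: only "iteration 1" is printed.
bash -c 'for i in 1 2 3; do exec echo "iteration $i"; done'
# prints: iteration 1
```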

Do you actually need a copy of the script in each target directory? Why not simply cd in a subshell?

for f in "${folders[@]}"
do
    (cd "$fold_path/$f"; "$desktop_script")
done

If it matters, the subshell isn't strictly required here, since $fold_path is an absolute path; the parentheses just keep each cd from affecting the rest of the loop.
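A quick sketch of why the parentheses help (again using a scratch `/tmp` directory purely for illustration):

```shell
# The parentheses run cd in a subshell, so the parent shell's
# working directory is unchanged after the command finishes.
before=$(pwd)
mkdir -p /tmp/subshell_demo
(cd /tmp/subshell_demo && pwd)   # the subshell is inside the target dir
pwd                              # the outer shell is still where it started
```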

Regards,
Alister

Yeah, I had tried the cd command before, but it kept saying that the directory didn't exist; I think that was due to the working directory changing between iterations(?).
Tried it with the brackets and it worked.
Cheers dude