How to store files names from a directory to an array

Hi
I want to store the file names from a directory into an array.
I have written the following, but I am getting an error.

declare -A arr_Filenames
ls -l  *.log | set -A arr_Filenames $(awk '{print $9}')
index=0
while (( $index < ${#arr_Filenames[*]} )); do
          Current_Filename=${arr_Filenames[index]}
          Log "index $index curr file name " ${Current_Filename}
          
          let  index="index + 1"
done

But I am getting error
declare: log_purge.sh 29: Unknown option "-a"

What shell are you using?

This construct is fishy:

ls -l  *.log | set -A arr_Filenames $(awk '{print $9}')

That awk statement is out of place -- did you want to extract the 9th column from the ls output? If so, why use the -l option at all?
Try this:

arr_Filenames=(`ls *.log`)

or, if you really want to filter the output of ls command:

arr_Filenames=(`ls -l *.log | awk 'someAwkCommandHere'`)

The var=() construct will make 'var' an array. (In Bash, at least)
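For instance, a minimal sketch of that construct (the directory and file names here are hypothetical):

```shell
#!/bin/bash
# Sketch: fill an array straight from a glob -- no ls needed, and
# filenames containing spaces survive intact.
cd "$(mktemp -d)" || exit 1
touch a.log b.log "c d.log"        # hypothetical sample files

arr_Filenames=(*.log)              # var=() makes 'var' an array (Bash)

echo "count=${#arr_Filenames[@]}"
for f in "${arr_Filenames[@]}"; do
    echo "file=$f"
done
```

Note that letting the shell expand the glob directly avoids parsing ls output at all.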

You can also assign into array in a following way:

$ ls -l  *log 
-rw-rw-r-- 1 user group  23230 May 17 10:19 Changelog
-rw-rw-r-- 1 user group 145060 May 17 12:01 config.log
$ ls -l  *log | while read -a arr ; do 
  echo "perms=${arr[0]} ; size=${arr[4]}" ;
done
perms=-rw-rw-r-- ; size=23230
perms=-rw-rw-r-- ; size=145060

This reads each line of the piped-in input into an array (the -a switch of the built-in read).

I am using MKS Toolkit to execute the shell script.

ls -l  *.log | while read -a arr ; do 
  Current_Filename=${arr[8]} ;
  Log "Current_Filename $Current_Filename"
done 

Error Unknown option "-a"

arr_Filenames=(`ls *.log`)

Error 43: syntax error: got (, expecting Newline

I am trying to list all the files in the directory and delete those that are two months old. My file names have the form f1_YYYY-MM-DD.log.

Thanks and Regards
Digambar

OK. From what I understand, MKS toolkit supports different shells. What is the first line of your script? (starting with #!)

This should work:

find . -type f -mtime +60 -exec ls -l {} \;

will execute 'ls -l <filename>' on all files inside the current directory and all subdirectories that were modified more than 60 days ago. Look at 'man find' to check the '-mtime' option.
If you know the name format of your log files, you should specify that also to filter out old files that you might want to keep.
The following will remove all files named f1_YYYY-MM-DD.log inside the current dir (and any level deep), that were modified at least 60 days ago. You should test this without the '-exec rm {} \;' part first, to list the files matched by find; then when you are sure, run the whole thing to delete them.

find . -name "f1_[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9].log" -type f -mtime +60 -exec rm {} \;
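Since the date is encoded in the file name itself, you could also delete by the name rather than by mtime. A sketch of that idea (not from the thread; it assumes GNU date's -d option is available, and the sample files are hypothetical):

```shell
#!/bin/bash
# Sketch: delete logs whose NAME-encoded date (f1_YYYY-MM-DD.log) is
# older than 60 days, independent of the file's modification time.
cd "$(mktemp -d)" || exit 1
touch f1_2001-01-01.log "f1_$(date +%F).log"    # one old, one current file

cutoff=$(date -d '60 days ago' +%F)             # YYYY-MM-DD, 60 days back
for f in f1_[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9].log; do
    d=${f#f1_}; d=${d%.log}                     # strip prefix/suffix -> date
    if [[ $d < $cutoff ]]; then                 # ISO dates compare lexically
        echo "would delete: $f"                 # swap echo for rm when sure
    fi
done
```

As with the find version, leave the echo in place until you are sure the match is right.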

I haven't used the MKS toolkit, but here's an interesting test that might shed light:

#!/bin/sh
DIR=/tmp/test
rm -rf $DIR;  mkdir -p $DIR;  cd $DIR  ||  exit 1
touch "ab cd" "ef gh" "ij kl"

a=()
for f in *; do
    a[${#a[@]}]="$f"
done

for f in "${a[@]}"; do
    echo "$f"
done

These are the results I get when I run it:

ab cd
ef gh
ij kl

Yes, this is working.

Here is the code I used:

cd $LOG_DIR
cnt=0
for f in *
do
    let cnt="cnt + 1"
    filename=$f
    echo "$filename"
done
echo "total count $cnt"
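For what it's worth, the same glob loop can also fill an array, as suggested earlier in the thread (a sketch, assuming your MKS shell supports Bash-style arrays, which the earlier a=() test suggests; the files are hypothetical):

```shell
#!/bin/bash
# Sketch: collect the glob results into an array, then report the count.
cd "$(mktemp -d)" || exit 1
touch one.log "two three.log"      # hypothetical files, one with a space

files=()
for f in *; do
    files[${#files[@]}]=$f         # append: index = current array length
done

echo "total count ${#files[@]}"
for f in "${files[@]}"; do
    echo "$f"
done
```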