Find files based on date from a filelist

Hi All,
I have a filelist which contains "ls" output, and I want to extract the filenames in this file based on date. I know I can run a loop and extract the files, but I want to avoid that. There is also the find command, which can be used on a directory, i.e.:

find . -type f -newermt "2019-09-25" -printf "%f\n"

But is there any way I can use find on the filelist instead of a directory and get the file names? TIA

The file list contains only filenames, i.e., plain text; none of the filesystem metadata for the files named in it travels with it. You would have to run the stat command on each filename in the text file, and you will have to use epoch times (seconds since Jan 1 1970).

when=1569540363
while IFS= read -r filename
do
  # GNU stat: %Y = mtime in epoch seconds
  [ "$(stat --format=%Y "$filename")" -gt "$when" ] && echo "$filename"
done < textfile
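For example, here is a self-contained run of that loop; the file names and timestamps are made up for the demo, and it assumes GNU touch and stat (which accept "@epoch" dates and the %Y format):

```shell
workdir=$(mktemp -d)
cd "$workdir"

# Made-up sample files whose mtimes straddle the cutoff ("@N" = epoch seconds)
touch -d "@1569000000" old.txt
touch -d "@1570000000" new.txt
printf '%s\n' old.txt new.txt > textfile

when=1569540363
result=$(while IFS= read -r filename
do
  # %Y: mtime in epoch seconds (GNU stat)
  [ "$(stat --format=%Y "$filename")" -gt "$when" ] && echo "$filename"
done < textfile)
echo "$result"
```

Only new.txt survives the filter, since its mtime is above the cutoff.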

It is easier with a reference file whose mtime has been set to the cutoff time with the touch command; the shell's test command then does file time comparisons directly (-ot, -nt):

[ "$filename" -nt "$reference_file" ] && echo "$filename"
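A sketch of that reference-file variant, again with made-up file names and timestamps (assumes GNU touch for the "@epoch" date syntax):

```shell
workdir=$(mktemp -d)
cd "$workdir"

touch -d "@1569000000" old.txt
touch -d "@1570000000" new.txt
printf '%s\n' old.txt new.txt > textfile

# Reference file carrying the cutoff mtime
touch -d "@1569540363" reference

result=$(while IFS= read -r filename
do
  # -nt: true when filename's mtime is newer than reference's
  [ "$filename" -nt reference ] && echo "$filename"
done < textfile)
echo "$result"
```

No stat calls are needed here; the test built into the shell compares the mtimes itself.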

Can you create the text file list first with both a filename and a filetime (in epoch seconds) for each line in the text file? And then read that?
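One way to build such a list is sketched below with GNU find's -printf (%T@ prints the mtime in epoch seconds); all names and dates are illustrative. Once the times are in the list, the filter is a single awk pass with no per-file stat calls:

```shell
workdir=$(mktemp -d)
cd "$workdir"

# Made-up files in a subdirectory, so the list file itself is not picked up
mkdir data
touch -d "@1569000000" data/a.txt
touch -d "@1570000000" data/b.txt

# One "epoch-mtime filename" pair per line (GNU find)
find data -type f -printf '%T@ %f\n' > timed_list

when=1569540363
result=$(awk -v when="$when" '$1 > when {print $2}' timed_list)
echo "$result"
```

Note the %f filename field breaks if names contain spaces; for this kind of listing-based filtering that limitation usually has to be accepted.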

Thanks for the valuable inputs. My actual requirement is to FTP files from a remote server based on date. For example, if I run my script today, I save today's date in date_file; the next time the script runs, only the files newer than the date recorded in date_file should be FTPed.

I know there are limitations in FTP and I cannot do this directly, so as a workaround I thought I would list all the file names on the FTP server into a local file, then pick out the required files (those newer than the date in date_file) from that local file, and fetch them with mget. I will try your suggestion.

Is there any way I can send date information to the FTP server and get only those files newer than or equal to that date?
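If switching clients is an option, the lftp client can do this date filtering itself; a sketch, with placeholder host, credentials, and paths:

```shell
# lftp (not plain ftp): mirror --newer-than transfers only files
# modified after the given date. Host, user, and paths are placeholders.
lftp -u user,password ftp.example.com -e '
  mirror --newer-than="2019-09-25" /remote/dir /local/dir
  quit
'
```

With plain ftp/mget there is no such server-side filter, which is why the local-listing workaround below is needed.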

As I couldn't find a way to get the files by date directly from the FTP server, I will list all the file names locally, create the filelist, and then extract the file names newer than the given date:

Date="20190926"
Filelist=FTP_DTL.txt

awk -v cutoff="$(date -d "$Date" +%s)" '{
  # Convert the ls date fields (columns 6-8) to epoch seconds via GNU date
  cmd = "date -d \"" $6 " " $7 " " $8 "\" +%s"
  cmd | getline fdate
  close(cmd)
  if (fdate >= cutoff) print $9
}' "$Filelist" > Latest_File.txt

The above command gives me the file names that are newer than or equal to the date variable and stores them in Latest_File.txt. Then I will run a for loop over Latest_File.txt and fetch the relevant files from the FTP server. I cannot think of anything else and am not sure about the performance of this. Kindly share your views on it, or any other approach.
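To sanity-check this kind of awk filter, here is a self-contained run on a fabricated listing (the file names, dates, and listing lines are made up; assumes GNU date). Note the getline target variable and the close(cmd): reading into fdate avoids clobbering $0, and without close() awk keeps one pipe open per distinct date string:

```shell
workdir=$(mktemp -d)
cd "$workdir"

# Fabricated "ls -l"-style listing; columns 6-8 hold the date, column 9 the name
cat > FTP_DTL.txt <<'EOF'
-rw-r--r-- 1 user group 100 Sep 20 2019 old.dat
-rw-r--r-- 1 user group 100 Sep 30 2019 new.dat
EOF

Date="20190926"
cutoff=$(date -d "$Date" +%s)

awk -v cutoff="$cutoff" '{
  # Ask GNU date to convert "Sep 20 2019" etc. to epoch seconds
  cmd = "date -d \"" $6 " " $7 " " $8 "\" +%s"
  cmd | getline fdate
  close(cmd)
  if (fdate >= cutoff) print $9
}' FTP_DTL.txt > Latest_File.txt

result=$(cat Latest_File.txt)
echo "$result"
```

The main performance cost is spawning one date process per line; for listings of a few thousand files that is usually tolerable, and GNU awk's built-in mktime() could remove it entirely if needed.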

Thank You
