ls: ignoring files based on their modification time

Hi everyone,

I'd like to know if there is a way to list files while ignoring some of them according to their modification time (or creation time, access time, etc.) using the 'ls' command alone.

I know the -I option exists, but it seems to match against the file name only.

Thank you in advance for the answers.

P.S.: Ftm, I'm piping the output of 'ls' to awk (ofc)... maybe there is a more elegant way to do this.

Hi,

Try this out,

#! /usr/bin/bash
for file in *;
do
   if [ -d "${file}" ];
   then
      echo "${file} is a Directory"
   elif [ -s "${file}" ];
   then
     awk 'END{print FILENAME;}' ${file}
   else
      echo "${file} is unknown file type"
   fi
done

I think the loop around awk, iterating over *, is the best approach (it sidesteps the issues ls has with spaces in file names).
I just wrote a simple example; please modify the code based on your requirements.

Cheers,
Ranga:)

find could be a better choice for you.
Look at the -newer, -mtime, -atime and -ctime options in its man page.
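For example, a minimal sketch of that approach, assuming GNU find (the 31-day cutoff and the starting directory are just illustrations):

```shell
# Print regular files under the current directory modified within
# the last 31 days; "-mtime -31" means "fewer than 31 24-hour
# periods since the last modification".
find . -type f -mtime -31 -print
```

Replace `.` with the directory you actually want to scan; `-newer somefile` works the same way but compares against another file's timestamp instead of a fixed age.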

Yes, I thought about find; the problem is that I have something like 1000 folders, and in each there are something like 200 files... and I'm only looking for one.

I tried to optimize find, but it's way, way slower than ls.

I doubt it!

Please post what Operating System and version you are running and what Shell you are using. There is much variation in the ls command and as much variation in the find command.

Please state what your criteria are for selecting files.

(Please avoid using textspeak or obscure abbreviations when posting. Ftm? ofc?).

@rangartasan
Not quite sure how the script addresses the original post. The script posted contains a couple of bugs and some excess semicolons. Notably, it did not deal with filenames containing spaces (the unquoted ${file} passed to awk) or with zero-length files (-s tests for a size greater than zero, so empty files fell through to "unknown file type").

#! /usr/bin/bash
for file in *
do
   if [ -d "${file}" ]
   then
      echo "${file} is a Directory"
   elif [ -f "${file}" ]
   then
     awk 'END{print FILENAME;}' "${file}"
   else
      echo "${file} is unknown file type"
   fi
done
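The loop above classifies files but does not actually filter on modification time, which was the original question. One way to bolt that on in the same style — a sketch only, assuming GNU touch for the "-d '31 days ago'" syntax, and a shell whose test command supports -nt ("newer than"), which bash and pdksh both do:

```shell
#!/bin/bash
# Build a reference file whose mtime is 31 days in the past,
# then keep only the regular files that are newer than it.
ref=$(mktemp)
touch -d "31 days ago" "$ref"    # GNU touch date syntax

for file in *
do
   if [ -f "$file" ] && [ "$file" -nt "$ref" ]
   then
      echo "$file was modified within the last 31 days"
   fi
done

rm -f "$ref"
```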

How would find, checking every single file, be slower than ls checking every single file?

Shell : Ksh (KSH_VERSION='@(#)PD KSH v5.2.14 99/07/13.2')
O.S : Linux (2.6)
Find version : GNU find version 4.2.27
ls version : ls (GNU coreutils) 5.97

Structure : I have 1000 folders (with small variation, for example folder1 - folder1000)
in each folder I have another folder (out)
in "out" I have something like 200 or more files (every folder has the same files).

I want to select "myFile" only if its modification time is less than 1 month old, then concatenate all these files.

Find solution

find / -maxdepth 3 -regex "/folder[0-1000]/out/myfile" -ctime -31 -execdir cat {} \;

ls solution (1 month = 31*24*3600 = 2,678,400 seconds)

ls -l --time-style=+%s /folder[0-1000]/out/myfile | awk -v current=`date "+%s"` -v threshold=2678400 '{ if ((current - $6) < threshold) { print $7 } }' | xargs cat

Use -mtime in find (not -ctime ).
Your find is using parameters not seen in standard Unix. Perhaps someone with a similar O/S can check them?
The find posted also appears to search from the root rather than from a sensible starting point, which could take ages to run. What are you actually trying to do, and what is the top-level directory structure?
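Putting those points together, a corrected sketch might look like the following. The miniature demo tree is built here only so the command is runnable as-is; in practice you would point find straight at the real top-level directory (which only the original poster knows):

```shell
# Hypothetical miniature of the layout described in the thread:
# folderN/out/myfile, one recent copy and one old copy.
top=$(mktemp -d)
mkdir -p "$top/folder1/out" "$top/folder2/out"
echo "recent" > "$top/folder1/out/myfile"
echo "old"    > "$top/folder2/out/myfile"
touch -d "40 days ago" "$top/folder2/out/myfile"   # GNU touch

# Start from the known tree (not /), use a shell glob instead of
# -regex, use -mtime instead of -ctime, and batch the cat with `+`
# so it runs once over many files rather than once per file.
find "$top"/folder*/out -maxdepth 1 -name myfile -mtime -31 -exec cat {} +
# prints: recent
```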