We have a huge directory containing 5.1 million files. For a migration project, we are trying to get the file name and modified timestamp of every file modified in the most recent 3 years from this directory.
However, the ls command (run as a background process) to list the file names and timestamps ran for more than 24 hours and then aborted.
Is there an efficient way to get the file names and modified timestamps from such a huge directory in a relatively quick time?
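One common approach is GNU `find`, which streams results as it reads the directory instead of stat-ing and sorting everything up front the way `ls -l` does. A minimal sketch (the path `/path/to/hugedir` is a placeholder for your directory; `-newermt` and the `date -d` relative-date syntax assume GNU findutils and GNU coreutils):

```shell
# Cutoff date 3 years back, formatted YYYY-MM-DD (GNU date).
cutoff=$(date -d '3 years ago' +%Y-%m-%d)

# -maxdepth 1: only the directory itself, no recursion.
# -newermt:    modification time newer than the cutoff date.
# -printf:     emit name and mtime only, one file per line,
#              without a full `stat` display per file.
find /path/to/hugedir -maxdepth 1 -type f -newermt "$cutoff" \
    -printf '%f\t%TY-%Tm-%Td %TH:%TM:%TS\n' > recent_files.tsv
```

Redirecting to a file as shown avoids holding 5.1 million entries in a pipeline; you can then sort or filter the TSV separately.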
Other information, such as file size and permissions, is also available through find's -printf option.
As far as feedback goes: if you've been managing this huge directory on a system for a while, you probably realize it's not a good idea. Many simple commands, shell auto-completion, etc. will take a long time and often require processes to be killed. Since you're migrating anyway, I would consider a subfolder layout; this could be derived from the filename, created date, etc.
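As one possible sketch of that layout, files could be sharded into `YYYY/MM` subdirectories derived from each file's modification time during the migration, so no single directory grows unbounded. The paths here are hypothetical, and `-printf` again assumes GNU findutils:

```shell
src=/path/to/hugedir      # hypothetical source directory
dst=/path/to/migrated     # hypothetical migration target

# Emit "YYYY/MM<TAB>full-path" per file, then move each file
# into a per-month bucket under the target directory.
find "$src" -maxdepth 1 -type f -printf '%TY/%Tm\t%p\n' |
while IFS=$'\t' read -r bucket path; do
    mkdir -p "$dst/$bucket"
    mv "$path" "$dst/$bucket/"
done
```

A per-month bucket keeps each subdirectory to a manageable size even at millions of files per year; a hash of the filename would work just as well if access patterns are not time-based.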