Script to Output Files That Have Got Stuck

Hi,

I need to create a script that we will schedule to run, say, every 30 minutes, to check a directory for files that have been present for longer than a set period of time.

The situation is that we have an input folder from which files are processed; once processed, they get moved to a done folder. Sometimes, however, this move doesn't take place (unsure why as yet) and the file remains in the input folder even though it has been fully processed.

Any ideas?

Cheers

Try this; change the number of seconds to whatever you want. Internally, file times are kept as seconds since Jan 1, 1970 (the Unix epoch), so the age of a file is just the current time minus its modification time.

#!/bin/ksh
#  This works on any Unix that has Perl installed.

# Print the age of a file in seconds (current time minus mtime).
filetime()
{
perl -e '
      $mtime = (stat("$ARGV[0]"))[9];   # element 9 of stat() is the mtime
      $diff = time - $mtime;
      printf ("%d", $diff); ' "$1"
}


TIME_ALLOWED=1200   # 1200 seconds = 20 minutes
# cd /path/to/files
for file in *.DAT   # assuming the files have a special format
do
   [ -f "$file" ] || continue          # skip the literal pattern if nothing matched
   secs=$(filetime "$file")
   if [ "$secs" -gt "$TIME_ALLOWED" ] ; then
      echo "$file is $secs seconds old"
   fi
done
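
If your find supports -mmin (GNU and BSD find both do, though it isn't in strict POSIX), you can skip the Perl helper entirely. A rough equivalent of the above, with /path/to/files standing in for your input folder:

#!/bin/ksh
# List *.DAT files in the input folder last modified more than 20 minutes ago.
# -maxdepth 1 keeps it from descending into subdirectories such as the done folder.
find /path/to/files -maxdepth 1 -name '*.DAT' -mmin +20 -print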
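
To run it every 30 minutes, a crontab entry along these lines should do; the */30 step syntax works with Vixie cron and most modern crons, and the script name/path here is just an example:

# min  hour  dom  mon  dow  command
*/30   *     *    *    *    /path/to/check_stuck.sh >> /var/log/check_stuck.log 2>&1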