Regarding deletion of old files

Hi,

I have a list of directories containing old files that need to be deleted.

I put the list of directories in a txt file, which is read by a script.

I search for files older than 60 days using mtime and then delete them.

But all the files are getting deleted irrespective of mtime. I am providing the script code here:

#!/bin/sh

# Setting the date (command substitution with backquotes, not single quotes;
# note this variable is never actually used below)
DATE=`date +"%y%m%d"`

cat param.txt |
while read DIR TIME
do
    # Check if the directory is present
    if [ -d "${DIR}" ]
    then
        echo "$DIR is a directory on this machine"
        echo "Time limit: $TIME"
        echo "Looking for files older than $TIME days to be deleted from $DIR"
        find "$DIR" -type f -mtime "$TIME" | xargs rm -f
        echo "Older files are now deleted successfully"
    else
        echo "$DIR is not a directory. Please check again."
    fi
done

Is there anything wrong with the find command?

My data file looks like this:
/var/home/.... +60
/home/chidvilas/.. +60

But the script is removing all files irrespective of mtime. Can anyone suggest what is going wrong?
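For what it's worth, the numeric argument to -mtime has three distinct forms, and mixing them up is a common source of surprises. Here is a quick sandbox check of the three forms (a sketch, not your script; the file names are made up, and GNU coreutils is assumed for touch -d):

```shell
#!/bin/sh
# Sandbox demo of find's -mtime argument forms (GNU coreutils assumed).
tmp=$(mktemp -d)
touch -d "100 days ago" "$tmp/old.log"   # mtime 100 days in the past
touch "$tmp/new.log"                     # mtime is now

# +60 : modified MORE than 60 days ago (what the cleanup script wants)
find "$tmp" -type f -mtime +60           # prints only old.log

# -60 : modified LESS than 60 days ago
find "$tmp" -type f -mtime -60           # prints only new.log

# 60  : modified exactly 60 days ago (rounded to whole days)
find "$tmp" -type f -mtime 60            # matches neither file here

rm -rf "$tmp"
```

If the TIME field read from param.txt ever comes through without the leading +, or empty, find will behave very differently from what you expect, so it is worth echoing exactly what was read.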

It works fine for me - ran on Fedora.

FIRST RUN - ONLY LISTING THE FILES (changed "| xargs rm -f" to just "-ls")
[tghunter@localhost ~]$ ./oldfiles
. is a directory on this machine
Time Limit:+1900
looking for files older than +1900 days to be deleted from .
387600    8 -rw-r--r--   1 tghunter tghunter     2048 Sep 19  2000 ./.openoffice.org2.0/user/gallery/sg100.sdv
older files are now deleted successfully

[tghunter@localhost ~]$ cp ./.openoffice.org2.0/user/gallery/sg100.sdv ./.openoffice.org2.0/user/gallery/sg100.sdvhog
[tghunter@localhost ~]$ ./oldfiles
. is a directory on this machine
Time Limit:+1900
looking for files older than +1900 days to be deleted from .
387600    8 -rw-r--r--   1 tghunter tghunter     2048 Sep 19  2000 ./.openoffice.org2.0/user/gallery/sg100.sdv
older files are now deleted successfully
CHANGED TO REMOVE FILES
[tghunter@localhost ~]$ vi oldfiles
[tghunter@localhost ~]$ ./oldfiles
. is a directory on this machine
Time Limit:+1900
looking for files older than +1900 days to be deleted from .
older files are now deleted successfully
[tghunter@localhost ~]$ ls ./.openoffice.org2.0/user/gallery/sg100.sdv
ls: ./.openoffice.org2.0/user/gallery/sg100.sdv: No such file or directory
[tghunter@localhost ~]$ ls ./.openoffice.org2.0/user/gallery/sg100.sdvhog
./.openoffice.org2.0/user/gallery/sg100.sdvhog
[tghunter@localhost ~]$ mv ./.openoffice.org2.0/user/gallery/sg100.sdvhog ./.openoffice.org2.0/user/gallery/sg100.sdv
[tghunter@localhost ~]$ cat param.txt
.    +1900

Check that your param.txt file is set up correctly. Then test your script by listing the files instead of removing them, to ensure it matches only the files you really want to remove: find $DIR -type f -mtime $TIME -ls
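Building on that advice, here is a slightly hardened sketch of the same loop. The guard on the TIME field and the -print0/xargs -0 pairing (for filenames containing spaces) are my additions, and GNU find/xargs are assumed; the sandbox setup at the top is only there to make the example self-contained:

```shell
#!/bin/sh
# Hardened sketch of the cleanup loop (GNU find/xargs assumed).

# --- demo sandbox so this example is self-contained ---
tmp=$(mktemp -d)
mkdir "$tmp/data"
touch -d "100 days ago" "$tmp/data/old.log"   # should be deleted
touch "$tmp/data/new.log"                     # should survive
printf '%s +60\n' "$tmp/data" > "$tmp/param.txt"

while read DIR TIME
do
    # Guard: insist on a +N age field before touching anything
    case "$TIME" in
        +[0-9]*) ;;
        *) echo "skipping '$DIR': bad time field '$TIME'" >&2; continue ;;
    esac
    if [ -d "$DIR" ]
    then
        # -print0 / -0 keeps filenames with spaces intact; -r skips
        # the rm entirely when find matches nothing
        find "$DIR" -type f -mtime "$TIME" -print0 | xargs -0 -r rm -f
    else
        echo "$DIR is not a directory" >&2
    fi
done < "$tmp/param.txt"

ls "$tmp/data"   # only new.log should remain
```

The guard means a malformed or blank TIME field skips the directory with a warning instead of handing find an argument you never intended.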

I think you can use find . -ctime +60 > $file to get the names of all the files more than 60 days old, and then delete all the files listed in $file. (Note the sign: +60 means older than 60 days, while -60 would match files changed within the last 60 days. Also, -ctime tests the inode change time; use -mtime if, as in your script, you care about modification time.)

Or do it all in one go

find <the path you want to check> -ctime +60 -exec rm {} \;

You would want to be very, very sure of where you run this, as most operating system files would be older than 60 days.
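Given that risk, a preview-then-delete pattern is worth the extra step: run the same find with -print first, review the list, and only then swap in the delete action. A minimal sketch (the sandbox directory and file names are made up):

```shell
#!/bin/sh
# Preview-then-delete pattern (sandbox paths are made up for the demo).
target=$(mktemp -d)                        # stand-in for the real path
touch -d "90 days ago" "$target/stale.tmp" # old enough to be removed
touch "$target/fresh.tmp"                  # recent; must survive

# 1. Preview: print what WOULD be removed, and review it
find "$target" -type f -mtime +60 -print

# 2. Only after reviewing the list, swap -print for the delete action
find "$target" -type f -mtime +60 -exec rm -f {} \;

ls "$target"   # fresh.tmp only
```

The two commands differ only in the final action, so whatever the preview prints is exactly what the second command removes.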