I have 63,000 files in a directory [about one day's worth].
I want to move all the 20100219 day files to the /target directory.
I used
$ mv *20100219* /target
but it failed with an "argument list too long" error.
Then I used
$ find . -name "*.txt" | grep "20100221" | xargs -I '{}' mv {} /target
but it takes more than 3 hours to move all the files.
Is there any way to optimise the command so it takes less time?
The same goes for removing:
$ find . -name "*.txt" | grep "20100221" | xargs rm -rf
[For 60,000 files it's taking 20 minutes.]
How can I reduce the time for both commands, or is there any other way to do this?
Please let me know.
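A minimal sketch of the batching idea, assuming GNU mv (for `-t`) and filenames like `a20100219_1.txt`; the `src` and `dst` directory names here are made up for the demo. The `mv *pattern*` form fails because the shell expands the glob into one huge argument list; piping `find -print0` into `xargs -0` instead lets xargs pass as many files as fit into each mv invocation, instead of one mv per file as with `xargs -I '{}'`:

```shell
# Demo setup: a directory of date-stamped files (names are illustrative).
mkdir -p src dst
for i in $(seq 1 200); do : > "src/a20100219_$i.txt"; done

# -print0 / -0 handle filenames with spaces or newlines safely;
# xargs batches many files into each call of GNU mv, -t names the
# destination directory up front so the file list can come last.
find src -maxdepth 1 -name '*20100219*.txt' -print0 |
  xargs -0 mv -t dst
```

This removes both the per-file process spawn and the extra grep stage, which is where most of the 3 hours goes.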
pludi
2
You might get a bit of speed using
find . -type f -name '*20100219*.txt' -exec mv {} /target \;
or, if you have GNU tools,
find . -type f -name '*20100219*.txt' -exec mv -t /target {} +
(the `+` form hands many files to a single mv invocation, but it requires `{}` to come last, so GNU mv's `-t` option is used to name the target directory first).
But in the long run, it's probably better to change whatever application is writing these files to use a more manageable structure.
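One way to get the "more manageable structure" after the fact is to bucket existing files into per-day subdirectories, so a later move or delete is a single operation on one directory. A sketch, assuming filenames embed a YYYYMMDD date (the `flat` and `by_day` names are illustrative):

```shell
# Demo setup: a flat directory of date-stamped files.
mkdir -p flat
: > flat/a20100219b.txt
: > flat/a20100220c.txt

# Extract the 8-digit date from each name (grep -o) and move the file
# into a matching by_day/YYYYMMDD/ subdirectory.
for f in flat/*.txt; do
  day=$(basename "$f" | grep -o '[0-9]\{8\}' | head -n1)
  [ -n "$day" ] || continue
  mkdir -p "by_day/$day"
  mv "$f" "by_day/$day/"
done
```

With this layout, "move all of 20100219" becomes `mv by_day/20100219 /target`, which is instant regardless of file count.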
You are using two pipes to achieve this, which may slow down the process.
Without the pipes we can achieve it with:
find . -maxdepth 1 -iname "*20100221*.txt" -exec mv {} target \;
Here I used the -maxdepth option, because if the target folder is inside the same directory, find will descend into it and throw an error like
mv: `./target/a20100221b.txt' and `target/a20100221b.txt' are the same file
so it is better to use -maxdepth.
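A small reproduction of the point above, with a made-up `demo` directory whose `target` folder sits inside the tree being searched; `-maxdepth 1` keeps find from descending into `target` and trying to move the file onto itself:

```shell
# Demo setup: target directory nested inside the search directory.
mkdir -p demo/target
: > demo/a20100221b.txt

# Without -maxdepth 1, find would later also visit ./target/... and
# mv would complain that source and destination are the same file.
cd demo
find . -maxdepth 1 -iname '*20100221*.txt' -exec mv {} target \;
cd ..
```

An alternative with the same effect is `-path ./target -prune -o -iname '*20100221*.txt' -print`, which skips only the target directory while still recursing elsewhere.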
Nila
4
The mv command itself can do it:
mv *date* /target
Another way:
find . -type f -iname '*20100111*.txt' -exec mv {} /target \;
Both ways should give the move some speed.
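For the removal half of the question, a sketch assuming GNU find: its `-delete` action removes matches inside find itself, with no rm process spawned at all, which is why it is much faster than piping 60,000 names through xargs to rm. The `old` directory name is just for this demo:

```shell
# Demo setup: a directory of date-stamped files to delete.
mkdir -p old
for i in 1 2 3; do : > "old/x20100221_$i.txt"; done

# GNU find's -delete removes each match directly; -maxdepth 1 keeps
# the scan to the top level, matching the original use case.
find old -maxdepth 1 -name '*20100221*.txt' -delete
```

On a POSIX-only system, `find old -maxdepth 1 -name '*20100221*.txt' -exec rm -f {} +` gets most of the same benefit by batching many files per rm call.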