I am writing a script which reads a file line by line and assigns each line to a variable; each line is a file name like 1090373422_4028715212.jpg. I have images with file names in this format in another directory. In my script I want to assign the file name to a variable, then find that file in the other directory and delete it.
Maybe you could try something like this. I am sure there are more efficient ways to do it, especially when there are lots of files and deep directory structures, but it should give you an idea of how it can be done. This works in Bash, and probably in most other shells as well.
With the command line below, the "while read" loop (the < listofnames.txt at the end means that's where the input comes from) takes each line and uses it as an argument to the program find.
When find locates the file in or below ~ (the ~ means your home directory; replace it with something else, /share/pics or whatever), the result is passed as an argument to the program xargs, which then performs an operation on it.
In this case I use ls (list) since it is non-destructive; it's a trial run that confirms which files will be affected. I can replace it with rm once I am certain that it works.
So:
lakris@Lakrix:~/Projects/scripts$ while read filename;do find ~ -type f -name "$filename" -print0 |xargs -0 ls;done < listofnames.txt
/home/lakris/bin/resumed-transactions.txt
/home/lakris/.anjuta/session.properties
lakris@Lakrix:~/Projects/scripts$
Well, if he hasn't fallen asleep... or cursed his lack of taking hourly backups...
-exec would be better than how I used xargs in my example. I was thinking of the possibility of long "lists" in the result, but that doesn't apply to how I wrote it. But I think the OP wants to process a given list of items, so maybe:
while read filename;do find ~ -type f -name "$filename" -exec ls {} \; ;done < listofnames.txt
and when you are sure, replace ls with rm...
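As a side note (not from the posts above, just an assumption about a reasonably modern find): terminating -exec with + instead of \; makes find batch file names onto one command line, much like xargs, but without the pipe. A minimal runnable sketch, using a throwaway directory in place of the real image directory and list file:

```shell
# Demo sandbox; in real use point "dir" at ~ (or /share/pics)
# and "list" at listofnames.txt.
dir=$(mktemp -d); list="$dir/list.txt"
touch "$dir/1090373422_4028715212.jpg"
printf '1090373422_4028715212.jpg\n' > "$list"

# "-exec ls {} +" collects as many matches as fit on one command
# line, so ls (or later rm) is not spawned once per file.
while read -r filename; do
    find "$dir" -type f -name "$filename" -exec ls {} +
done < "$list"
```

Once the listed paths look right, swapping ls for rm performs the actual delete.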
would be better? Using rm -rf there would delete all files (and directories) matching the pattern given in the constant string (DON'T RUN THAT COMMAND!), and I don't think that is what he wanted.
Yep. If it's a list, that would better suit his query. I've also edited the previous post because I noticed it was kinda dangerous for beginners... sorry about that.
When doing bulk removes, I do it in stages: generate the list, examine, then do the remove. Then I can check that I'm aiming for the correct foot One Last Time(tm) before shooting.
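A runnable sketch of that staged approach (the sandbox setup at the top is only for illustration; in real use "dir" would be the image directory and "list" the listofnames.txt from above):

```shell
# Demo sandbox so the sketch runs as-is.
dir=$(mktemp -d); list="$dir/list.txt"
mkdir -p "$dir/pics"
touch "$dir/pics/1090373422_4028715212.jpg" "$dir/pics/keep_me.jpg"
printf '1090373422_4028715212.jpg\n' > "$list"

# Stage 1: generate the list of matches (null-separated, so names
# with spaces survive) without deleting anything.
while read -r filename; do
    find "$dir/pics" -type f -name "$filename" -print0
done < "$list" > "$dir/candidates.bin"

# Stage 2: examine the candidates in human-readable form.
tr '\0' '\n' < "$dir/candidates.bin"

# Stage 3: only after reviewing, actually remove them.
xargs -0 rm -- < "$dir/candidates.bin"
```

Keeping the intermediate file around also means you have a record of exactly what was removed.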
In the case above where the user wants to transform the file names before removing them, sed is your friend. Using ':' instead of the traditional '/' as the delimiter for the 's' command avoids tiresome escaping in the sed expression.
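For example (a hypothetical transform, not one given in the thread): prefixing a bare file name from the list with a directory path. Because ':' is the delimiter of the s command here, the slashes in the path need no backslash escaping:

```shell
# Turn a bare name from the list into a full path under a
# hypothetical /share/pics directory; s:pattern:replacement:
# leaves '/' free to use unescaped.
echo "1090373422_4028715212.jpg" | sed 's:^:/share/pics/:'
```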
Thanks for your help, everybody. My script is working fine now; I wasn't doing it right before.
I wrote this command and it's working fine; it searches for the files by file name in the other directory and deletes them from there.
But just a hint: shouldn't ksh be able to extract substrings internally, so that you don't have to call awk so often? Especially in a loop over a lot of files, you may save SECONDS! Look it up; in Bash it is done with parameter expansion.
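A minimal sketch of those Bash built-in substring operators, assuming the awk calls were pulling pieces out of file names shaped like NUMBER_NUMBER.jpg (the variable names here are just for illustration):

```shell
# All of these are expanded by the shell itself, so no awk
# process is spawned per file.
name="1090373422_4028715212.jpg"

base=${name%.jpg}    # strip shortest matching suffix -> 1090373422_4028715212
first=${base%%_*}    # strip longest suffix from "_"  -> 1090373422
second=${base#*_}    # strip shortest prefix to "_"   -> 4028715212
sub=${name:0:10}     # substring by offset:length     -> 1090373422
echo "$first $second $sub"
```

ksh93 supports the same ${var%pattern}, ${var#pattern}, and ${var:offset:length} forms.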
I still think it's clear enough and probably a lot faster.
But qneill's post is very valid: when developing scripts, it is good to save intermediate results, so you can run the "first" part and check that these really are the files you want to affect, then operate on that set, and so on. It may even be more efficient.