Hi,
What's the best way to find all files under a directory - including ones with spaces in their names - in order to apply a command to each of them? For instance, I want to get a list of files under a directory and generate a checksum for each file.
Here's the csh script:
#!/bin/csh
set files = `find $1 -name '*'`
foreach file ($files)
    set checksum = `sum $file`
    # Ignore directories
    if ($status == 0) then
        echo "$checksum $file"
    endif
end
The problem is that this breaks when filenames contain spaces. Unfortunately, csh is the only scripting language I sort of know. Is there a better way to do this?
Thank you in advance,
Sam
find "$1" -type f -exec sum {} \;
If this is on Linux (i.e. using the GNU utilities) try:
find "$1" -type f -print0 | xargs -0 sum
-print0 tells find to separate the filenames with null bytes instead of line feeds, and the -0 option tells xargs to expect its input in that format.
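As a quick sketch of why the null delimiters matter (the testdir directory and the filename are made up for illustration):

```shell
# Make a scratch directory containing a filename with a space in it.
mkdir -p testdir
echo "hello" > "testdir/my file.txt"

# -print0 emits filenames separated by null bytes; xargs -0 reads them
# back the same way, so the embedded space never splits the name in two.
find testdir -type f -print0 | xargs -0 sum
```

With a plain `find | xargs sum`, that same file would be split into "testdir/my" and "file.txt" and both lookups would fail.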
If you don't have the GNU utilities handy, try this method:
find "$1" -type f | while read file ; do sum "$file" ; done
On a separate note, generally I would avoid using csh unless you have a good reason to use it.
I do have GNU tools and that worked very well. Thanks.
I was wondering if there's a way to print out just the checksums and not the pathnames. I have a huge number of files - and I want to keep the size of the checksum report to a minimum.
Thanks again,
Sam
Do you not need to know which file each checksum belongs to? If not, you can easily pull out just the checksum column by appending | awk '{print $1}' to the pipeline.
Incidentally I would recommend using cksum or md5sum rather than sum, which has a very simplistic checksum algorithm.
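Putting those two suggestions together, a checksum-only report might look like this (the $dir value is a placeholder for whatever directory you're scanning):

```shell
dir=/some/dir   # placeholder: the directory to scan

# cksum prints "<CRC> <byte count> <filename>" for each file;
# awk keeps only the first column, dropping the pathnames.
find "$dir" -type f -print0 | xargs -0 cksum | awk '{print $1}'
```

Bear in mind that without the pathnames you lose the ability to tell which file a changed checksum belongs to, so this is only useful if you just want to detect that *something* changed.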
Thanks for those tips. I tried cksum. It's much faster, which is what I need.