Hi, I am using the find command --
find "directory1" -type f | xargs -i mv {} "directory2"
to avoid the "argument list too long" problem.
But the issue I am facing is that directory1 has subdirectories, and because of this I hit a traversal problem: I do not want to traverse the subdirectories under directory1.
I tried the -maxdepth option as well, but I guess it is not giving me the correct solution.
If anyone can help me find a solution for this...
-maxdepth may not work sometimes... you can try this alternate way:
find ../<<Your Directory>> \( ! -name <<Your Directory>> -prune \) -type f | xargs -i mv {} "directory2"
1 Like
If I understand you correctly, you want to move the files in directory1, but not those in its subdirectories, into directory2?
for i in directory1/* ; do if [ -f $i ] ; then mv $i directory2/ ; fi; done
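The same loop with the variable quoted, to be safe with filenames containing spaces (a sketch using a scratch directory rather than the real directory1/directory2):

```shell
demo=$(mktemp -d)                    # scratch area for the demo
mkdir -p "$demo/src/sub" "$demo/dst"
touch "$demo/src/a file.txt" "$demo/src/sub/deep.txt"
for i in "$demo"/src/* ; do
    if [ -f "$i" ] ; then            # -f skips the sub directory; quotes handle the space
        mv "$i" "$demo/dst/"
    fi
done
ls "$demo/dst"                       # prints: a file.txt
```

Only the regular file at the top level moves; the subdirectory and its contents stay put.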
1 Like
You can do
with GNU find
find "directory1" -maxdepth 1 -type f | xargs -i mv {} "directory2"
or with Unix find
find "directory1" \! -name "directory1" -prune -type f | xargs -i mv {} "directory2"
But the error message looks like there is a bug either in xargs or in mv. Or you have a filename with a * character.
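A quick way to see what -maxdepth 1 changes, using throwaway directories (a sketch):

```shell
d=$(mktemp -d)
mkdir -p "$d/sub"
touch "$d/top.txt" "$d/sub/deep.txt"
find "$d" -maxdepth 1 -type f        # prints only top.txt
find "$d" -type f | wc -l            # prints 2: without -maxdepth both files are found
```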
1 Like
The system allows a maximum of 131072 bytes (the classic Linux ARG_MAX) for command line arguments. If you try to pass more than that, you will be greeted with an error that reads as follows:
Argument list too long
Remember, each character requires one byte of storage.
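For reference, the actual ceiling is the kernel's ARG_MAX, which you can query with getconf (131072 is the classic Linux figure; other systems, and modern Linux kernels, may allow more):

```shell
getconf ARG_MAX    # maximum combined size of argv plus environment, in bytes
```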
1 Like
Minor nit: The parentheses are redundant.
More importantly, the solution is incorrect. If ../<<Your Directory>> itself contains a directory named <<Your Directory>>, it will not be pruned since -name only looks at the basename.
A very simple fix would be to cd into the starting directory and use . as the starting directory.
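A sketch of that fix: with . as the starting point, -name . can never match a subdirectory, so the prune cannot misfire even if a subdirectory shares the parent's name (scratch directories used here for illustration):

```shell
d=$(mktemp -d)
mkdir -p "$d/dir1/dir1"              # nested dir deliberately shares its parent's name
touch "$d/dir1/top.txt" "$d/dir1/dir1/deep.txt"
( cd "$d/dir1" && find . ! -name . -prune -type f )   # prints only ./top.txt
```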
Having pointed out how to prune with find, I endorse Skrynesaver's approach (although I would use quotes to be safe).
Regards,
Alister
1 Like
But with find | xargs it's not the shell.
Either xargs does not know the system limit or mv has a smaller limit than the system.
A work around is
xargs -n 50 ...
providing at maximum 50 arguments to mv.
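A quick illustration of the batching (here echo stands in for mv):

```shell
printf '%s\n' a b c d e | xargs -n 2 echo
# echo is invoked three times, with "a b", "c d", and "e"
```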
BTW, {} should be quoted as "{}". This will handle most special characters alright.
1 Like
I don't think there is any error message. The first post states that find-xargs is used to avoid it and that the problem is preventing a descent into subdirectories.
I think this is a case of a poorly titled thread.
Regards,
Alister
---------- Post updated at 04:22 AM ---------- Previous update was at 04:11 AM ----------
Neither makes sense since the OP's xargs -i passes only one argument per mv.
Ignore the thread title and re-read the first post. I think you've been misled.
Regards,
Alister
---------- Post updated at 04:38 AM ---------- Previous update was at 04:22 AM ----------
Those quotes will not make any difference at all.
xargs will never see them, since the shell will consume them.
To the shell, they are pointless because {} is nothing special (not even to shells which perform brace expansion, which only do so when there's an unquoted comma within the braces).
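This is easy to verify in any POSIX shell:

```shell
echo {}      # prints {} literally: braces with no unquoted comma are not expanded
echo "{}"    # identical output; the quotes make no difference here
echo {a,b}   # prints "a b" only in shells that do brace expansion (e.g. bash)
```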
Regards,
Alister
2 Likes
Thank you all for your quick responses...
alister is right.
"More importantly, the solution is incorrect. If ../<<Your Directory>> itself contains a directory named <<Your Directory>>, it will not be pruned since -name only looks at the basename. "
and Skrynesaver's approach works fine...
A little more help here....
Going forward in the script, I have specified a condition: if directory2 has files in it, then tar those files... The code snippet looks as follows:
if [ "find directory2/*.*" ]
then
tar -czvf $2.tar.gz directory2
mv $2.tar.gz directory3
for i in $1/wrk/*
do if [ -f $i ]
then
rm $i
fi
done
The problem I am facing is that when there are no files in directory2, it still creates a tar file. Is there an alternative to that?
P.S.: We are using "find" because using "ls" would again result in the "argument list too long" error.
If there are no files under directory2, I strictly do not want to create a tar file for it...
The reason that code doesn't complain about too many arguments is that it does absolutely nothing at all: it doesn't run find, it doesn't expand *.*, nothing, nada, zip. *.* is a DOS thing, by the way, not a UNIX one.
The way to avoid too many arguments is not to experiment with quotes and find and backticks until you hit on a form that stops complaining when you jam too many arguments into the shell... If you find such a form, that probably means you broke it, because too many arguments is too many arguments no matter how you cut it. The way to avoid too many arguments is to not jam too many arguments into the shell in the first place.
Here I limit find's output to one filename, which should do it:
if [ -f "`find directory2 -type f | head -n 1`" ]
then
...
fi
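A quick sanity check that this test skips an empty directory (a sketch; echo stands in for the tar step):

```shell
d=$(mktemp -d)                                  # an empty scratch directory
if [ -f "`find "$d" -type f | head -n 1`" ]     # find prints nothing, so [ -f "" ] is false
then
    echo "would tar"
else
    echo "empty, skipping tar"
fi
```

The trick is that head -n 1 caps find's output at a single filename, so the command substitution can never blow past the argument limit, and an empty result makes the -f test false.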
1 Like