Loop to move files in different directories

Hi,

I have various log files in different paths. e.g.

a/b/c/d/e/server.log
a/b/c/d/f/server.log
a/b/c/d/g/server.log
a/b/c/h/e/server.log
a/b/c/h/f/server.log
a/b/c/h/g/server.log
a/b/c/i/e/server.log
a/b/c/i/f/server.log
a/b/c/i/g/server.log

and each of these directories has an archive folder, e.g.

a/b/c/d/e/server.log
a/b/c/d/e/archive/
a/b/c/d/f/server.log
a/b/c/d/f/archive/
a/b/c/d/g/server.log
a/b/c/d/g/archive/

How do I go about moving the server.log files into the respective archive folder in the same directory structure? I could use a find statement along the lines of

find /a/b/c/ '*.log' -exec mv * /archive/ {} \:

??

for d in /a/b/c/*/*/archive/
do
  mv "${d%/archive/}/server.log" "$d"
done
cd /a/b/c

find . -type f -name "*.log" | while read -r file
do
  mv "$file" "${file%/*}/archive"
done

Forgot to mention: this will be run as a daily cron job. I used the code below to run this process, but it deletes all the .log files that are already in the archive folder. How do I get the .log files that are already in the archive folder to stay, and move only the .log files that are not yet in an archive folder into archive?

cd /a/b/c

find . -type f -name "*.log" | while read -r file
do
  mv "$file" "${file%/*}/archive"
done

Thanks

BTW: I hardly ever cd, even in scripts, as I found it hindered my productivity and led to errors. The problem with cd is that commands are then not reusable; cd is really a typing shorthand, and effectively it is in competition with command recall. Absolute paths are not much of a burden if they are not typed.

UNIX ksh life was more consistent, error-free, and better once I used X cut/paste plus command recall and editing: set -o vi/viraw with export HISTSIZE=32767, even archiving .sh_history occasionally for my tools that recall history. I even wrote a vi wrapper so my xterm scroll history (also set really big) is not overwritten and the return code is always zero (so results are not discarded).

Since I never leave $HOME, I can use relative paths for all common things, some through my own sym-links, and absolute paths for less frequent things. All my history is rerunnable. If it needs cd, I do: (cd ... ; .... )
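To illustrate the subshell-cd idiom above: the directory change dies with the subshell, so the rest of the script keeps its original working directory (the /tmp path here is just illustrative):

```shell
# Subshell cd: the parentheses run cd in a child shell, so the
# directory change never affects the calling shell.
before=$PWD
( cd /tmp && pwd > /dev/null )   # work happens in /tmp...
after=$PWD
# ...but the caller never left its original directory.
[ "$before" = "$after" ] && echo "still in $before"
```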

---------- Post updated at 11:20 AM ---------- Previous update was at 11:12 AM ----------

My solution moves one dir at a time, not one file, which might be a bit faster with lower overhead.

It also ensures the target is present. It does not check that the source files are present, but that is neither easy nor worthwhile to script cheaply.

If you get a huge dir, when someone allows too many files to expand the directory inode, the speed difference is very substantial. With my plan, you are going to traverse that directory:

  1. once in find to find the archive/,
  2. once in ksh to glob the source files for the mv command line, and then
  3. once (for every file?) inside mv to find the archive/, and
  4. once per file inside mv, but just far enough through the directory to find that file and overwrite that part of the directory.
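A minimal sketch of the one-mv-per-directory approach, with a throwaway demo tree standing in for the real a/b/c hierarchy (the file names here are made up):

```shell
# One mv per directory: glob all *.log siblings of each archive/
# and move them with a single mv invocation, instead of one mv per file.
root=$(mktemp -d)
mkdir -p "$root/d/e/archive"
: > "$root/d/e/a.log"
: > "$root/d/e/b.log"

for d in "$root"/*/*/archive/
do
  mv "${d%/archive/}"/*.log "$d"
done
```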

You can exclude the archive folder:

find . -type f -name "*.log" | grep -v archive | while read -r file
do
  mv "$file" "${file%/*}/archive"
done
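If your find supports -path (it is in POSIX find), you can let find itself skip anything already under an archive/ directory instead of piping through grep -v, which would also wrongly drop any file merely containing "archive" in its name. A sketch, with a mktemp tree standing in for /a/b/c:

```shell
# Exclude already-archived files inside find via ! -path, rather than grep -v.
root=$(mktemp -d)
mkdir -p "$root/d/e/archive" "$root/d/f/archive"
: > "$root/d/e/server.log"
: > "$root/d/f/server.log"
: > "$root/d/e/archive/old.log"   # already archived; must stay put

cd "$root" || exit 1
find . -type f -name "*.log" ! -path "*/archive/*" | while read -r file
do
  mv "$file" "${file%/*}/archive/"
done
```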

When moving logs (hopefully unattached, i.e., not held open long term), time stamps in the file names are nice, like log_until_$(date '+%Y''%m''%d_%H''%M''%S'). Do a "sleep 1" after each so all logs are uniquely named.

Why the ''? Things like %M% are SCCS metastrings, so the quotes mean these scripts can still be checked in.

Also, cron likes % as a metacharacter so persistently that I wrote a date replacement wrapper script date_cron using ~ for %, just so cron lines could use dated log files from the very start!
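That date_cron wrapper might look like the sketch below; the name and the ~-for-% convention are just this post's invention, not a standard tool:

```shell
# date_cron (hypothetical): lets crontab lines use dated filenames without
# cron's % quirk (cron treats an unescaped % as end-of-command/newline).
# The caller writes ~ wherever % is meant.
date_cron() {
  fmt=$(printf '%s\n' "$1" | tr '~' '%')
  date "$fmt"
}

stamp=$(date_cron '+~Y~m~d_~H~M~S')
echo "$stamp"
```

A crontab line can then redirect into a dated log without any % appearing on the line at all.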

No need for find when all the files you want to move are at the same depth; just glob:

for f in a/b/c/*/*/server.log
do
  mv "$f" "${f%/server.log}/archive/log_until_$(date '+%Y''%m''%d_%H''%M''%S')"
  sleep 1
done
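A self-contained version of that glob loop, with a throwaway demo tree standing in for a/b/c (the sleep is dropped here since the demo files live in distinct directories and cannot collide):

```shell
# Demo of the glob-based move with timestamped archive names;
# the mktemp tree stands in for the real a/b/c hierarchy.
root=$(mktemp -d)
mkdir -p "$root/d/e/archive" "$root/d/f/archive"
: > "$root/d/e/server.log"
: > "$root/d/f/server.log"

for f in "$root"/*/*/server.log
do
  mv "$f" "${f%/server.log}/archive/log_until_$(date '+%Y%m%d_%H%M%S')"
done
```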