script to move logs

Does anyone have a good script to move log files from a cron?

There are a lot of logs, 100 or more, and they are constantly
being written to.

A better way would be to use syslog for that.

Do you not have a logrotate command?

It turns out the problem is a little more serious.

df -h
/ is 51G used
/proc is 0k

du -shd /
/ is 38G used

du -shd /proc
/proc is 18G

Because this is a zone on the root file system, I think
something in another zone or the global zone is causing
the file system to grow, but I can't find it.

The only suspicious directory is /proc.

Does anyone know how to find the culprit or solution here?

/proc does not take up any space; it is a virtual file system.

Do you have a logrotate command?

which logrotate says no.
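Solaris doesn't ship logrotate; its native equivalent is logadm(1M), driven by /etc/logadm.conf. A sketch of registering one of these logs, assuming the path and the 50M/8-copy policy mentioned later in this thread:

```shell
# Hypothetical logadm entry; -w writes it to /etc/logadm.conf,
# -C keeps 8 old copies, -s rotates once the file reaches 50 MB,
# and -c rotates by copy-and-truncate so the writing process
# keeps its open file descriptor on the same inode.
logadm -w /var/app/logs/application1.log -C 8 -s 50m -c
```

Once registered, the root crontab's periodic `logadm` run picks the entry up automatically.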

There are three kinds of logs.

1.) Logs that do not end.

application1.log
application2.log

I am planning to copy these to the backup drive as follows:
cp application1.log /backup/application1.log
cp /dev/null application1.log
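Wrapped in a cron-friendly loop it could look like the sketch below (the demo directories are stand-ins for your real log and backup paths). Note that `cp /dev/null` truncates the file in place without replacing its inode, which matters because the application keeps the file open; lines written between the copy and the truncate are still lost, though.

```shell
#!/bin/sh
# Demo setup: temp dirs stand in for the real log and backup dirs.
LOGDIR=$(mktemp -d); BACKUP=$(mktemp -d)
printf 'some log line\n' > "$LOGDIR/application1.log"
STAMP=$(date +%Y-%m-%d)

for f in "$LOGDIR"/application*.log; do
  [ -f "$f" ] || continue
  # Copy first, then truncate in place. cp /dev/null keeps the
  # same inode, so the writer keeps logging to the same file.
  cp "$f" "$BACKUP/$(basename "$f").$STAMP" && cp /dev/null "$f"
done
```

Date-stamping the backup copy avoids clobbering yesterday's copy on the next run.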

2.) Logs that rotate themselves when they hit 50M, keeping numbered copies:

file.log
file.log.0
file.log.1
...etc...
file.log.7

I won't really be doing anything here, as these logs seem
to be under control.

3.) Others that create daily logs, such as:

file.2008-10-01.log
file.2008-10-02.log

I created a perl script to move all these files to the backup drive
when they are older than three days.
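The same age-based move can be done with find(1) alone; a sketch on a demo tree (real paths would replace the temp dirs, and GNU touch's `-d` is used here only to fake an old timestamp; Solaris touch would need `-t`):

```shell
#!/bin/sh
# Demo: move dated logs older than three days to a backup dir.
LOGS=$(mktemp -d); BACKUP=$(mktemp -d)
touch -d '5 days ago' "$LOGS/file.2008-10-01.log"  # stand-in old log
touch "$LOGS/file.2008-10-04.log"                  # recent log, should stay

# -mtime +3: last modified more than three days ago
find "$LOGS" -name 'file.*.log' -mtime +3 -exec mv {} "$BACKUP" \;
```

One find per run from cron replaces the perl script for this case.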

However, I still can't find the extra 11-12G that df shows and du does not.

du and df measure different things and will not show the same numbers.
df reports allocation at the file system level, which includes blocks
held by deleted-but-still-open files and files hidden underneath mount
points; du only adds up what it can reach by walking the directory
tree. From inside a zone, du also cannot see other zones' files that
live on the same slice, which is exactly your situation.

I was able to use the information from this thread.

I went to the global zone and ran this:

du -akd / | sort -nr > du.out      # every file in KB, biggest first; -d stays on this file system
awk '/\/logs$/' du.out > tmp.out   # keep only paths ending in /logs
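The same pipeline on a small demo tree, for anyone who wants to try it first (the Solaris `-d` flag, which stops du at mount points, is dropped here since GNU du spells that `-x`; the directory names are made up):

```shell
#!/bin/sh
# Demo tree: two apps with logs dirs of very different sizes.
ROOT=$(mktemp -d)
mkdir -p "$ROOT/app1/logs" "$ROOT/app2/logs"
dd if=/dev/zero of="$ROOT/app1/logs/big.log" bs=1024 count=200 2>/dev/null
dd if=/dev/zero of="$ROOT/app2/logs/small.log" bs=1024 count=10 2>/dev/null

du -ak "$ROOT" | sort -nr > du.out   # every file and dir in KB, biggest first
awk '/\/logs$/' du.out > tmp.out     # keep only the .../logs directory totals
```

Because du.out is already sorted, tmp.out lists the logs directories biggest first.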

tmp.out gave me a nice list of all the logs on the root slice, from
biggest to smallest. I did the same for core files and it worked great.
Some were in the global zone; others were in zones on the root slice.