Report with multiple files.

Hi all,
I'm in the process of creating CPU reports.
I've already used `sar` to create a daily file, then monthly reports for CPU usage (which is averaged across all 4 CPUs).
I've now used `cpusar -P ?` (where `?` is the CPU number) to collect individual CPU data, so I have four files for each day, which is great.

The problem I'm trying to work out is how to script the loops to get what details I need.
The filenames for the daily cpusar files look like the following...
-r--r--r-- 1 operator group 4123 Jan 22 23:55 cpu022.1
-r--r--r-- 1 operator group 4123 Jan 22 23:55 cpu022.2
-r--r--r-- 1 operator group 4123 Jan 22 23:55 cpu022.3
-r--r--r-- 1 operator group 4123 Jan 22 23:55 cpu022.4
-r--r--r-- 1 operator group 4123 Jan 23 23:55 cpu023.1
-r--r--r-- 1 operator group 4123 Jan 23 23:55 cpu023.2
-r--r--r-- 1 operator group 4123 Jan 23 23:55 cpu023.3
-r--r--r-- 1 operator group 4123 Jan 23 23:55 cpu023.4
-r--r--r-- 1 operator group 4123 Jan 24 23:55 cpu024.1
-r--r--r-- 1 operator group 4123 Jan 24 23:55 cpu024.2
-r--r--r-- 1 operator group 4123 Jan 24 23:55 cpu024.3
-r--r--r-- 1 operator group 4123 Jan 24 23:55 cpu024.4

The file name is made up of the Julian date and the CPU#.
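For reference, the day-of-year (Julian date) part of those names can be produced directly by the standard `date` command; a quick sketch of building today's four filenames (the `cpuJJJ.N` naming is taken from the listing above):

```shell
# Today's day-of-year, zero-padded to three digits (e.g. 024 on 24 Jan):
jday=$(date +%j)

# Build the four per-CPU filenames for today:
for cpu in 1 2 3 4; do
    echo "cpu$jday.$cpu"
done
```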

I want to create a report similar to the following, cycling backwards 30 days from the current Julian date.

       ----- CPU #1 -----  ----- CPU #2 -----  ----- CPU #3 -----  ----- CPU #4 -----
Date   Avg Peak Peak-Time  Avg Peak Peak-Time  Avg Peak Peak-Time  Avg Peak Peak-Time
01/01  000  000 00:00:00   000  000 00:00:00   000  000 00:00:00   000  000 00:00:00

Any suggestions? At present I've really only got basic scripting skills, and I know little (if any) `awk` or `perl`.

Cameron,

Will you please send/show the content of any of the files you are creating? It would help in working out the solution.

J1yant

J1yant,

Thanks for the reply, but if you have `sar` or `cpusar` then you'll already have an understanding of the output. Fortunately the content is not relevant, as what I'm after is the looping logic needed to retrieve the details.

Cheers,
Cameron

OK, so you have limited scripting skills.

Here's an easy suggestion which won't tax them.

This only works on groups of files from the same month, which is what I'm assuming you want to do.

1) `ls -lt` will list files in timestamp order

2) `grep` will pick out certain months for you

3) `cat` those files into your report

e.g.

ls -lt *[filenames] | grep Jan | awk '{ print $9 }' > temp

produces a list of the files from January

edit temp to look like

cat [file1] > report
cat [file2] >> report
cat [file3] >> report
cat etc.

then simply source temp

OK, there are far more sophisticated ways to do it, but this way just uses `ls` and `grep` (oh, and maybe `awk`).
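The edit-then-source step above can also be automated: have `awk` emit the `cat` commands itself, then run the generated file. A self-contained sketch of that technique (it creates two dummy files so it can run anywhere; on the real data you'd insert the `grep Jan` month filter between `ls` and `awk`, as in the original pipeline):

```shell
# Create two dummy data files so the example is self-contained.
echo "data for cpu 1" > cpu022.1
echo "data for cpu 2" > cpu022.2

# Field 9 of "ls -l" output is the filename on most systems; emit a
# "cat file >> report" line for each file (NF >= 9 skips the "total" line).
ls -lt cpu022.* | awk 'NF >= 9 { print "cat", $9, ">> report" }' > temp

: > report   # start with an empty report
sh temp      # run the generated commands (the "source temp" step)
```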

AreaMan,

Thanks for your reply, but it's far from what I was after.
You may not have read my previous posting correctly.
Where on earth was I going to use `ls -lt *[filenames] | grep Jan | awk '{ print $9 }' > temp` when I'm using Julian dates to form the filenames?

Sorry, my scripting skills are a bit sharper than that.
As I said before - looping logic.
I've got 4 files for each day that I need to loop through, get stats and report.
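A sketch of looping logic along those lines, in plain Bourne shell. The day-number generation is the real part; the inner `awk` extraction is a hypothetical placeholder, since it assumes a `sar -u`-style layout where field 1 is a timestamp and field 2 is a utilisation percentage -- adjust the field numbers to the actual `cpusar` output:

```shell
#!/bin/sh

# julian_days_back START COUNT: print COUNT three-digit Julian day numbers
# counting backwards from START, wrapping into the previous year.
# (Simple 365-day wrap for brevity; leap years are ignored.)
julian_days_back() {
    d=$1 n=$2
    while [ "$n" -gt 0 ]; do
        [ "$d" -lt 1 ] && d=365
        printf '%03d\n' "$d"
        d=$((d - 1)) n=$((n - 1))
    done
}

# Strip leading zeros from today's day-of-year so shell arithmetic
# doesn't treat it as octal (e.g. "024" -> "24").
today=$(date +%j | sed 's/^0*//')

for jday in $(julian_days_back "$today" 30); do
    line="$jday"
    for cpu in 1 2 3 4; do
        file="cpu$jday.$cpu"
        [ -f "$file" ] || { line="$line  ---"; continue; }
        # HYPOTHETICAL extraction: assumes field 2 is a whole-number
        # percentage and field 1 a timestamp; header and "Average" lines
        # may need extra filtering on the real cpusar output.
        stats=$(awk '$2 + 0 == $2 { sum += $2; n++
                         if ($2 > max) { max = $2; when = $1 } }
                     END { if (n) printf "%d %d %s", sum / n, max, when }' "$file")
        [ -n "$stats" ] || stats='---'
        line="$line  $stats"
    done
    echo "$line"
done
```

Run against the directory holding the `cpuJJJ.N` files, this prints one line per day, newest first, with placeholders for missing days.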