Filter data and send email

Need a script. We get a sar file, sa15, with 7 days of data that gets appended to. We want to filter and parse the output so that it contains only the last 48 hours of data, and then email it to our team.

We read it with: sar -f sa15

Please help me with the script, using either sar by itself or sar and awk.

It's urgent.

sar -f sa15 ---- last 48 hours >> output
mailx ---------

The simplest solution would be to calculate how many lines of data would be generated over 48 hours. For example, if sar data is collected every 20 minutes, that is 3 lines per hour, making 144 lines over 48 hours, so you could run:

# sar -f sa15 | grep -vi average | grep -v "^$" | grep -vi restart | tail -144 > sa15.$$

# mailx -s "Last 48 hours sar data at `date`" user@domain.com < sa15.$$

# rm sa15.$$

Note the use of a single > to overwrite any file that was present, so the mail contains only the last 48 hours' data.
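
Since the original post mentioned awk, here is a sketch of the same filter done in a single awk pass; the rolling 144-line buffer replaces tail, and it assumes the same 20-minute sampling as above:

sar -f sa15 | awk '
    tolower($0) ~ /average|restart/ { next }   # case-insensitive, like grep -i
    /^$/                            { next }   # drop blank lines
    { buf[++kept % 144] = $0 }                 # rolling window of 144 lines
    END {
        n = (kept < 144) ? kept : 144
        for (i = kept - n + 1; i <= kept; i++) print buf[i % 144]
    }
' > sa15.$$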

---------- Post updated 22-01-10 at 16:43 ---------- Previous update was 21-01-10 at 21:03 ----------

With a sar output like the sample quoted later in this thread, and to get a report every hour, use a script like:

 
sar -f sa15 | grep -v Average | grep -v "^$" | grep -vi restart | grep -v System | grep -v AIX | tail -144 > sa15.$$
mailx -s "Last 48 hours sar data at `date`" user@domain.com < sa15.$$
rm sa15.$$

Put this in a file, e.g. /usr/local/bin/sar_mail.sh, and make it executable and owned by the user you want to run it as:

# chmod 500 /usr/local/bin/sar_mail.sh
# chown sys /usr/local/bin/sar_mail.sh

Then add a line to the user's crontab to call it every hour. Run:

$ crontab -e

then add this line to the user's crontab:

0 * * * *  /usr/local/bin/sar_mail.sh

This will run the script on the hour every hour.

What format are the dates in?

cfajohnson - Good question; the sar output noorm mailed to me had times only, so you'd have to count back how many times the last timestamp occurs to extract 48 hours.
I'm used to each day being in separate sa(nn) and sar(nn) files.
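
A sketch of that counting-back idea, assuming hourly samples; it matches on HH:MM only because the seconds drift between days in noorm's output, and the sa15.data temporary file name is just illustrative:

sar -f sa15 | grep -v Average | grep -v "^$" | grep -vi restart | grep -v System | grep -v AIX > sa15.data

# HH:MM of the most recent sample
last_hm=`tail -1 sa15.data | cut -c1-5`

# That HH:MM occurs once per day of data; print everything after its
# occurrence two days back, i.e. the last 48 hours of samples
awk -v hm="$last_hm" '
    { line[NR] = $0; if (substr($1, 1, 5) == hm) hit[++n] = NR }
    END {
        start = (n > 2) ? hit[n-2] + 1 : 1
        for (i = start; i <= NR; i++) print line[i]
    }
' sa15.data > sa15.$$

rm sa15.data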

Hi TonyFuller,

I got this reply from my official when I ran your first sar shell script:

---- Reply is ----
The assumption and calculation are not correct. You must look at the top line, where you have the starting date. The next lines have the starting time. Use both of these pieces of info. The sar file will not contain 48 hours of data.
---- End of reply ----

Sar output for you again:

$sar -f sa15
AIX eqrdb26u1 1 6 00CFD5E34C00    01/15/10
System configuration: lcpu=8 ent=2.00 mode=Uncapped 
13:25:18    %usr    %sys    %wio   %idle   physc   %entc
14:25:18      46      13       2      39    1.22    60.9
15:25:18      63      16       2      19    1.62    81.2
16:25:18      47      13       6      34    1.23    61.4
17:25:18      42      12       8      39    1.10    55.1
18:25:18      46      14       7      34    1.22    61.1
19:25:18      26      11       1      63    0.76    38.0
20:25:18      37      11       1      51    0.99    49.6
21:25:18      63      12       1      23    1.54    77.1
22:25:18      42       9       5      43    1.06    52.9
23:25:18      49       9       3      38    1.19    59.5
00:25:18      34       8       2      57    0.85    42.4
01:25:18      45      10       1      44    1.11    55.6
02:25:18      26       7       1      66    0.67    33.7
03:25:18      53      10       0      37    1.28    63.9
04:25:18      27       7       1      66    0.70    34.8
05:25:18      15       6       1      79    0.44    22.1
06:25:18      16       6       1      78    0.46    23.1
07:25:18      31       7       1      61    0.80    39.9
08:25:18      62      11       0      27    1.49    74.5
09:25:18      64      11       1      24    1.53    76.5
10:25:18      28       7       0      65    0.71    35.6
11:25:18      15       5       2      79    0.42    20.9
12:25:18      16       5       1      77    0.46    22.8
13:25:18      20       6       4      69    0.55    27.7
14:25:18      32       8       3      58    0.82    40.9
15:25:18      63      11       1      26    1.50    75.1
16:25:18      31       7       1      61    0.78    39.2
17:25:18      17       5       1      77    0.46    23.0
18:25:18      21       6       1      72    0.56    28.0
19:25:18      17       5       1      76    0.47    23.3
20:25:18      24       6       1      69    0.62    30.9
21:25:18      61      11       1      28    1.46    73.1
22:25:18      60      11       1      28    1.45    72.3
23:25:18      36       7       1      56    0.88    43.8
00:25:18      17       6       1      76    0.49    24.4
01:25:18      16       5       1      78    0.44    22.1
02:25:18      13       5       1      81    0.39    19.5
03:25:18      16       5       1      78    0.44    21.9
04:25:18      18       5       1      76    0.48    24.0
05:25:18      14       5       1      80    0.41    20.5
06:25:18      20       6       2      73    0.53    26.6
07:25:18      33      10       3      54    0.89    44.3
08:25:18      61      11       2      25    1.49    74.4
09:25:18      20       6       0      74    0.54    27.2
10:25:18      16       6       1      77    0.46    22.9
11:25:18      21       6       1      73    0.55    27.6
12:25:18      24       6       0      69    0.64    31.8
13:25:18      17       6       2      76    0.47    23.3
14:25:18      15       5       1      79    0.42    20.9
15:25:18      13       5       0      82    0.38    19.1
16:25:18      11       5       0      84    0.33    16.5
17:25:18      11       5       0      84    0.34    16.8
18:25:18      11       5       1      83    0.34    17.1
19:25:18      11       5       0      84    0.33    16.5
20:25:18      19       5       0      75    0.51    25.3
21:25:18      14       5       0      81    0.39    19.6
22:25:18      18       5       0      76    0.50    24.9
23:25:18      42      11       1      46    1.10    55.1
00:25:18      47      17       1      35    1.34    66.8
01:25:18      51      11       2      35    1.30    64.9
02:25:18      39      12       3      46    1.05    52.6
03:25:18      23       7       1      68    0.64    32.0
04:25:18      23       6       1      70    0.61    30.6
05:25:18      42      12       2      45    1.11    55.3
06:25:18      46      11       5      38    1.17    58.7
07:25:18      36       9       4      51    0.93    46.5
08:25:18      50      10       3      37    1.23    61.4
09:25:19      59      12       2      26    1.46    73.0
10:25:19      30       7       2      61    0.77    38.4
11:25:19      23       6       1      70    0.61    30.4
12:25:19      20       6       0      74    0.53    26.4
13:25:19      26       7       0      67    0.68    34.0
14:25:19      60      11       0      29    1.44    71.9
15:25:19      42       8       0      49    1.03    51.6
16:25:19      28       7       0      65    0.71    35.5
17:25:19      20       6       0      73    0.55    27.7
18:25:19      26       7       0      67    0.68    33.9
19:25:19      22       7       0      71    0.60    29.8
20:25:19      32       7       1      60    0.81    40.4
21:25:19      57      11       0      32    1.38    69.1
22:25:19      48      11       0      41    1.21    60.3
23:25:19      35       7       0      57    0.87    43.6
00:25:19      13       7       0      80    0.43    21.3
01:25:19      17       6       1      77    0.47    23.6
02:25:19      32       9       1      58    0.84    42.1
03:25:19      15       6       0      78    0.46    23.0
04:25:19      18       7       1      75    0.52    25.8
05:25:19      26       9       2      62    0.73    36.3
06:25:19      15       6       1      79    0.43    21.7
07:25:19      21       7       0      72    0.58    29.0
08:25:19      45      10       0      45    1.12    56.1
09:25:19      47      11       1      41    1.19    59.3
10:25:19      26       7       0      66    0.69    34.4
11:25:19      17       6       0      77    0.48    23.9
Average       31       8       1      60    0.81    40.4

It has become a challenge for me to get this done. I will never let you down.

Please help.

Let me know if you need any info.

Official = Teacher or Lecturer methinks.
Is this homework?

Tony Fuller,

This is not homework. "Officials" means my seniors... They want this report. I am a DBA with minimal
o/s knowledge, so I need your help with this.

Customers specifying solutions rather than giving requirements is never good!

If you are getting sar data once an hour rather than every 20 minutes, then you can do the maths to see how many lines of data you want: 48 hours at one line per hour is 48 lines.

Otherwise I will hand over to cfajohnson, whose scripting abilities are far better than mine.

The updated script, knowing that a line of data is added every hour, is:

sar -f sa15 | grep -v Average | grep -v "^$" | grep -vi restart | grep -v System | grep -v AIX | tail -48 > sa15.$$
mailx -s "Last 48 hours sar data at `date`" user@domain.com < sa15.$$
rm sa15.$$
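
You can also derive the tail count from the sampling interval instead of hard-coding it; a sketch, where the INTERVAL value (minutes between samples) is an assumption you would set to match your sadc setup:

INTERVAL=60                          # 60 for hourly samples, 20 for 20-minute
LINES=`expr 48 \* 60 / $INTERVAL`    # lines covering the last 48 hours
sar -f sa15 | grep -v Average | grep -v "^$" | grep -vi restart | grep -v System | grep -v AIX | tail -$LINES > sa15.$$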

Please read the grep man page; look at the -e option.
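
For example, a sketch of the same filter with all the patterns combined into one grep via -e (the pattern list is taken straight from the pipeline above; -i now applies to every pattern, which is harmless for this data):

sar -f sa15 | grep -iv -e Average -e '^$' -e restart -e System -e AIX | tail -48 > sa15.$$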

I hope that this is the current thread (of many) for this topic.

To my mind the key point is in this statement from the O/P. I could be wrong.

This ties up with my assumption that this is an incorrectly set up "sar" data collection and that there are multiple occurrences of the day of the month appended into the sa(nn) data file. There is a strong hint in the quote above that you need to process more than one day's file.
Depending on what time of day the report is run, it would take two (or three) of these sar data files to get 48 hours' data. This is further complicated at month boundaries. It is much too complicated to process in retrospect and requires some forward planning. If my guess is correct, the sa(nn) files will grow as more data is appended.

A logical approach would be to prepare one day's summary data at the end of each day, in a file named after the date in the sar heading. It is then a case of mailing the two most recent files concatenated; see the sketch below.
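
A sketch of that end-of-day approach, run from cron late in the evening; the directory, the /var/adm/sa path (the usual AIX location for the sa files) and the parsing of the date from the heading line are all assumptions:

#!/bin/sh
DIR=/var/tmp/sar_daily
SAFILE=/var/adm/sa/sa`date +%d`
mkdir $DIR 2> /dev/null

# Take the date (e.g. 01/15/10) from the sar heading line and turn it
# into a sortable file name like 20100115
day=`sar -f $SAFILE | awk 'NR == 1 { print $NF }' | awk -F/ '{ printf "20%s%s%s", $3, $1, $2 }'`

sar -f $SAFILE | grep -v Average | grep -v "^$" | grep -vi restart | grep -v System | grep -v AIX > $DIR/$day

# Mail the two most recent daily files (roughly the last 48 hours), oldest first
cd $DIR && cat `ls | sort | tail -2` | mailx -s "Last 48 hours sar data at `date`" user@domain.com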