Need a script. We get a sar file, sa15, with 7 days of data that keeps getting appended. We want to filter and parse the output so that we keep only the last 48 hours of data, then email it to our team.
We read it with: sar -f sa15
Please help me with the script, either using sar itself or sar plus awk.
It's urgent.
sar -f sa15 ---- last 48 hours >> output
mailx ---------
The simplest solution would be to calculate how many lines of data are generated over 48 hours. For example, if sar data is collected every 20 minutes, that is 3 lines per hour, so 48 hours gives 3 x 48 = 144 lines, and you could run:
# sar -f sa15 | grep -vi average | grep -v "^$" | grep -vi restart | tail -144 > sa15.$$
# mailx -s "Last 48 hours sar data at `date`" user@domain.com < sa15.$$
# rm sa15.$$
Note the use of a single > so any file already present is replaced and the mail contains only the last 48 hours' data.
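For what it's worth, the three greps can be collapsed into one awk pass. This is only a sketch assuming the usual sar text layout, and the sar_filter name is mine, not from the thread:

```shell
# sar_filter: drop Average/restart summary lines and blank lines from sar
# output read on stdin. Sketch only - assumes the default sar text layout.
sar_filter() {
  awk 'tolower($0) ~ /average|restart/ { next }  # summary lines
       /^$/ { next }                             # blank lines
       { print }'
}
# Usage: sar -f sa15 | sar_filter | tail -144 > sa15.$$
```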
---------- Post updated 22-01-10 at 16:43 ---------- Previous update was 21-01-10 at 21:03 ----------
With a sar output like the one posted, and to get a report every hour, use a script like:
sar -f sa15 | grep -v Average | grep -v "^$" | grep -vi restart | grep -v System | grep -v AIX | tail -144 > sa15.$$
mailx -s "Last 48 hours sar data at `date`" user@domain.com < sa15.$$
rm sa15.$$
Put this in a file, e.g. /usr/local/bin/sar_mail.sh, then make it executable and owned by the user you want to run it as.
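By way of illustration only (the paths and the hourly schedule are examples, not from the thread), the install steps look like this; a throwaway copy under /tmp is used here so the commands are self-contained:

```shell
# Sketch of the install steps. A stub script is created in /tmp here purely
# so the example is self-contained; use your real path in practice.
script=${TMPDIR:-/tmp}/sar_mail.sh
printf '#!/bin/sh\n# sar filtering + mailx commands go here\n' > "$script"
chmod 755 "$script"                # executable by its owner
# Then, as the user who should run it, add an hourly crontab entry
# (crontab -e), e.g.:
#   0 * * * * /usr/local/bin/sar_mail.sh
```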
cfajohnson - Good question. The sar output noorm mailed to me had times only, so you'd have to count back through occurrences of the last time to extract 48 hours.
I'm used to each day being in separate sann and sarnn files.
I got this reply from my official when I ran your first sar shell script:
---- Reply is ----
The assumption and calculation are not correct. You must look at the top line, which has the starting date; the following lines have only the starting time. Use both of these pieces of info. The sar file will not contain 48 hours of data.
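That date-header point can be handled without counting lines at all: remember the date from each day's header line and convert date-plus-time to epoch seconds. A rough sketch, assuming GNU date, a header line whose last field is the date in MM/DD/YY form, and data rows starting HH:MM:SS (filter_48h is my name for it, not from the thread):

```shell
# filter_48h: keep only sar data rows from the last 48 hours, using the date
# carried in each day's header line. Sketch - assumes GNU date, headers whose
# last field is MM/DD/YY, and data rows beginning HH:MM:SS.
filter_48h() {
  cutoff=$(date -d '48 hours ago' +%s)
  awk -v cutoff="$cutoff" '
    tolower($0) ~ /average|restart/ { next }    # skip summary lines
    /^[A-Za-z]/ && NF > 1 { day = $NF; next }   # header: remember its date
    $1 ~ /^[0-9][0-9]:[0-9][0-9]/ && day != "" {
      cmd = "date -d \"" day " " $1 "\" +%s"    # date + time -> epoch secs
      cmd | getline t; close(cmd)
      if (t + 0 >= cutoff) print day, $0
    }'
}
# Usage: sar -f sa15 | filter_48h | mailx -s "Last 48 hours" user@domain.com
```

Shelling out to date once per row is slow on big files, but for a few hundred sar samples it is good enough and avoids hard-coding any line count.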
Sar output for you again --
I hope that this is the current thread (of many) for this topic.
To my mind the key point is in this statement from the O/P. I could be wrong.
This ties in with my assumption that this is an incorrectly set up sar data collection and that there are multiple occurrences of the day of the month appended into the sa(nn) data file. There is a strong hint in the quote above that you need to process more than one day's file.
Depending on what time of day the report is run, it would take two (or three) of these sar data files to get 48 hours of data. This is further complicated at month boundaries. It is much too complicated to process in retrospect and requires some forward planning. If my guess is correct, the sa(nn) files will keep growing as more data is appended.
A logical approach would be to prepare one day's summary data at the end of each day, in a file named after the date in the sar heading. It is then just a case of mailing the most recent two files concatenated.
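A sketch of that approach (the directory, file names and the last_two helper are all my examples, not from the thread): an end-of-day job writes each day's filtered report to a date-named file, and the mail job concatenates the two newest.

```shell
# Daily-summary sketch. The end-of-day cron job is shown commented out
# because it needs a live sar installation:
#   sar -f /var/adm/sa/sa$(date +%d) > /var/tmp/sar_daily/sar.$(date +%Y%m%d)

# last_two: concatenate the two most recent daily files in the given
# directory. YYYYMMDD names sort chronologically with plain ls, including
# across month boundaries.
last_two() {
  ls -1 "$1"/sar.* | tail -2 | xargs cat
}
# Usage: last_two /var/tmp/sar_daily | mailx -s "Last 48 hours sar data" user@domain.com
```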