My script takes longer to execute than expected

Below is my script, which filters failed logins for existing and non-existing users from each server. The problem is that it takes much longer to execute and complete than expected. Can anyone here edit the script to speed up its execution?

#!/bin/bash

LIST=`cat $1`
for i in $LIST
do
date=`date --date="yesterday" +'%b %d'`

ssh -q root@$i cat /var/log/secure | grep "$date" | grep Failed > failed-logins-$i.txt

    #Splitting the failed-logins in INVALID & VALID 

    cat failed-logins-$i.txt | grep -i invalid > invalid-failed-logins-$i.txt
    cat failed-logins-$i.txt | grep -v invalid > valid-failed-logins-$i.txt

    # Count the valid-failed-logins per host and print

    echo -e "\n************"
    echo -e "\n$i"

    echo -e "\n--Failed Logins for EXISTING Accounts:"
    
    cat valid-failed-logins-$i.txt | awk '{a[$1 FS $2 FS $9 FS "\t"$11]++} END {for (i in a) {print i "  =  " a[i] }}'

    echo -e "\n--Failed Logins for NON-EXISTING Account:"
    
    # Count the invalid-failed-logins per host and print

    cat invalid-failed-logins-$i.txt | awk '{a[$1 FS $2 FS $11 FS "\t"$13]++} END {for (i in a) {print i "  =  " a[i] }}'
 
done

This link would be a start I guess: 10 Tips for Writing Efficient Bash Scripts | HACKTUX

Cheers,
Loïc.

You are executing several external commands for each and every word of the file you are processing. You can reduce the number first by removing all those useless cats and giving the filename as an argument to awk or grep or other commands.
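To make that concrete, here is a sketch of the cat-free middle of the loop, run on a made-up sample log (real lines would come from /var/log/secure); it also uses a[i] in the awk END block so the count actually prints:

```shell
#!/bin/sh
# Sketch only: the same filtering without cat, on a made-up sample log
# (real input would come from /var/log/secure on each host).
cat > failed-logins-sample.txt <<'EOF'
Jan 15 10:00:01 host sshd[101]: Failed password for root from 1.2.3.4 port 22 ssh2
Jan 15 10:00:02 host sshd[102]: Failed password for invalid user bob from 1.2.3.4 port 22 ssh2
Jan 15 10:00:03 host sshd[103]: Failed password for root from 1.2.3.4 port 22 ssh2
EOF

# grep reads the file itself -- no cat pipeline needed
grep -i invalid failed-logins-sample.txt > invalid-failed-logins-sample.txt
grep -v invalid failed-logins-sample.txt > valid-failed-logins-sample.txt

# awk also takes the file name; note a[i] (not a) to print each count
awk '{a[$1 FS $2 FS $9 FS "\t" $11]++}
     END {for (i in a) print i "  =  " a[i]}' valid-failed-logins-sample.txt
```

Each removed cat saves one fork-and-exec per file, which adds up quickly inside a loop over many hosts.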

Thanks cfajohnson and Loïc.. I'll review your tips and also remove those cat commands.

Best Regards.. :)

Worth looking at the size of /var/log/secure and checking whether it is maintained. How old is the oldest entry in that file?

The ssh line is just causing the whole of /var/log/secure to be copied up the network link to a local pipe.
It would be more efficient to run the commands in a script on the remote server and then copy the output files back to the calling server.
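A sketch of that idea, untested against a real host: send the whole filter as one remote command, so only the matching lines come back over the link. The demonstration below runs the same command string locally on a two-line sample file.

```shell
#!/bin/sh
# Sketch: put the whole filter in the remote command, so grep runs on
# the far side and only matching lines cross the network. Real use
# (untested here, same host variable as the original loop):
#
#   ssh -q root@"$i" "grep '$date' /var/log/secure | grep Failed" \
#       > "failed-logins-$i.txt"
#
# Local demonstration of the same command string on a sample file:
date=$(date --date="yesterday" +'%b %d')
log=sample-secure.log
printf '%s 01:02:03 h sshd[1]: Failed password for root from 1.2.3.4\n' "$date"  > "$log"
printf '%s 01:02:04 h sshd[2]: Accepted password for root from 1.2.3.4\n' "$date" >> "$log"
cmd="grep '$date' $log | grep Failed"
sh -c "$cmd"
```

The quoting matters: the double-quoted string is expanded locally into one command that the remote shell executes as a pipeline.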

Hi cfajohnson,

I have this below script from TonyFuller,

sar -f sa15 | grep -v Average | grep -v "^$" | grep -vi restart | grep -v System | grep -v AIX | tail -144 > sa15.$$
mailx -s "Last 48 hours sar data at `date`" user@domain.com < sa15.$$
rm sa15.$$

I have a sar file which I read with sar -f sa15 -- it has 7 days of data..
Each row has a timestamp: the first line tells me the date, and each following line is one hour.
I want to email only the last 48 hours....

$sar -f sa15
AIX eqrdb26u1 1 6 00CFD5E34C00    01/15/10
System configuration: lcpu=8 ent=2.00 mode=Uncapped 
13:25:18    %usr    %sys    %wio   %idle   physc   %entc
14:25:18      46      13       2      39    1.22    60.9
15:25:18      63      16       2      19    1.62    81.2
16:25:18      47      13       6      34    1.23    61.4
17:25:18      42      12       8      39    1.10    55.1
18:25:18      46      14       7      34    1.22    61.1
19:25:18      26      11       1      63    0.76    38.0
20:25:18      37      11       1      51    0.99    49.6
21:25:18      63      12       1      23    1.54    77.1
22:25:18      42       9       5      43    1.06    52.9
23:25:18      49       9       3      38    1.19    59.5
00:25:18      34       8       2      57    0.85    42.4
01:25:18      45      10       1      44    1.11    55.6
02:25:18      26       7       1      66    0.67    33.7
03:25:18      53      10       0      37    1.28    63.9
04:25:18      27       7       1      66    0.70    34.8
05:25:18      15       6       1      79    0.44    22.1
06:25:18      16       6       1      78    0.46    23.1
07:25:18      31       7       1      61    0.80    39.9
08:25:18      62      11       0      27    1.49    74.5
09:25:18      64      11       1      24    1.53    76.5
10:25:18      28       7       0      65    0.71    35.6
11:25:18      15       5       2      79    0.42    20.9
12:25:18      16       5       1      77    0.46    22.8
13:25:18      20       6       4      69    0.55    27.7
14:25:18      32       8       3      58    0.82    40.9
15:25:18      63      11       1      26    1.50    75.1
16:25:18      31       7       1      61    0.78    39.2
17:25:18      17       5       1      77    0.46    23.0
18:25:18      21       6       1      72    0.56    28.0
19:25:18      17       5       1      76    0.47    23.3
20:25:18      24       6       1      69    0.62    30.9
21:25:18      61      11       1      28    1.46    73.1
22:25:18      60      11       1      28    1.45    72.3
23:25:18      36       7       1      56    0.88    43.8
00:25:18      17       6       1      76    0.49    24.4

Could you make changes to the script or suggest a new one?
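If sa15 really holds one row per hour, the change to TonyFuller's script is just tail -48; below is a sketch (merging the five greps into one grep -vE is my own shorthand, and the sar/mailx lines are shown as comments since they need the real system):

```shell
#!/bin/sh
# Sketch: same idea as TonyFuller's script, but tail -48 keeps the last
# 48 hourly rows. Real use needs sar and mailx:
#
#   sar -f sa15 | grep -vE 'Average|^$|restart|System|AIX' | tail -48 > sa15.$$
#   mailx -s "Last 48 hours sar data at `date`" user@domain.com < sa15.$$
#   rm sa15.$$
#
# Demonstration on fake sar-like output: 2 header lines + 60 hourly rows.
{
  echo 'AIX host 1 6 00CFD5E34C00    01/15/10'
  echo 'System configuration: lcpu=8 ent=2.00 mode=Uncapped'
  i=0
  while [ $i -lt 60 ]; do
    echo "$i:25:18  10  5  1  84  0.50  25.0"
    i=$((i+1))
  done
} > sa15.sample
grep -vE 'Average|^$|restart|System|AIX' sa15.sample | tail -48 > sa15.last48
wc -l sa15.last48
```

One caveat: the single grep -vE pattern is case-sensitive, whereas the original used grep -vi for "restart"; add -i or adjust the pattern if your sar output differs.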

I also want to run this script every hour, but without using cron/crontab,
like sleep/autosys/autorep etc., whatever you suggest..

@noorm Hmm, you have two threads of your own open on your issue with sar.

As previously stated, the command "sar -f sa15" is flawed. Looking at your latest example, the file "sa15" is corrupt because it should not contain data for more than one day. It could contain a jumble of data from multiple previous 15ths of the month when sar data collection is not set up correctly on a system.

Methyl,

Data gets appended to the sa15 file automatically..... we have to filter and send only the last 48 hours of data. Every row is hourly data, i.e., tail -48.

How can I schedule it to run every hour without scheduling it in cron, as we don't have access to cron... any other way out (using sleep, autosys, etc.)?
Please suggest....
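One common workaround when cron is unavailable is a background while/sleep loop (autosys would also work if your site has it). A sketch; the report script path is hypothetical, and the runnable demonstration uses 3 one-second iterations instead of hourly runs:

```shell
#!/bin/sh
# Sketch: re-run a job every hour without cron, using a loop + sleep.
# Real use (the script name is hypothetical):
#
#   nohup sh -c 'while true; do /path/to/sar-report.sh; sleep 3600; done' &
#
# Bounded demonstration: 3 iterations, 1-second interval.
n=0
while [ $n -lt 3 ]; do
    echo "run $n" >> runs.log
    n=$((n+1))
    sleep 1
done
```

Note that a sleep loop drifts by each run's duration and dies with your session unless started with nohup (or from an init/inittab entry).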

@noorm
Other correspondents have replied to your questions on your other posts. I've lost the plot. Any chance you can consolidate the whole lot into a private document? While doing this, the lightbulb might light before your project work is due in.

Hi, I am not sure how much of a time gain you are going to get,

but you can give the file name as an argument to grep -- you don't have to cat the file every time. The same goes for awk:

    
    cat failed-logins-$i.txt | grep -i invalid > invalid-failed-logins-$i.txt
    can be => grep -i invalid failed-logins-$i.txt > invalid-failed-logins-$i.txt

    cat failed-logins-$i.txt | grep -v invalid > valid-failed-logins-$i.txt
    can be => grep -v invalid failed-logins-$i.txt > valid-failed-logins-$i.txt
  
    cat valid-failed-logins-$i.txt | awk '{a[$1 FS $2 FS $9 FS "\t"$11]++} END {for (i in a) {print i "  =  " a[i] }}'
  
      can be => awk '{a[$1 FS $2 FS $9 FS "\t"$11]++} END {for (i in a) {print i "  =  " a[i] }}' valid-failed-logins-$i.txt
    
    cat invalid-failed-logins-$i.txt | awk '{a[$1 FS $2 FS $11 FS "\t"$13]++} END {for (i in a) {print i "  =  " a[i] }}'

Correct me if I am wrong.

@noorm & @cfajohnson
Let's stick to noorm's current thread (of many) and unhijack this one.
http://www.unix.com/shell-programming-scripting/128273-filter-data-send-email.html