Get count of multiple words in a single command

Hello Experts,

I have a log file that contains 4 different types of exceptions:
1- Exception
2- Fatal
3- Error
4- Exec

My requirement is to find the count of each type of exception. I tried using a combination of -e and -c, but that doesn't seem to be working:

grep -ec 'Exception' -ec 'Fatal' -ec 'Error' -ec 'Exec' log.txt

Any guess how to do it?

grep builds one overall pattern to match in the file(s) and thus can only yield one count. Use either four separate greps, one per keyword (a sketch follows the awk example below), or try

awk -v KEYS="Exception Fatal Error Exec" '
BEGIN   {for (n=split(KEYS, T); n>0; n--) SRCH[T[n]]    # build the set of keywords
        }
        {for (s in SRCH) if ($0 ~ s) CNT[s]++           # one hit per keyword per line
        }
END     {for (c in CNT) print c, CNT[c]
        }
' file
Fatal 1
Exec 1
Error 1
Exception 1
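
For the first option, a minimal sketch of the four-grep route against the OP's log.txt (note that grep -c counts matching lines, not individual occurrences):

# One grep -c per keyword; -c counts matching lines.
for w in Exception Fatal Error Exec; do
    printf '%s %s\n' "$w" "$(grep -c "$w" log.txt)"
done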

Hi.

Using a pipeline grep -> sort -> uniq:

#!/usr/bin/env bash

# @(#) s1       Demonstrate count of 4 unique strings, grep, sort, uniq.

# Utility functions: print-as-echo, print-line-with-visual-space, debug.
# export PATH="/usr/local/bin:/usr/bin:/bin"
LC_ALL=C ; LANG=C ; export LC_ALL LANG
pe() { for _i;do printf "%s" "$_i";done; printf "\n"; }
pl() { pe;pe "-----" ;pe "$*"; }
em() { pe "$*" >&2 ; }
db() { ( printf " db, ";for _i;do printf "%s" "$_i";done;printf "\n" ) >&2 ; }
db() { : ; }
C=$HOME/bin/context && [ -f $C ] && $C grep sort uniq

FILE=${1-data1}

pl " Input data file $FILE:"
cat "$FILE"

pl " Results:"
grep -Eo "Exception|Fatal|Error|Exec" "$FILE" |
tee f1 |
sort |
tee f2 |
uniq -c

exit 0

producing:

$ ./s1

Environment: LC_ALL = C, LANG = C
(Versions displayed with local utility "version")
OS, ker|rel, machine: Linux, 3.16.0-4-amd64, x86_64
Distribution        : Debian 8.7 (jessie) 
bash GNU bash 4.3.30
grep (GNU grep) 2.20
sort (GNU coreutils) 8.23
uniq (GNU coreutils) 8.23

-----
 Input data file data1:
Now is the time
This is an Exception
for all good men Exec
to come to the aid
Sparkle Farkle Error
of their country.
And now, this: Fatal
Exception, finally.

-----
 Results:
      1 Error
      2 Exception
      1 Exec
      1 Fatal

See files f1, f2 for intermediate data. See man pages for details.
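
Stripped of the tee taps and the helper scaffolding, the core of the pipeline against the OP's log.txt is just:

grep -Eo 'Exception|Fatal|Error|Exec' log.txt | sort | uniq -c

Because -o prints each match on its own line, a line containing two keywords contributes two counts; the sort is needed so that uniq -c can group the identical matches together.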

Best wishes ... cheers, drl


Perl solution for a log file with highly improbable text in line # 9:

$ 
$ cat -n logfile.txt
     1    this is line # 1
     2    this is Exception line
     3    this is line # 3
     4    this is Fatal line
     5    this is line # 5
     6    this is Error line
     7    this is Exec line
     8    this is line # 8
     9    this is Error line and Exception line
    10    this is line # 10
    11    this is Exception line
$ 
$ perl -lne '$h{$1}++ while(/(Exception|Fatal|Error|Exec)/g)}{print "$v $k" while(($k,$v)=each %h)' logfile.txt
1 Exec
2 Error
3 Exception
1 Fatal
$ 
$ 
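
The }{ in the one-liner closes the implicit while (<>) loop that -n wraps around the code and opens a bare block that runs once afterwards. A more conventional spelling of the same thing, as a sketch against the same logfile.txt, uses an explicit END block:

perl -lne '$h{$1}++ while /(Exception|Fatal|Error|Exec)/g;
           END { print "$v $k" while ($k, $v) = each %h }' logfile.txt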

Hi.

The grep->sort->uniq pipeline also works with that:

...
 Input data file data2:
this is line # 1
this is Exception line
this is line # 3
this is Fatal line
this is line # 5
this is Error line
this is Exec line
this is line # 8
this is Error line and Exception line
this is line # 10
this is Exception line

-----
 Results:
      2 Error
      3 Exception
      1 Exec
      1 Fatal
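
One caveat: the awk solution upthread counts at most one hit per keyword per line, whereas grep -Eo also counts repeats of the same keyword within a single line. A sketch of an occurrence-counting awk variant, assuming the keywords contain no regex metacharacters, uses the match count that gsub returns:

awk -v KEYS="Exception Fatal Error Exec" '
BEGIN   {n = split(KEYS, T)}
        {for (i=1; i<=n; i++) CNT[T[i]] += gsub(T[i], T[i])}   # gsub returns the number of matches
END     {for (c in CNT) print CNT[c], c}
' file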

Best wishes ... cheers, drl