Help needed to extract distinct logs from a Log File

Hi,
I urgently need some help extracting distinct entries from a log file.
The log file may contain the same error many times, so how do I count the occurrences of each error and also extract the distinct errors from the file?
E.g.:
I have a file named A.log.
It contains the following entries:
Error 1
Error 2
Error 1
Error 1
Error 2

I want the distinct errors and their occurrences.

What have you tried so far?

---------- Post updated at 04:28 PM ---------- Previous update was at 02:51 PM ----------

@roro, for your attention: Awk to print distinct col values

egrep -v "Error 1|Error 2" logfile

this will remove ALL the entries you don't care about from the log file, and whatever is output will be the lines you haven't seen before.
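For instance, with a small throwaway sample file (the file name and contents here are made up for illustration), inverting the match leaves only the entries not on your known list:

```shell
# Build a small sample log (hypothetical data, for illustration only)
printf 'Error 1\nError 2\nError 1\nError 3\n' > sample.log

# -v inverts the match: everything EXCEPT the known errors is printed
egrep -v "Error 1|Error 2" sample.log
```

Here only "Error 3" survives the filter, since it isn't in the pattern list.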

Here's a quick and dirty for you

#Create your working file. If this isn't a live logfile you can skip this step and just pipe the output of sort on A.log into the next step.
cp A.log A.log.tmp

#Sort for unique values and grep on your common Error text
sort A.log.tmp | uniq | grep "Error" >> my_unique_errors.log
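If you only need the distinct lines and not the counts, sort -u collapses the sort | uniq pair into a single command (file names here are just the ones used above):

```shell
# Distinct "Error" lines in one pass; equivalent to sort | uniq for this use
printf 'Error 1\nError 2\nError 1\nError 1\nError 2\n' > A.log   # sample data
grep "Error" A.log | sort -u > my_unique_errors.log
cat my_unique_errors.log
```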

You can try this :

 awk 'BEGIN{i=1}
{for (j=1;j<i;j++){if($0==a[j]){b[j]+=1;next}};a[i]=$0;b[i]=1;i++}
END{for (j=1;j<i;j++){print a[j]":"b[j]}}' a.log

The output will be like this:

Error 1:3
Error 2:2
Or more simply (no need for cat here, grep takes the file name directly):

grep "Error" file | sort | uniq -c
      3 Error 1
      2 Error 2
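If the log has many distinct errors, piping the counts through a numeric reverse sort ranks them most-frequent first (a minor variation on the pipeline above, using the same sample data):

```shell
# Sample data matching the example in the question
printf 'Error 1\nError 2\nError 1\nError 1\nError 2\n' > A.log

# Count each distinct Error line, then sort numerically, highest count first
grep "Error" A.log | sort | uniq -c | sort -rn
```

The most common error ("3 Error 1" for this sample) comes out on the first line.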