Naive Bayes Algorithm
The Naive Bayes classifier is based on Bayes' theorem, which gives the conditional probability of an event A given an event B.
Recall from conditional probability that, given random variables X and Y, the probability of X given Y can be expressed as:
P(X | Y) = P(X∩Y) / P(Y)
The same can be written for Y given X as:
P(Y | X) = P(Y∩X) / P(X)
Since P(X∩Y) = P(Y∩X), solving both equations gives:
P(X∩Y) = P(X | Y) P(Y) = P(Y | X) P(X)
We can therefore rewrite the conditional probability of X given Y as:
P(X | Y) = P(Y | X) P(X) / P(Y)
This is known as Bayes' theorem. In plain English: the probability of X given Y equals the probability of Y given X, times the probability of X, divided by the probability of Y.
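As a quick numeric illustration, the short Python sketch below applies the formula to made-up numbers (the prior and the two likelihoods are assumptions chosen purely for the example):

```python
# Hypothetical numbers to illustrate Bayes' theorem numerically.
# Suppose 1% of emails are spam, 90% of spam emails contain the word
# "offer", and 5% of non-spam emails contain it.
p_spam = 0.01                 # P(X): prior probability of spam
p_word_given_spam = 0.90      # P(Y | X)
p_word_given_not_spam = 0.05  # P(Y | not X)

# P(Y): total probability of seeing the word "offer"
p_word = p_word_given_spam * p_spam + p_word_given_not_spam * (1 - p_spam)

# Bayes' theorem: P(X | Y) = P(Y | X) P(X) / P(Y)
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(round(p_spam_given_word, 4))  # ~0.1538
```

Even though 90% of spam contains the word, an email containing it is still only about 15% likely to be spam, because spam is rare to begin with. That is the prior at work.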
Where is Naive Bayes used?
Some real-world examples are given below:
- Marking an email as spam or not spam (a minimal sketch follows this list)
- Categorizing a news article as technology, politics, or sports
- Determining whether a piece of text expresses positive or negative sentiment
- Medical diagnosis, e.g., classifying a patient as high risk or low risk for cancer
- Face recognition software that identifies features such as the nose, mouth, and eyes
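For the spam-filtering use case, here is a minimal sketch using scikit-learn's MultinomialNB. The texts and labels are made up for illustration only:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Tiny made-up training set: 1 = spam, 0 = not spam
texts = [
    "win a free prize now",
    "limited offer claim your reward",
    "meeting rescheduled to friday",
    "please review the attached report",
]
labels = [1, 1, 0, 0]

# Convert text to word-count features, then fit a multinomial Naive Bayes model
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)
model = MultinomialNB()
model.fit(X, labels)

# Classify a new message
new_message = vectorizer.transform(["claim your free reward"])
print(model.predict(new_message))  # expected: [1] (spam)
```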
Probability of Tossing Two Coins
The possible outcomes when tossing two coins are {HH, HT, TH, TT}.
P(getting two heads) = 1/4
P(at least one tail) = 3/4
P(second coin is a head given the first coin is a tail) = 1/2
P(getting two heads given the first coin is a head) = 1/2

This example is simple because we can list every outcome and compute each probability directly. We will use it to understand Bayes' theorem.
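Since the four outcomes are equally likely, the probabilities above, and Bayes' theorem itself, can be verified by brute-force enumeration. A minimal sketch:

```python
from itertools import product

# Enumerate the sample space for tossing two fair coins
outcomes = list(product("HT", repeat=2))  # [('H','H'), ('H','T'), ('T','H'), ('T','T')]

def prob(event):
    """Probability of an event, given as a predicate over equally likely outcomes."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

two_heads = lambda o: o == ("H", "H")
at_least_one_tail = lambda o: "T" in o
first_is_head = lambda o: o[0] == "H"

print(prob(two_heads))          # 0.25
print(prob(at_least_one_tail))  # 0.75

# Conditional probability: P(two heads | first coin is a head)
p_cond = prob(lambda o: two_heads(o) and first_is_head(o)) / prob(first_is_head)
print(p_cond)                   # 0.5

# Bayes' theorem check: P(A | B) = P(B | A) P(A) / P(B)
p_b_given_a = prob(lambda o: first_is_head(o) and two_heads(o)) / prob(two_heads)
p_bayes = p_b_given_a * prob(two_heads) / prob(first_is_head)
print(p_bayes)                  # 0.5, matching the direct computation
```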