Probability Foundations and Bayes' Theorem: Introduction

  • Random Event: Any event whose outcome cannot be predicted in advance (Ex: toss of a coin, roll of a die)
  • Random Variable: A variable whose value is determined by the outcome of a random event. It can be:
    1. Discrete
    2. Continuous
  • Probability: The chance of occurrence of a particular outcome in a random event (Ex: heads in a coin toss)
  • Random Experiment: An experiment in which a series of related or unrelated random events is performed (Ex: a coin toss repeated N times, or picking a box at random and then picking a ball from the chosen box)
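
As a small illustration of these terms, here is a minimal sketch in Python (the language, the 10,000 repetitions and the fixed seed are arbitrary choices) that repeats the random event "toss of a fair coin" many times and estimates the probability of the outcome heads:

      import random

      def coin_toss_experiment(n_tosses=10_000, seed=0):
          # Repeat the random event "toss of a fair coin" n_tosses times
          # and estimate the probability of the outcome "H" from the results.
          rng = random.Random(seed)
          heads = sum(rng.choice(["H", "T"]) == "H" for _ in range(n_tosses))
          return heads / n_tosses

      print(coin_toss_experiment())   # close to 0.5 for a fair coin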

Discrete Random Variable

A random variable that can assume only a finite number of distinct values.
Examples – the outcome of a coin toss, the roll of a die.
Consider a random variable x that can take discrete values from the set X = { v1, v2, …, vm } with probabilities pi = P(x = vi). The following conditions must be met:
      pi ≥ 0 for all i
      Σ pi = 1
The set of probabilities {p1, p2, …, pm} can also be expressed as a probability mass function P(x).
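
A minimal sketch of a probability mass function in Python, using the roll of a fair six-sided die as the discrete random variable (the dict encoding is just one possible representation); it verifies the two conditions above:

      # PMF for the roll of a fair six-sided die: each value v in {1, ..., 6} gets p = 1/6.
      pmf = {v: 1 / 6 for v in range(1, 7)}

      # The two conditions on a discrete distribution:
      assert all(p >= 0 for p in pmf.values())         # every pi >= 0
      assert abs(sum(pmf.values()) - 1.0) < 1e-12      # the pi sum to 1

      print(pmf[3])   # P(x = 3) = 1/6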

Continuous Random Variable

A random variable that can assume uncountably many values, e.g., any real number in an interval.
Example – the market price of an item on a particular day.
Its probabilities are described by a density over ranges of values rather than by individual point probabilities; the most familiar case is the normal distribution and its associated z-score calculation.
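
As a rough sketch of how a continuous random variable is handled, the code below defines the normal density and its cumulative probability using only the Python standard library; the mean of 100 and standard deviation of 15 for the market-price example are made-up illustrative numbers:

      import math

      def normal_pdf(x, mu=0.0, sigma=1.0):
          # Density of a normal random variable. Note this is a density, not a
          # probability -- P(X = x) is 0 for any single value of a continuous variable.
          z = (x - mu) / sigma                     # the z-score of x
          return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

      def normal_cdf(x, mu=0.0, sigma=1.0):
          # P(X <= x), computed from the z-score via the error function.
          z = (x - mu) / sigma
          return 0.5 * (1 + math.erf(z / math.sqrt(2)))

      # Example: market price modelled as Normal(mean=100, sd=15); chance it is below 120.
      print(normal_cdf(120, mu=100, sigma=15))   # about 0.909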

Conditional Probability

Conditional probability is the probability of one event occurring given its relationship to one or more other events. For example: event X is that it is raining outside, and there is a 30% chance of rain today.
Event Y is that you will need to go outside, and that has a probability of 50%.
A conditional probability looks at these two events in relation to one another, such as the probability that it is both raining and you need to go outside.
If the two events are independent (neither influences the other), the probability of both occurring is simply the product P(X∩Y) = P(X) * P(Y)
Ex: the outcomes of 2 independent coin tosses (checked in the sketch below)
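
A quick check of the independence rule by enumerating the four equally likely outcomes of 2 coin tosses, with X = "first toss is heads" and Y = "second toss is heads" (events chosen here only for illustration):

      from itertools import product

      # Sample space of two coin tosses; each of the 4 outcomes has probability 1/4.
      outcomes = list(product("HT", repeat=2))

      # X: first toss is heads, Y: second toss is heads -- independent events.
      p_x  = sum(1 for o in outcomes if o[0] == "H") / len(outcomes)
      p_y  = sum(1 for o in outcomes if o[1] == "H") / len(outcomes)
      p_xy = sum(1 for o in outcomes if o[0] == "H" and o[1] == "H") / len(outcomes)

      print(p_xy, p_x * p_y)   # both 0.25, so P(X∩Y) = P(X) * P(Y) holds here
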
If X is dependent on Y, then the probability of X given that Y has occurred is termed the conditional probability P(X | Y). It can be expressed as the fraction of times X and Y occur together out of the times Y occurs, i.e.

      P(X | Y) = P(X∩Y) / P(Y)
      P(Y | X) = P(Y∩X) / P(X)
Here P(X∩Y) and P(Y∩X) are the same: intersection is commutative, so swapping the order of the two events does not change the result.
Rearranging the two equations above gives P(X∩Y) = P(X | Y) * P(Y) = P(Y | X) * P(X) = P(Y∩X), and therefore

P(X | Y) * P(Y) = P(Y | X) * P(X)

P(X | Y) = (P(Y | X) * P(X)) / P(Y) -- this is called Bayes' theorem.
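
A small worked sketch of Bayes' theorem using the pick-a-box-then-pick-a-ball experiment mentioned earlier; the box contents and the 50/50 box choice below are assumed purely for illustration:

      # Box A holds 2 red and 3 blue balls, Box B holds 4 red and 1 blue ball (assumed numbers);
      # a box is picked at random (probability 1/2 each), then a ball is drawn from it.
      p_box = {"A": 0.5, "B": 0.5}
      p_red_given_box = {"A": 2 / 5, "B": 4 / 5}

      # P(red) = sum over boxes of P(red | box) * P(box).
      p_red = sum(p_red_given_box[b] * p_box[b] for b in p_box)

      # Bayes' theorem: P(box | red) = P(red | box) * P(box) / P(red).
      p_box_given_red = {b: p_red_given_box[b] * p_box[b] / p_red for b in p_box}

      print(p_red)            # 0.6
      print(p_box_given_red)  # {'A': 0.333..., 'B': 0.666...}

Given that a red ball was drawn, the chance that it came from Box B rises from 0.5 to about 0.67, which is exactly the kind of update Bayes' theorem quantifies.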