There are three types of probability:
\(\text{Probability of Occurrence} = \large \frac{X}{T}\)
where X = number of ways in which the event occurs and T = total number of possible outcomes.
In a priori probability, the probability of an occurrence is based on prior knowledge of the process involved. For example, the number of ways the event can occur and the total number of possible outcomes are known from the composition of the deck of cards or the faces of the die.
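As an illustration of the a priori approach, here is a minimal sketch in Python (assuming a standard 52-card deck) that applies the formula above to the probability of drawing an ace; the counts 4 and 52 are known in advance from the composition of the deck.

```python
# A priori probability: X and T are known in advance from the deck's composition.
ways_event_occurs = 4    # X: number of aces in a standard deck (assumed)
total_outcomes = 52      # T: total number of cards

probability_of_ace = ways_event_occurs / total_outcomes
print(f"P(ace) = {probability_of_ace:.4f}")  # 4/52, about 0.0769
```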
In the empirical probability approach, the probabilities are based on observed data, not on prior knowledge of the process. For example, surveys are often used to generate empirical probabilities.
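A rough sketch of the empirical approach, using simulated coin tosses in place of survey data (the sample size of 1,000 is an arbitrary choice for illustration):

```python
import random

# Empirical probability: the estimate comes from observed data, not from
# prior knowledge of the process. Here the "data" are simulated tosses.
random.seed(0)
tosses = [random.choice(["H", "T"]) for _ in range(1000)]

empirical_p_heads = tosses.count("H") / len(tosses)
print(f"Empirical P(heads) is roughly {empirical_p_heads:.3f}")  # close to 0.5
```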
Subjective probability differs from the other two approaches because it varies from person to person, hence the name subjective. This probability is usually based on an individual's past experience, personal opinion, and analysis of a particular situation. Subjective probability is especially useful in making decisions in situations in which you cannot use a priori probability or empirical probability.
An experiment refers to an activity or measurement that results in an outcome.
Example: Tossing a single coin 50 times.
Each possible outcome of a variable is referred to as an event.
In a sample space containing two or more events, the events are equally likely if each has the same chance of occurring.
Example: In a coin-tossing experiment, the chance of getting a head or a tail is 1/2 each.
If only one outcome can occur at a time, the events are said to be mutually exclusive.
Example: In a coin-tossing experiment, only one outcome, heads or tails, can occur at a time.
The result of a random experiment is called an outcome.
Example: In a coin-tossing experiment, the two outcomes are head and tail.
When you toss a coin, the possible outcomes are either head or tail. Each of these is called a simple event.
A joint event has two or more characteristics
Getting two heads when you toss a coin twice is an example of a joint event because it consists of heads on the first toss and heads on the second toss
The complement of event A is represented by \(\bf A^\prime\)
In a coin toss, the complement of a head is a tail, since a tail is the only outcome that is not a head.
The collection of all possible events is called the sample space
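To tie these definitions together, here is a small sketch (assuming a fair coin tossed twice) that enumerates the sample space and picks out a joint event and its complement:

```python
from itertools import product

# Sample space for tossing a coin twice: all possible ordered outcomes.
sample_space = list(product(["H", "T"], repeat=2))
print(sample_space)  # [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]

# A joint event: heads on the first toss AND heads on the second toss.
two_heads = [o for o in sample_space if o == ("H", "H")]

# The complement of "two heads": every outcome that is not ('H', 'H').
not_two_heads = [o for o in sample_space if o != ("H", "H")]
print(len(two_heads), len(not_two_heads))  # 1 and 3
```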
Simple probability refers to the probability of occurrence of a simple event, P(A).
\(\text{Probability of Occurrence} = \large \frac{\text{Number of ways in which the event occurs}}{\text{Total number of possible outcomes}}\)
Whereas simple probability refers to the probability of occurrence of a simple event, joint probability refers to the probability of an occurrence involving two or more events. For example, the probability that you will get heads on the first toss of a coin and heads on the second toss is a joint probability.
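As a quick sketch contrasting the two, this example uses a standard 52-card deck (an assumed example rather than the coin tosses above):

```python
# Simple vs. joint probability with a priori counts from a 52-card deck.
total_cards = 52

# Simple probability: P(ace), a single characteristic.
p_ace = 4 / total_cards

# Joint probability: P(ace and spade), two characteristics at once
# (only the ace of spades satisfies both).
p_ace_and_spade = 1 / total_cards

print(round(p_ace, 4), round(p_ace_and_spade, 4))  # 0.0769 and 0.0192
```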
The marginal probability of an event consists of a set of joint probabilities.
For example, if B consists of two events \(\bf B_1\) and \(\bf B_2\), then P(A), the probability of event A, consists of the joint probability of event A occurring with event \(\bf B_1\) and the joint probability of event A occurring with event \(\bf B_2\).
Marginal Probability: \(P(A) = P(A \text{ and } B_1) + P(A \text{ and } B_2) + \cdots + P(A \text{ and } B_k)\)
where \(B_1, B_2, \ldots, B_k\) are k mutually exclusive and collectively exhaustive events.
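A minimal sketch of this sum, using made-up joint probabilities purely for illustration:

```python
# Marginal probability as a sum of joint probabilities. B1 and B2 are assumed
# to be mutually exclusive and collectively exhaustive; values are hypothetical.
p_A_and_B1 = 0.20
p_A_and_B2 = 0.15

p_A = p_A_and_B1 + p_A_and_B2
print(round(p_A, 2))  # 0.35
```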
The general addition rule: P(A or B) = P(A) + P(B) - P(A and B)
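A short sketch of the general addition rule, again assuming a standard 52-card deck:

```python
# General addition rule: P(ace or heart) = P(ace) + P(heart) - P(ace and heart).
p_ace = 4 / 52
p_heart = 13 / 52
p_ace_and_heart = 1 / 52   # the ace of hearts is counted in both events

p_ace_or_heart = p_ace + p_heart - p_ace_and_heart
print(round(p_ace_or_heart, 4))  # 16/52, about 0.3077
```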
Conditional probability refers to the probability of the occurrence of event A, given information about the occurrence of another event B.
\(P(A|B) = \large \frac {P(A\cap B)}{P(B)}\)
\(P(B|A) = \large \frac {P(A\cap B)}{P(A)}\)
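A sketch of both conditional probabilities, with made-up values for P(A), P(B), and P(A and B):

```python
# Conditional probability: P(A|B) = P(A and B) / P(B), and similarly for P(B|A).
p_A = 0.30
p_B = 0.40
p_A_and_B = 0.12   # hypothetical joint probability

p_A_given_B = p_A_and_B / p_B
p_B_given_A = p_A_and_B / p_A
print(round(p_A_given_B, 3), round(p_B_given_A, 3))  # 0.3 and 0.4
```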
Two events, A and B, are said to be independent if and only if
P(A|B) = P(A)
where,
P(A|B) = Conditional probability of A given B
P(A) = Marginal probability of A
The general multiplication rule states that the probability of A and B is equal to the probability of A given B times the probability of B:
\(P(A\cap B) = P(A|B) * P(B)\)
The multiplication rule for independent events is derived by substituting P(A) for P(A|B) in the equation above:
\(P(A\cap B) = P(A) * P(B)\)
If this rule holds for two events, A and B, then A and B are independent.
Therefore, there are two ways to determine independence: check whether P(A|B) = P(A), or check whether \(P(A\cap B) = P(A) * P(B)\).
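Both checks are shown in the sketch below, reusing the illustrative numbers from the conditional-probability example above (for those numbers the events happen to be independent):

```python
# Two equivalent independence checks with hypothetical probabilities.
p_A, p_B = 0.30, 0.40
p_A_and_B = 0.12

# Check 1: P(A|B) equals P(A).
check_1 = abs(p_A_and_B / p_B - p_A) < 1e-9

# Check 2: P(A and B) equals P(A) * P(B).
check_2 = abs(p_A_and_B - p_A * p_B) < 1e-9

print(check_1, check_2)  # True True, so A and B are independent here
```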
Using the general multiplication rule, the marginal probability of A can also be written as
\(P(A) = P(A|B_1) * P(B_1) + P(A|B_2) * P(B_2) + \cdots + P(A|B_k) * P(B_k)\)
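A minimal sketch of this sum (sometimes called the law of total probability), with made-up conditional probabilities:

```python
# Marginal probability via the multiplication rule, with hypothetical values.
p_B = [0.5, 0.3, 0.2]              # P(B1), P(B2), P(B3): exclusive and exhaustive
p_A_given_B = [0.10, 0.40, 0.70]   # P(A|B1), P(A|B2), P(A|B3)

p_A = sum(pa_b * pb for pa_b, pb in zip(p_A_given_B, p_B))
print(round(p_A, 2))  # 0.1*0.5 + 0.4*0.3 + 0.7*0.2 = 0.31
```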
Bayes' theorem gives the conditional probability of event \(A_i\) given that event B has occurred:
\(P(A_i|B) = \large \frac{P(A_i) * P(B|A_i)}{\sum_{j=1}^{n} P(A_j) * P(B|A_j)}\normalsize, \ i = 1, 2, \ldots, n\)
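A sketch of Bayes' theorem with hypothetical priors P(A_i) and likelihoods P(B|A_i):

```python
# Bayes' theorem: P(Ai|B) = P(Ai) * P(B|Ai) / sum_j P(Aj) * P(B|Aj).
p_A = [0.5, 0.3, 0.2]              # hypothetical prior probabilities P(A1..A3)
p_B_given_A = [0.10, 0.40, 0.70]   # hypothetical likelihoods P(B|A1..A3)

denominator = sum(pa * pb for pa, pb in zip(p_A, p_B_given_A))
posteriors = [pa * pb / denominator for pa, pb in zip(p_A, p_B_given_A)]
print([round(p, 3) for p in posteriors])  # [0.161, 0.387, 0.452], sums to 1
```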