2.2 The multiplication principle

Chain rule:

The chain rule converts an intersection “$\cap$” into a conditional probability “$A|B$”.

$$ \mathbb P(A\cap B\cap C)=\mathbb P(A | B\cap C)\mathbb P(B\cap C)=\mathbb P(A | B\cap C)\mathbb P(B | C)\mathbb P(C) $$
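As a sanity check, the chain rule can be verified on any small joint distribution. The sketch below (Python, with a hypothetical joint table over three binary events $A$, $B$, $C$; the numbers are illustrative only) confirms that $\mathbb P(A\cap B\cap C)=\mathbb P(A | B\cap C)\mathbb P(B | C)\mathbb P(C)$.

```python
# Minimal numerical check of the chain rule on a hypothetical joint distribution.
# Outcomes are triples (a, b, c) with a, b, c in {0, 1}; the probabilities are made up.
joint = {
    (1, 1, 1): 0.10, (1, 1, 0): 0.15, (1, 0, 1): 0.05, (1, 0, 0): 0.20,
    (0, 1, 1): 0.12, (0, 1, 0): 0.08, (0, 0, 1): 0.13, (0, 0, 0): 0.17,
}

def prob(pred):
    """P(event), where the event is given as a predicate on outcomes."""
    return sum(p for outcome, p in joint.items() if pred(outcome))

p_abc = prob(lambda o: o == (1, 1, 1))              # P(A ∩ B ∩ C)
p_bc  = prob(lambda o: o[1] == 1 and o[2] == 1)     # P(B ∩ C)
p_c   = prob(lambda o: o[2] == 1)                   # P(C)

p_a_given_bc = p_abc / p_bc                         # P(A | B ∩ C)
p_b_given_c  = p_bc / p_c                           # P(B | C)

# Chain rule: P(A ∩ B ∩ C) = P(A | B ∩ C) P(B | C) P(C)
assert abs(p_abc - p_a_given_bc * p_b_given_c * p_c) < 1e-12
```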

Law of total probability:

Let $\mathbb P$ be a probability on $\Omega$ and let $\{C_1, C_2, \cdots, C_n\}$ be a partition of $\Omega$ chosen so that $\mathbb P(C_i) \neq 0$ for all $i = 1, 2, \cdots, n$.

Then for any event $A \subseteq \Omega$,

$$ \begin{align*} \mathbb P(A) &= \mathbb P(A|C_1)\mathbb P(C_1) + \mathbb P(A|C_2)\mathbb P(C_2) + \cdots + \mathbb P(A|C_n)\mathbb P(C_n)\\ &= \sum \limits^n_{i=1}\mathbb P(A|C_i)\mathbb P(C_i) \end{align*} $$
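A quick numerical illustration (the values are hypothetical, chosen only to show the computation): weight each conditional probability $\mathbb P(A|C_i)$ by $\mathbb P(C_i)$ and sum over the partition.

```python
# Law of total probability: P(A) = sum_i P(A | C_i) P(C_i) for a partition {C_1, ..., C_n}.
# Hypothetical values for a three-cell partition.
p_C = [0.5, 0.3, 0.2]          # P(C_1), P(C_2), P(C_3) -- must sum to 1
p_A_given_C = [0.9, 0.4, 0.1]  # P(A | C_1), P(A | C_2), P(A | C_3)

p_A = sum(pa * pc for pa, pc in zip(p_A_given_C, p_C))
print(p_A)  # 0.9*0.5 + 0.4*0.3 + 0.1*0.2 = 0.59
```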


2.3 The law of total probability

Let $\Omega$ be the sample space of the experiment, and let $A$ and $B$ be events with $0 < \mathbb P(B) < 1$. Since $\{B, B^c\}$ is a partition of $\Omega$, the law of total probability gives

$$ \mathbb P(A) = \mathbb P(A|B)\mathbb P(B) + \mathbb P(A|B^c)\mathbb P(B^c) $$


2.4 Bayes’ formula

$$ \mathbb P (C_i|A) = \frac{\mathbb P (A|C_i)\mathbb P (C_i)}{\sum^n_{j=1}\mathbb P (A|C_j)\mathbb P (C_j)} $$

Situation: the conditional probabilities $\mathbb P(A|C_i)$ and the probabilities $\mathbb P(C_i)$ of the partition are known, and we want the reversed conditional probability $\mathbb P(C_i|A)$.


Since the law of total probability gives $\mathbb P(A) = \mathbb P(A|B)\mathbb P(B) +\mathbb P(A|B^c)\mathbb P(B^c)$ in the two-event case and $\mathbb P(A) = \sum^n_{j=1}\mathbb P(A|C_j)\mathbb P(C_j)$ in general, the definition of conditional probability yields

$$ \mathbb P (C_i|A) = \frac{\mathbb P(A \cap C_i)}{\mathbb P(A)} = \frac{\mathbb P (A|C_i)\mathbb P (C_i)}{\mathbb P (A)} = \frac{\mathbb P (A|C_i)\mathbb P (C_i)}{\sum^n_{j=1}\mathbb P (A|C_j)\mathbb P (C_j)} $$
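A worked sketch of the two-event case $\{B, B^c\}$, using hypothetical numbers for the prior $\mathbb P(B)$ and the conditionals $\mathbb P(A|B)$, $\mathbb P(A|B^c)$: the denominator is exactly the law of total probability.

```python
# Bayes' formula in the two-event case: P(B | A) = P(A | B) P(B) / P(A),
# with P(A) = P(A | B) P(B) + P(A | B^c) P(B^c).  All numbers are hypothetical.
p_B = 0.01            # prior P(B)
p_A_given_B = 0.95    # P(A | B)
p_A_given_Bc = 0.05   # P(A | B^c)

p_A = p_A_given_B * p_B + p_A_given_Bc * (1 - p_B)   # law of total probability
p_B_given_A = p_A_given_B * p_B / p_A                # Bayes' formula

print(p_B_given_A)    # ≈ 0.161: the posterior is much smaller than P(A | B)
```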


2.5 Independent events

Disjoint & Independent: these are different properties. Events $A$ and $B$ are disjoint if $A \cap B = \varnothing$, so $\mathbb P(A \cap B) = 0$; they are independent if $\mathbb P(A \cap B) = \mathbb P(A)\mathbb P(B)$. In particular, disjoint events with positive probability are never independent.
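A small check that the two notions differ, using two fair dice as an assumed example: with $A$ = {first die even}, $B$ = {first die odd}, $C$ = {second die even}, the pair $(A, B)$ is disjoint but not independent, while $(A, C)$ is independent but not disjoint.

```python
from itertools import product
from fractions import Fraction

# Sample space: two fair dice; every one of the 36 outcomes has probability 1/36.
omega = list(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] % 2 == 0}   # first die even
B = {w for w in omega if w[0] % 2 == 1}   # first die odd
C = {w for w in omega if w[1] % 2 == 0}   # second die even

# Disjoint but NOT independent: P(A ∩ B) = 0, while P(A) P(B) = 1/4.
assert A & B == set() and P(A & B) != P(A) * P(B)

# Independent but NOT disjoint: P(A ∩ C) = 1/4 = P(A) P(C), and A ∩ C ≠ ∅.
assert P(A & C) == P(A) * P(C) and A & C != set()
```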