Notes on probability basics
Here are some probability concepts that we will use frequently.
Expectations.
Some useful properties:
- Linearity. $\mathbb{E}[\sum_j c_j g_j(X)] = \sum_j c_j \mathbb{E}[g_j(X)]$, where the $c_j$'s are constants, not random variables.
- Independence. If $X, Y$ are independent random variables, i.e. $P(X \in A, Y \in B) = P(X \in A) P(Y \in B)$ for all $A, B$, then $\mathbb{E}[XY] = \mathbb{E}[X] \mathbb{E}[Y]$.
- Iterated Expectation. \(\mathbb{E}[Y] = \mathbb{E}[\mathbb{E}[Y \mid X]] = \int \mathbb{E}[Y \mid X = x]\, p(x)\, dx\)
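The three properties above can be verified exactly on small finite distributions. A minimal sketch (the distributions `px`, `py`, and `py_given_x` are arbitrary illustrative choices, not from the note):

```python
import itertools

# Illustrative discrete distributions: value -> probability.
px = {0: 0.3, 1: 0.7}
py = {1: 0.5, 2: 0.5}

def E(dist, g=lambda v: v):
    """Expectation of g(V) under a finite distribution {value: prob}."""
    return sum(p * g(v) for v, p in dist.items())

# Linearity: E[2X + 3X^2] = 2 E[X] + 3 E[X^2].
lhs = E(px, lambda x: 2 * x + 3 * x**2)
rhs = 2 * E(px) + 3 * E(px, lambda x: x**2)
assert abs(lhs - rhs) < 1e-12

# Independence: E[XY] = E[X] E[Y] under the product distribution.
exy = sum(px[x] * py[y] * x * y for x, y in itertools.product(px, py))
assert abs(exy - E(px) * E(py)) < 1e-12

# Iterated expectation: E[Y] = sum_x p(x) E[Y | X = x],
# with a conditional distribution p(y | x) chosen for illustration.
py_given_x = {0: {1: 0.9, 2: 0.1}, 1: {1: 0.2, 2: 0.8}}
ey = sum(px[x] * q * y for x in px for y, q in py_given_x[x].items())
ey_iter = sum(px[x] * E(py_given_x[x]) for x in px)
assert abs(ey - ey_iter) < 1e-12
```

Working over explicit finite distributions keeps the checks exact, so no Monte Carlo sampling tolerance is needed.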
Variance. The variance of a random variable is: \(Var[X] = \mathbb{E}[(X - \mathbb{E}[X])^2] = \mathbb{E}[X^2] - {\mathbb{E}[X]}^2\)
Some useful properties:
- Independence. If $X_1, …, X_n$ are independent, then $Var[\sum_i a_iX_i] = \sum_i a_i^2 Var[X_i]$.
- Law of total variance. $Var[Y] = Var[\mathbb{E}[Y|X]] + \mathbb{E}[Var[Y|X]]$.
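Both variance properties can also be checked exactly on small discrete distributions. A sketch, with illustrative distributions of my own choosing:

```python
# Illustrative discrete distributions: value -> probability.
px = {0: 0.4, 1: 0.6}
pz = {0: 0.5, 4: 0.5}
py_given_x = {0: {0: 0.5, 2: 0.5}, 1: {1: 0.3, 3: 0.7}}

def E(dist, g=lambda v: v):
    return sum(p * g(v) for v, p in dist.items())

def V(dist):
    """Var[V] = E[V^2] - E[V]^2 over a finite distribution."""
    return E(dist, lambda v: v**2) - E(dist)**2

# Independence: Var[2X + 3Z] = 4 Var[X] + 9 Var[Z] for independent X, Z.
psum = {}
for x, p in px.items():
    for z, q in pz.items():
        v = 2 * x + 3 * z
        psum[v] = psum.get(v, 0.0) + p * q
assert abs(V(psum) - (4 * V(px) + 9 * V(pz))) < 1e-12

# Law of total variance: Var[Y] = Var[E[Y|X]] + E[Var[Y|X]].
py = {}  # marginal of Y, obtained by summing out X
for x, p in px.items():
    for y, q in py_given_x[x].items():
        py[y] = py.get(y, 0.0) + p * q
cond_mean = {x: E(py_given_x[x]) for x in px}
var_of_mean = E(px, lambda x: cond_mean[x]**2) - E(px, lambda x: cond_mean[x])**2
mean_of_var = E(px, lambda x: V(py_given_x[x]))
assert abs(V(py) - (var_of_mean + mean_of_var)) < 1e-12
```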
Markov’s inequality. If $X$ is a non-negative random variable, then for any $\epsilon > 0$: \(P[X \geq \epsilon] \leq \frac{\mathbb{E}[X]}{\epsilon}\)
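Markov's inequality can likewise be verified for every threshold at once on a small non-negative distribution (the distribution below is an illustrative choice):

```python
# Illustrative non-negative discrete distribution: value -> probability.
px = {0: 0.5, 1: 0.3, 10: 0.2}
ex = sum(p * v for v, p in px.items())  # E[X] = 2.3

# Check P[X >= eps] <= E[X] / eps for several thresholds.
for eps in (0.5, 1.0, 5.0, 10.0):
    tail = sum(p for v, p in px.items() if v >= eps)
    assert tail <= ex / eps + 1e-12
```

Note that the bound is only informative when $\epsilon > \mathbb{E}[X]$; for smaller $\epsilon$ the right-hand side exceeds 1 and the inequality holds trivially.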
