statistical_review
Rules for Variance
$$\DeclareMathOperator{\Var}{Var} \DeclareMathOperator{\Cov}{Cov} \DeclareMathOperator{\Corr}{Corr} $$
- The variance of a constant is zero.
- $ \sigma^{2}_{c} = \Var(c) = 0 $
- Adding a constant value, c, to a variable does not change the variance (because the expectation increases by the same amount).
- $ \sigma^{2}_{X+c} = \Var(X+c) = E[((X + c) - E(X + c))^{2}] = E[(X - E(X))^{2}] = \Var(X) $
- Multiplying a variable by a constant value, c, multiplies the variance by the square of the constant, c.
- $ \sigma^{2}_{cX} = \Var(cX) = c^{2}\Var(X) $
- The variance of the sum of two or more random variables equals the sum of their variances only when the random variables are independent (see the numerical check after this list).
- $ \Var(X+Y) = \Var(X) + 2\Cov(X,Y) + \Var(Y) $
- When X and Y are independent, $\Cov(X,Y) = 0$, so
- $ \Var(X+Y) = \Var(X) + \Var(Y) $
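The rules above can also be checked numerically. The following is a minimal sketch in Python; the use of NumPy, the constant c = 3, and the sample size n = 100,000 are illustrative assumptions and not part of the original page. It compares each rule against sample variances of simulated data.

<code python>
# Numerical check of the variance rules; the constant c and the
# distributions of X and Y are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, c = 100_000, 3.0

x = rng.normal(loc=5.0, scale=2.0, size=n)   # Var(X) is about 4
y = rng.normal(loc=1.0, scale=1.5, size=n)   # drawn independently of X, Var(Y) about 2.25

print(np.var(c * np.ones(n)))                # Var(c) = 0
print(np.var(x + c), np.var(x))              # Var(X + c) = Var(X)
print(np.var(c * x), c**2 * np.var(x))       # Var(cX) = c^2 Var(X)

# General case: Var(X + Y) = Var(X) + 2 Cov(X, Y) + Var(Y); this identity
# holds exactly for sample moments when the same denominator (ddof = 0) is used.
cov_xy = np.cov(x, y, bias=True)[0, 1]
print(np.var(x + y), np.var(x) + 2 * cov_xy + np.var(y))

# For independent X and Y the covariance term is near zero, so
# Var(X + Y) is approximately Var(X) + Var(Y).
print(np.var(x + y), np.var(x) + np.var(y))
</code>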