statistical_review
Rules for Variance
$$ \DeclareMathOperator{\Var}{Var} \DeclareMathOperator{\Cov}{Cov} \DeclareMathOperator{\Corr}{Corr} $$
- The variance of a constant is zero.
- $ \sigma^{2}_{c} = \Var(c) = 0 $
- Adding a constant value, c, to a variable does not change the variance (the expectation shifts by the same amount, so deviations from the mean are unchanged).
- $ \sigma^{2}_{X+c} = \Var(X+c) = E[((X+c)-E(X+c))^{2}] = E[(X-E(X))^{2}] = \Var(X) $
- Multiplying a variable by a constant value, c, multiplies the variance by the square of the constant.
- $ \sigma^{2}_{cX} = \Var(cX) = c^{2}\Var(X) $
- The variance of the sum of two or more random variables is equal to the sum of each of their variances only when the random variables are independent.
- $\Var(X+Y) = \Var(X) + 2 \Cov(X,Y) + \Var(Y) $
- When X and Y are independent, $\Cov(X,Y) = 0$, so the middle term vanishes:
- $\Var(X+Y) = \Var(X) + \Var(Y) $
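The four rules above can be checked numerically with a quick simulation. The sketch below uses NumPy with an illustrative sample size, distributions, and constant (all assumptions, not from the text); the biased (population-style) variance and covariance are used so that the identity $\Var(X+Y) = \Var(X) + 2\Cov(X,Y) + \Var(Y)$ holds exactly for the sample.

```python
import numpy as np

# Numerical check of the variance rules; sample size, distributions,
# and the constant c are illustrative assumptions.
rng = np.random.default_rng(0)
n = 1_000_000
X = rng.normal(loc=5.0, scale=2.0, size=n)
Y = rng.normal(loc=-1.0, scale=3.0, size=n)  # drawn independently of X
c = 4.0

# Rule 1: Var(c) = 0 -- a constant has zero variance.
assert np.var(np.full(n, c)) == 0.0

# Rule 2: Var(X + c) = Var(X) -- shifting by a constant changes nothing.
assert np.isclose(np.var(X + c), np.var(X))

# Rule 3: Var(cX) = c^2 Var(X) -- scaling multiplies variance by c^2.
assert np.isclose(np.var(c * X), c**2 * np.var(X))

# Rule 4: Var(X + Y) = Var(X) + 2 Cov(X, Y) + Var(Y); this is an exact
# identity for the biased sample moments (bias=True).
cov_xy = np.cov(X, Y, bias=True)[0, 1]
assert np.isclose(np.var(X + Y), np.var(X) + 2 * cov_xy + np.var(Y))

# With X and Y independent, Cov(X, Y) is near 0, so the simple sum
# rule Var(X + Y) = Var(X) + Var(Y) holds approximately in the sample.
assert abs(cov_xy) < 0.05
print("all variance rules verified")
```

Note that in a finite sample the covariance of independently drawn X and Y is close to, but not exactly, zero, which is why the last check uses a tolerance while the full identity with the covariance term holds exactly.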
statistical_review.1512951653.txt.gz · Last modified: 2017/12/11 08:50 by hkimscil