statistical_review
====== Rules for Variance ======
The variance of a constant is zero.
$ Var(c) = E[(c-E(c))^{2}] = E[0^{2}] = 0 $

Adding a constant value, c, to a variable does not change the variance.
$ \sigma^{2}_{X+c} = Var(X+c) = E[((X+c)-E(X+c))^{2}] = E[(X-E(X))^{2}] = Var(X) $

Multiplying a variable by a constant value, c, multiplies the variance by the square of the constant.
$ \sigma^{2}_{cX} = Var(cX) = c^{2} Var(X) $

The variance of the sum of two random variables is
$ Var(X+Y) = Var(X) + 2 Cov(X,Y) + Var(Y) $
so it equals the sum of their variances only when the random variables are independent, since then $ Cov(X,Y) = 0 $.
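The rules above can be checked numerically on a random sample. This is an illustrative sketch, not part of the original page: the sample size, the seed, and the constant c = 3 are arbitrary choices, and the sample identities hold up to floating-point rounding.

```python
import random

random.seed(0)
x = [random.gauss(0, 1) for _ in range(10_000)]
y = [random.gauss(0, 1) for _ in range(10_000)]  # drawn independently of x
c = 3.0

def var(v):
    """Sample variance with the n-1 divisor."""
    m = sum(v) / len(v)
    return sum((vi - m) ** 2 for vi in v) / (len(v) - 1)

def cov(a, b):
    """Sample covariance with the n-1 divisor."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / (len(a) - 1)

# Var(X+c) = Var(X): shifting by a constant leaves deviations unchanged.
print(var([xi + c for xi in x]) - var(x))            # ~0 up to rounding

# Var(cX) = c^2 Var(X): scaling multiplies the variance by c squared.
print(var([c * xi for xi in x]) - c**2 * var(x))     # ~0 up to rounding

# Var(X+Y) = Var(X) + 2 Cov(X,Y) + Var(Y): an exact algebraic identity
# for sample moments; Cov(X,Y) itself is only near 0 for independent draws.
lhs = var([xi + yi for xi, yi in zip(x, y)])
rhs = var(x) + 2 * cov(x, y) + var(y)
print(abs(lhs - rhs))                                # ~0 up to rounding
```

Note that the sum rule is an algebraic identity, so it holds for the sample quantities whether or not X and Y are independent; independence only makes the cross term $ Cov(X,Y) $ itself vanish.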
====== Rules for the Covariance ======
see [[:covariance properties]]

The covariance of two constants, c and k, is zero.
$ Cov(c,k) = E[(c-E(c))(k-E(k))] = E[(0)(0)] = 0 $
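This rule can also be seen numerically: a constant sequence never deviates from its mean, so every term in the covariance sum is zero. The constants 3 and 5 and the sequence length below are arbitrary choices for this sketch.

```python
n = 100
c_seq = [3.0] * n   # the constant c, repeated
k_seq = [5.0] * n   # the constant k, repeated

def cov(a, b):
    """Sample covariance with the n-1 divisor."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / (len(a) - 1)

# Each deviation (c - E(c)) and (k - E(k)) is exactly 0, so the sum is 0.
print(cov(c_seq, k_seq))  # 0.0
```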
statistical_review.1512952725.txt.gz · Last modified: 2017/12/11 09:08 by hkimscil