see [[:expected value and variance properties]]
====== Rules for the Covariance ======
see [[:covariance properties]]
  - The covariance of two constants, c and k, is zero. \\ $Cov(c,k) = E[(c-E(c))(k-E(k))] = E[(0)(0)] = 0$
  - The covariance of two independent random variables is zero. \\ $Cov(X, Y) = 0$ when X and Y are independent.
  - The covariance is commutative, as is obvious from the definition. \\ $Cov(X, Y) = Cov(Y, X)$
  - The covariance of a random variable with a constant is zero. \\ $Cov(X, c) = 0$
  - Multiplying a random variable by a constant multiplies the covariance by that constant. \\ $Cov(cX, kY) = ck \cdot Cov(X, Y)$
  - Adding a constant to either or both random variables does not change their covariances. \\ $Cov(X+c, Y+k) = Cov(X, Y)$
  - The additive law of covariance holds that the covariance of a random variable with a sum of random variables is the sum of the covariances with each of the random variables. \\ $Cov(X+Y, Z) = Cov(X, Z) + Cov(Y, Z)$
  - The covariance of a variable with itself is the variance of the random variable. \\ $Cov(X, X) = Var(X)$
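The rules above can be checked numerically. The following is a minimal sketch, assuming NumPy is available; the simulated variables ''x'', ''y'', ''z'' and the constants ''c'', ''k'' are illustrative choices, not part of the original page.

<code python>
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Simulated random variables: y is correlated with x, z is independent of both
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)
z = rng.normal(size=n)

def cov(a, b):
    # Sample covariance: mean of products of deviations from the means
    return np.mean((a - a.mean()) * (b - b.mean()))

c, k = 3.0, -2.0

# Cov(cX, kY) = c*k*Cov(X, Y)
print(cov(c * x, k * y), c * k * cov(x, y))

# Cov(X+c, Y+k) = Cov(X, Y): adding constants changes nothing
print(cov(x + c, y + k), cov(x, y))

# Additive law: Cov(X+Y, Z) = Cov(X, Z) + Cov(Y, Z)
print(cov(x + y, z), cov(x, z) + cov(y, z))

# Cov(X, X) = Var(X)
print(cov(x, x), np.var(x))
</code>

With a large sample each pair of printed values agrees up to simulation error, matching the corresponding rule.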