Uncorrelated Random Variables
Uncorrelated Random Variables are two random variables whose covariance equals zero, [math]\displaystyle{ \operatorname{cov}(X,Y) = 0 }[/math].
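The following is a minimal sketch (assuming NumPy is available; the variable names, distributions, and sample size are illustrative) of how the sample covariance and Pearson correlation of two random variables can be estimated and checked against this definition:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent (hence uncorrelated) random samples.
x = rng.normal(size=100_000)
y = rng.normal(size=100_000)

# Sample covariance: E(XY) - E(X)E(Y).
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)

# Pearson correlation: covariance divided by the product of the standard deviations.
corr_xy = cov_xy / (np.std(x) * np.std(y))

print(f"cov(X, Y)  ≈ {cov_xy:.4f}")   # close to 0
print(f"corr(X, Y) ≈ {corr_xy:.4f}")  # close to 0
```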
References
2016
- (Wikipedia, 2016) ⇒ https://www.wikiwand.com/en/Uncorrelated_random_variables Retrieved 2016-07-24
- In probability theory and statistics, two real-valued random variables, X,Y, are said to be uncorrelated if their covariance, E(XY) − E(X)E(Y), is zero. A set of two or more random variables is called uncorrelated if each pair of them is uncorrelated. If two variables are uncorrelated, there is no linear relationship between them.
- Uncorrelated random variables have a Pearson correlation coefficient of zero, except in the trivial case when either variable has zero variance (is a constant). In this case the correlation is undefined.
- In general, uncorrelatedness is not the same as orthogonality, except in the special case where either X or Y has an expected value of 0. In this case, the covariance is the expectation of the product, and X and Y are uncorrelated if and only if E(XY) = 0.
- If X and Y are independent, with finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent. For example, if X is a continuous random variable uniformly distributed on [−1, 1] and Y = X2, then X and Y are uncorrelated even though X determines Y and a particular value of Y can be produced by only one or two values of X.
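As an illustration of the last point, here is a minimal sketch (assuming NumPy; the sample size and thresholds are illustrative) that estimates the covariance of X uniform on [−1, 1] and Y = X² and shows that the two variables are nevertheless dependent:

```python
import numpy as np

rng = np.random.default_rng(0)

# X uniform on [-1, 1]; Y = X**2 is fully determined by X.
x = rng.uniform(-1.0, 1.0, size=1_000_000)
y = x ** 2

# Covariance E(XY) - E(X)E(Y) = E(X**3) - E(X)E(X**2) = 0 by symmetry.
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
print(f"cov(X, Y) ≈ {cov_xy:.4f}")  # close to 0: uncorrelated

# But X and Y are not independent: knowing X pins down Y exactly.
# For example, P(Y > 0.25 | |X| < 0.5) = 0, while P(Y > 0.25) = 0.5 overall.
mask = np.abs(x) < 0.5
print(np.mean(y[mask] > 0.25))  # 0.0
print(np.mean(y > 0.25))        # about 0.5
```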