Variance Metric

A [[Variance Metric]] is a [[metric]] that represents the dispersion of a [[continuous dataset]].
* <B><U>Context:</U></B>
** It can (typically) measure the [[expected value]] of [[square function|square]]s of the [[deviations from the mean]] (see the sketch after this list).
** It can be represented as Var(''X''), for [[random variable]] ''X''.
** It can be based on a [[Covariance Function]].
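
As a rough illustration of the context bullets above, the following Python sketch (an added example, not part of the GM-RKB entry; the data values and use of NumPy are assumptions) computes the variance as the mean of squared deviations from the mean and checks that it matches Var(X) = Cov(X, X):

<syntaxhighlight lang="python">
import numpy as np

# Arbitrary example data, chosen only for illustration.
x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

mu = x.mean()
var_from_deviations = np.mean((x - mu) ** 2)   # mean of squared deviations from the mean

# Same quantity via NumPy's built-in (ddof=0 gives the population form).
var_builtin = np.var(x, ddof=0)

# Relation to the covariance function: Var(X) = Cov(X, X).
var_from_cov = np.cov(x, x, ddof=0)[0, 0]

print(var_from_deviations, var_builtin, var_from_cov)  # all equal 4.0 for this data
</syntaxhighlight>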

References

2012

  • http://en.wikipedia.org/wiki/Variance
    • QUOTE: In probability theory and statistics, the variance is a measure of how far a set of numbers is spread out. It is one of several descriptors of a probability distribution, describing how far the numbers lie from the mean (expected value). In particular, the variance is one of the moments of a distribution. In that context, it forms part of a systematic approach to distinguishing between probability distributions. While other such approaches have been developed, those based on moments are advantageous in terms of mathematical and computational simplicity.

      The variance is a parameter describing in part either the actual probability distribution of an observed population of numbers, or the theoretical probability distribution of a sample (a not-fully-observed population) of numbers. In the latter case a sample of data from such a distribution can be used to construct an estimate of its variance: in the simplest cases this estimate can be the sample variance, defined below.
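
The quoted passage distinguishes the distribution's variance from the sample variance used to estimate it. A minimal sketch of the two divisors (an added illustration, not part of the quoted article; the sample values are arbitrary and NumPy's ddof parameter is used for the two forms):

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical sample from a not-fully-observed population.
sample = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

pop_var = np.var(sample, ddof=0)     # divide by n: variance of the observed numbers themselves
sample_var = np.var(sample, ddof=1)  # divide by n - 1: unbiased estimate of the distribution's variance

print(pop_var, sample_var)  # 4.0 and about 4.571 for this data
</syntaxhighlight>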


  • http://en.wikipedia.org/wiki/Variance#Definition
    • QUOTE: If a random variable X has the expected value (mean) μ = E[X], then the variance of X is given by: [math]\displaystyle{ \operatorname{Var}(X) = \operatorname{E}\left[(X - \mu)^2 \right]. \, }[/math]

      That is, the variance is the expected value of the squared difference between the variable's realization and the variable's mean. This definition encompasses random variables that are discrete, continuous, or neither (or mixed). It can be expanded as follows:

      [math]\displaystyle{ \begin{align} \operatorname{Var}(X) &= \operatorname{E}\left[(X - \mu)^2 \right] \\ &= \operatorname{E}\left[X^2 - 2\mu X + \mu^2 \right] \\ &= \operatorname{E}\left[X^2 \right] - 2\mu\,\operatorname{E}[X] + \mu^2 \\ &= \operatorname{E}\left[X^2 \right] - 2\mu^2 + \mu^2 \\ &= \operatorname{E}\left[X^2 \right] - \mu^2 \\ &= \operatorname{E}\left[X^2 \right] - (\operatorname{E}[X])^2. \end{align} }[/math]

      A mnemonic for the above expression is "mean of square minus square of mean".

      The variance of random variable X is typically designated as Var(X), [math]\displaystyle{ \scriptstyle\sigma_X^2 }[/math], or simply σ² (pronounced "sigma squared").
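
The algebraic expansion quoted above can be checked numerically. The following sketch (an added illustration, not part of the quoted article; it assumes a normally distributed sample generated with NumPy) compares the direct form with the "mean of square minus square of mean" form:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical sample: normal with mean 3 and standard deviation 2, so Var(X) should be near 4.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=1_000_000)

direct = np.mean((x - x.mean()) ** 2)        # E[(X - mu)^2]
expanded = np.mean(x ** 2) - x.mean() ** 2   # E[X^2] - (E[X])^2

print(direct, expanded)  # both close to 4.0, agreeing up to floating-point error
</syntaxhighlight>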


1987

  • (Davidian & Carroll, 1987) ⇒ M. Davidian and R. J. Carroll. (1987). "Variance Function Estimation." In: Journal of the American Statistical Association, 82(400). http://www.jstor.org/stable/2289384