Variance Function
A Variance Function is a Function that describes the dispersion of a Random Variable/Sample Space around its Mean Value.
- AKA: Variance, Var.
- Context:
- Input: a Random Variable.
- Output: a Real Number.
- It can be:
- an Arithmetic Variance Function, if the Expected Value and the Mean Value are known.
- a Sample Variance Function, if only a Sample is available (both cases are illustrated in the sketch after this list).
- See: Standard Deviation Function.
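To make the distinction between the two cases above concrete, here is a minimal Python sketch. The data values are hypothetical and chosen only for illustration; the standard-library statistics module is used solely as a cross-check.

```python
import statistics

# Hypothetical data values, used only for illustration.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

# Arithmetic (population) variance: the mean of the squared deviations
# from the known mean, i.e. Var(X) = E[(X - mu)^2].
mu = sum(data) / len(data)
population_variance = sum((x - mu) ** 2 for x in data) / len(data)

# Sample variance: only a sample is available, so the sum of squared
# deviations is divided by (n - 1) (Bessel's correction) to give an
# unbiased estimate of the population variance.
sample_variance = sum((x - mu) ** 2 for x in data) / (len(data) - 1)

# Cross-check against the standard library.
assert abs(population_variance - statistics.pvariance(data)) < 1e-12
assert abs(sample_variance - statistics.variance(data)) < 1e-12

print(population_variance)  # 4.0
print(sample_variance)      # ~4.5714 (= 32/7)
```

Note the only difference is the divisor: n for the arithmetic (population) case, n - 1 for the sample case.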
References
- (Wikipedia, 2009) ⇒ http://en.wikipedia.org/wiki/Variance
- In probability theory and statistics, the variance of a random variable, probability distribution, or sample is one measure of statistical dispersion, averaging the squared distance of its possible values from the expected value (mean). Whereas the mean is a way to describe the location of a distribution, the variance is a way to capture its scale or degree of being spread out. The unit of variance is the square of the unit of the original variable. The positive square root of the variance, called the standard deviation, has the same units as the original variable and can be easier to interpret for this reason.
- The variance of a real-valued random variable is its second central moment, and it also happens to be its second cumulant. Just as some distributions do not have a mean, some do not have a variance. The mean exists whenever the variance exists, but not vice versa.
- If random variable X has expected value (mean) μ = E(X), then the variance Var(X) of X is given by:
- $\operatorname{Var}(X) = \operatorname{E}\left[(X - \mu)^2\right]$.
- This definition encompasses random variables that are discrete, continuous, or neither. Of all the points about which squared deviations could have been calculated, the mean produces the minimum value for the averaged sum of squared deviations.
- The variance of random variable X is typically designated as Var(X), $\sigma_X^2$, or simply σ². If a distribution does not have an expected value, as is the case for the Cauchy distribution, it does not have a variance either. Many other distributions for which the expected value does exist do not have a finite variance because the relevant integral diverges. An example is a Pareto distribution whose Pareto index k satisfies 1 < k ≤ 2.
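As a complement to the quoted definition, the following standard derivation (not part of the cited Wikipedia text) expands $\operatorname{E}[(X - c)^2]$ around the mean; it shows both why the mean minimizes the averaged squared deviation and the common second-moment form of the variance.

```latex
\begin{align*}
\operatorname{E}\!\left[(X - c)^2\right]
  &= \operatorname{E}\!\left[\bigl((X - \mu) + (\mu - c)\bigr)^{2}\right] \\
  &= \operatorname{E}\!\left[(X - \mu)^2\right]
     + 2(\mu - c)\,\operatorname{E}[X - \mu]
     + (\mu - c)^2 \\
  &= \operatorname{Var}(X) + (\mu - c)^2,
     \qquad \text{since } \operatorname{E}[X - \mu] = 0 .
\end{align*}
```

The right-hand side is minimized exactly at $c = \mu$, which recovers the statement above that the mean yields the minimum averaged sum of squared deviations; setting $c = 0$ gives the familiar identity $\operatorname{Var}(X) = \operatorname{E}[X^2] - \mu^2$.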