Variance Metric

==References==


=== 2012 ===
* http://en.wikipedia.org/wiki/Variance
** QUOTE: In [[probability theory]] and [[statistics]], the '''variance''' is a measure of how far a set of numbers is spread out. It is one of several descriptors of a [[probability distribution]], describing how far the numbers lie from the [[mean]] (expected value). In particular, the variance is one of the [[Moment (mathematics)|moments]] of a distribution. In that context, it forms part of a systematic approach to distinguishing between probability distributions. While other such approaches have been developed, those based on [[Moment (mathematics)|moments]] are advantageous in terms of mathematical and computational simplicity. <P> The variance is a [[population parameter|parameter]] describing in part either the actual probability distribution of an observed population of numbers, or the theoretical probability distribution of a sample (a not-fully-observed population) of numbers. In the latter case a sample of data from such a distribution can be used to construct an estimate of its variance: in the simplest cases this estimate can be the '''sample variance''', defined below.
** QUOTE: If a [[random variable]] ''X'' has the [[expected value]] (mean) {{nowrap|1 = ''μ'' = E[''X'']}}, then the variance of ''X'' is given by: <math>\operatorname{Var}(X) = \operatorname{E}\left[(X - \mu)^2 \right]. \,</math> <P> That is, the variance is the expected value of the squared difference between the variable's realization and the variable's mean. This definition encompasses random variables that are [[discrete random variable|discrete]], [[continuous random variable|continuous]], or neither (or mixed). It can be expanded as follows: <math>\begin{align} \operatorname{Var}(X) &= \operatorname{E}\left[(X - \mu)^2 \right] \\ &= \operatorname{E}\left[X^2 - 2\mu X + \mu^2 \right] \\ &= \operatorname{E}\left[X^2 \right] - 2\mu\,\operatorname{E}[X] + \mu^2 \\ &= \operatorname{E}\left[X^2 \right] - 2\mu^2 + \mu^2 \\ &= \operatorname{E}\left[X^2 \right] - \mu^2 \\ &= \operatorname{E}\left[X^2 \right] - (\operatorname{E}[X])^2. \end{align}</math> <P> A mnemonic for the above expression is "mean of square minus square of mean". <P> The variance of random variable ''X'' is typically designated as Var(''X''), <math>\scriptstyle\sigma_X^2</math>, or simply σ<sup>2</sup> (pronounced "[[sigma]] squared").
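The "mean of square minus square of mean" identity quoted above can be checked numerically. The following sketch (illustrative only, not part of the quoted source) compares the defining form E[(X − μ)²] with the shortcut E[X²] − (E[X])² on a random sample:

```python
import random

# Illustrative check of Var(X) = E[(X - mu)^2] = E[X^2] - (E[X])^2
# on a finite sample (population-style division by n).
random.seed(0)
xs = [random.gauss(0, 1) for _ in range(100_000)]

n = len(xs)
mean = sum(xs) / n
var_definition = sum((x - mean) ** 2 for x in xs) / n  # E[(X - mu)^2]
var_shortcut = sum(x * x for x in xs) / n - mean ** 2  # E[X^2] - (E[X])^2

assert abs(var_definition - var_shortcut) < 1e-9
```

The two expressions agree up to floating-point rounding, since the quoted derivation is an exact algebraic identity.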


=== 2005 ===
* ([[Lord et al., 2005]]) &rArr; [[Dominique Lord]], [[Simon P. Washington]], and [[John N. Ivan]]. (2005). "Poisson, Poisson-gamma and zero-inflated regression models of motor vehicle crashes: balancing statistical fit and theory." In: Accident Analysis & Prevention, 37(1). [http://dx.doi.org/10.1016/j.aap.2004.02.004 doi:10.1016/j.aap.2004.02.004]  
** QUOTE: The [[arithmetic mean|mean]] and [[arithmetic variance|variance]] of the [[binomial distribution]] are <math>E(Z) = Np</math> and <math>VAR(Z) = Np(1-p)</math> respectively.
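The quoted binomial formulas can be illustrated with a small simulation (a sketch of the standard identities E(Z) = Np and VAR(Z) = Np(1 − p), not code from the cited paper):

```python
import random

# Simulate Z ~ Binomial(N, p) by summing N Bernoulli(p) trials, then
# compare the empirical mean/variance against N*p and N*p*(1-p).
random.seed(1)
N, p = 20, 0.3
trials = 50_000
samples = [sum(random.random() < p for _ in range(N)) for _ in range(trials)]

emp_mean = sum(samples) / trials
emp_var = sum((z - emp_mean) ** 2 for z in samples) / trials

assert abs(emp_mean - N * p) < 0.1           # theoretical mean: 6.0
assert abs(emp_var - N * p * (1 - p)) < 0.2  # theoretical variance: 4.2
```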
