Variance Metric: Difference between revisions

From GM-RKB
* <B>Example(s):</B>
** [[Income Variance Measure]].
** …
* <B>Counter-Example(s):</B>
** a [[Sample Variance]].
** an [[Inter-Quartile Range]].
* <B>See:</B> [[Statistical Deviation]], [[Gini Coefficient]].
----
----
=== 2012 ===
* http://en.wikipedia.org/wiki/Variance
** QUOTE: In [[probability theory]] and [[statistics]], the '''variance''' is a measure of how far a set of numbers is spread out. It is one of several descriptors of a [[probability distribution]], describing how far the numbers lie from the [[mean]] (expected value). In particular, the variance is one of the [[Moment (mathematics)|moment]]s of a distribution. In that context, it forms part of a systematic approach to distinguishing between probability distributions. While other such approaches have been developed, those based on [[Moment (mathematics)|moment]]s are advantageous in terms of mathematical and computational simplicity. <P> The variance is a [[population parameter|parameter]] describing in part either the actual probability distribution of an observed population of numbers, or the theoretical probability distribution of a sample (a not-fully-observed population) of numbers. In the latter case a sample of data from such a distribution can be used to construct an estimate of its variance: in the simplest cases this estimate can be the '''sample variance''', defined below.
<BR>
* http://en.wikipedia.org/wiki/Variance#Definition
** QUOTE: If a [[random variable]] ''X'' has the [[expected value]] (mean) {{nowrap|1 = ''μ'' = E[''X'']}}, then the variance of ''X'' is given by: <math>\operatorname{Var}(X) = \operatorname{E}\left[(X - \mu)^2 \right].</math> <P> That is, the variance is the expected value of the squared difference between the variable's realization and the variable's mean. This definition encompasses random variables that are [[discrete random variable|discrete]], [[continuous random variable|continuous]], or neither (or mixed). It can be expanded as follows: <math>\begin{align} \operatorname{Var}(X) &= \operatorname{E}\left[(X - \mu)^2 \right] \\ &= \operatorname{E}\left[X^2 - 2\mu X + \mu^2 \right] \\ &= \operatorname{E}\left[X^2 \right] - 2\mu\,\operatorname{E}[X] + \mu^2 \\ &= \operatorname{E}\left[X^2 \right] - 2\mu^2 + \mu^2 \\ &= \operatorname{E}\left[X^2 \right] - \mu^2 \\ &= \operatorname{E}\left[X^2 \right] - (\operatorname{E}[X])^2. \end{align}</math> <P> A mnemonic for the above expression is "mean of square minus square of mean". <P> The variance of random variable ''X'' is typically designated as Var(''X''), <math>\scriptstyle\sigma_X^2</math>, or simply σ<sup>2</sup> (pronounced "[[sigma]] squared").
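The identity derived in the quote above, <math>\operatorname{Var}(X) = \operatorname{E}[X^2] - (\operatorname{E}[X])^2</math>, can be checked numerically. The following sketch (not part of the original entry; the values and probabilities are arbitrary illustrative choices) computes the variance of a small discrete distribution both ways:

```python
# Sketch: verify Var(X) = E[(X - mu)^2] = E[X^2] - (E[X])^2 for a
# small discrete random variable (values/probabilities are illustrative).
values = [1.0, 2.0, 3.0, 4.0]
probs = [0.1, 0.2, 0.3, 0.4]

mu = sum(p * x for x, p in zip(values, probs))                       # E[X]
var_def = sum(p * (x - mu) ** 2 for x, p in zip(values, probs))      # E[(X - mu)^2]
var_alt = sum(p * x ** 2 for x, p in zip(values, probs)) - mu ** 2   # E[X^2] - mu^2

assert abs(var_def - var_alt) < 1e-9  # both definitions agree
```

Here the two expressions agree up to floating-point rounding, matching the algebraic expansion in the quote.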


=== 2005 ===
* ([[Lord et al., 2005]]) ⇒ [[Dominique Lord]], [[Simon P. Washington]], and [[John N. Ivan]]. ([[2005]]). “Poisson, Poisson-gamma and zero-inflated regression models of motor vehicle crashes: balancing statistical fit and theory.” In: Accident Analysis & Prevention, 37(1). [http://dx.doi.org/10.1016/j.aap.2004.02.004 doi:10.1016/j.aap.2004.02.004]
** QUOTE: The [[arithmetic mean|mean]] and [[Variance Metric|variance]] of the [[binomial distribution]] are <math>E(Z) = Np</math> and <math>VAR(Z) = Np(1-p)</math> respectively.
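The quoted binomial moments can be verified by direct enumeration of the probability mass function. A minimal sketch, with <math>N</math> and <math>p</math> chosen arbitrarily for illustration:

```python
from math import comb

# Sketch: check E(Z) = N*p and Var(Z) = N*p*(1 - p) for a binomial
# random variable by enumerating its pmf (N and p are illustrative).
N, p = 10, 0.3
pmf = [comb(N, k) * p**k * (1 - p) ** (N - k) for k in range(N + 1)]

mean = sum(k * q for k, q in enumerate(pmf))                # E(Z)
var = sum(k**2 * q for k, q in enumerate(pmf)) - mean**2    # E(Z^2) - E(Z)^2

assert abs(mean - N * p) < 1e-9            # Np
assert abs(var - N * p * (1 - p)) < 1e-9   # Np(1-p)
```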


=== 1987 ===

Latest revision as of 04:48, 24 June 2024

A Variance Metric is a dispersion metric that represents the dispersion of a continuous dataset.
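As a quick illustration of variance acting as a dispersion metric, a sketch using Python's standard library on two hypothetical datasets with the same mean but different spread:

```python
import statistics

# Sketch: variance as a dispersion metric -- a dataset spread more
# widely around its mean yields a larger value (data is illustrative).
tight = [9.8, 10.0, 10.2, 10.0]   # mean 10.0, small spread
wide = [5.0, 10.0, 15.0, 10.0]    # mean 10.0, large spread

assert statistics.pvariance(tight) < statistics.pvariance(wide)
```

`statistics.pvariance` computes the population variance; `statistics.variance` would give the sample variance mentioned above as a counter-example.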


