Chebyshev's Inequality Relationship
A Chebyshev's Inequality Relationship is a Statistical Statement about a random variable X with statistical mean μ and statistical variance σ².
- AKA: Tchebysheff's Inequality, Markov's Inequality, Bienaymé's Inequality.
- …
- Counter-Example(s):
- See: Expected Value, Weak Law of Large Numbers, 68–95–99.7 Rule.
References
2018a
- (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Chebyshev's_inequality Retrieved:2018-11-15.
- In probability theory, Chebyshev's inequality (also called the Bienaymé-Chebyshev inequality) guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from the mean. Specifically, no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean (or equivalently, at least 1 − 1/k² of the distribution's values are within k standard deviations of the mean). In statistics, this rule about the range of standard deviations around the mean is often called Chebyshev's theorem. The inequality has great utility because it can be applied to any probability distribution in which the mean and variance are defined. For example, it can be used to prove the weak law of large numbers.
In practical usage, in contrast to the 68–95–99.7 rule, which applies to normal distributions, Chebyshev's inequality is weaker, stating that a minimum of just 75% of values must lie within two standard deviations of the mean and 89% within three standard deviations.[1] [2]
The term Chebyshev's inequality may also refer to Markov's inequality, especially in the context of analysis. The two are closely related, and some authors refer to Markov's inequality as "Chebyshev's First Inequality" and to the similar one described on this page as "Chebyshev's Second Inequality."
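The bound is easy to check numerically. Below is a minimal sketch (not part of the source) that draws samples from an exponential distribution, an arbitrary illustrative choice, and compares the observed fraction of values within k standard deviations of the mean against the guaranteed 1 − 1/k² floor:

```python
# Empirical check of Chebyshev's inequality: for any distribution with
# finite mean and variance, at least 1 - 1/k**2 of the mass lies within
# k standard deviations of the mean. The exponential distribution is an
# arbitrary illustrative choice (mean = sd = 1 for rate 1).
import random
import statistics

random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
mu = statistics.fmean(samples)
sigma = statistics.pstdev(samples)

for k in (2, 3, 4):
    within = sum(abs(x - mu) <= k * sigma for x in samples) / len(samples)
    print(f"k={k}: observed {within:.4f} >= bound {1 - 1 / k**2:.4f}")
```

The observed fractions sit well above the bound for this distribution; the bound cannot be improved in general, though, because it holds for every finite-variance distribution.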
2018b
- (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Markov's_inequality Retrieved:2018-11-15.
- In probability theory, Markov's inequality gives an upper bound for the probability that a non-negative function of a random variable is greater than or equal to some positive constant. It is named after the Russian mathematician Andrey Markov, although it appeared earlier in the work of Pafnuty Chebyshev (Markov's teacher), and many sources, especially in analysis, refer to it as Chebyshev's inequality (sometimes calling it the first Chebyshev inequality, while referring to Chebyshev's inequality as the second Chebyshev inequality) or Bienaymé's inequality.
Markov's inequality (and other similar inequalities) relate probabilities to expectations, and provide (frequently loose but still useful) bounds for the cumulative distribution function of a random variable.
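Concretely, Markov's inequality says that for a non-negative random variable X and any a > 0, P(X ≥ a) ≤ E[X]/a. A minimal sketch (not part of the source) illustrating the bound empirically, using a gamma-distributed sample as an arbitrary choice:

```python
# Empirical check of Markov's inequality: P(X >= a) <= E[X] / a for any
# non-negative X. The gamma(2, 1) sample (mean 2) is an arbitrary choice.
import random

random.seed(1)
samples = [random.gammavariate(2.0, 1.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)

for a in (2.0, 4.0, 8.0):
    tail = sum(x >= a for x in samples) / len(samples)
    print(f"a={a}: P(X >= a) ~ {tail:.4f} <= E[X]/a = {mean / a:.4f}")
```

As the quoted passage notes, the bound is frequently loose, but it requires only the mean of X, not its variance.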
2006
- (Dubnicka, 2006c) ⇒ Suzanne R. Dubnicka. (2006). “Random Variables - STAT 510: Handout 3.” Kansas State University, Introduction to Probability and Statistics I, STAT 510 - Fall 2006.
- A general result that underlies the importance of the variance (standard deviation) of a distribution is Chebyshev’s Inequality.
- CHEBYSHEV’S INEQUALITY: If a random variable X has mean μ and variance σ², then
P(|X − μ| ≤ cσ) = P(μ − cσ ≤ X ≤ μ + cσ) ≥ 1 − 1/c², for c ≥ 1.
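Applied directly, the handout's form gives a distribution-free lower bound on the probability that X falls within c standard deviations of its mean. A minimal sketch (not part of the source), with the function name chosen for illustration:

```python
# Lower bound from Chebyshev's inequality:
# P(mu - c*sigma <= X <= mu + c*sigma) >= 1 - 1/c**2, valid for c >= 1
# and for any random variable X with finite mean and variance.
def chebyshev_lower_bound(c: float) -> float:
    if c < 1:
        raise ValueError("the bound is only informative for c >= 1")
    return 1 - 1 / c**2

# For c = 2 this recovers the familiar "at least 75% within two standard
# deviations"; for c = 3, roughly 88.9% within three.
for c in (1.5, 2.0, 3.0):
    print(f"c={c}: P(|X - mu| <= {c}*sigma) >= {chebyshev_lower_bound(c):.4f}")
```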
- ↑ Kvanli, Alan H.; Pavur, Robert J.; Keeling, Kellie B. (2006). Concise Managerial Statistics. Cengage Learning. pp. 81–82. ISBN 9780324223880.
- ↑ Chernick, Michael R. (2011). The Essentials of Biostatistics for Physicians, Nurses, and Clinicians. John Wiley & Sons. pp. 49–50. ISBN 9780470641859.