Expected Value
An Expected Value is the probability-weighted average value of a random variable (which represents some random process).
- AKA: Random Variable Mean, Expectation, Statistical Expectation, Mathematical Expectation.
- Context:
- It can (typically) be produced by an Expected Value Function.
- It can range from being a Discrete Expected Value to being an Ordinal Expected Value to being a Continuous Expected Value.
- …
- Example(s):
- the Expected Value of a fair six-sided die roll, which is 3.5.
- …
- Counter-Example(s):
- a Variance, which measures dispersion around the expected value rather than central location.
- See: Arithmetic Mean, First Moment, Discrete Random Variable, Outlier Value.
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/expected_value Retrieved:2015-4-28.
- In probability theory, the expected value of a random variable is intuitively the long-run average value of repetitions of the experiment it represents. For example, the expected value of a dice roll is 3.5 because, roughly speaking, the average of an extremely large number of dice rolls is practically always nearly equal to 3.5. Less roughly, the law of large numbers guarantees that the arithmetic mean of the values almost surely converges to the expected value as the number of repetitions goes to infinity. The expected value is also known as the expectation, mathematical expectation, EV, mean, or first moment.
More practically, the expected value of a discrete random variable is the probability-weighted average of all possible values. In other words, each possible value the random variable can assume is multiplied by its probability of occurring, and the resulting products are summed to produce the expected value. The same works for continuous random variables, except the sum is replaced by an integral and the probabilities by probability densities. The formal definition subsumes both of these and also works for distributions which are neither discrete nor continuous: the expected value of a random variable is the integral of the random variable with respect to its probability measure.
The expected value does not exist for random variables having some distributions with large "tails", such as the Cauchy distribution. For random variables such as these, the long tails of the distribution prevent the sum/integral from converging.
The expected value is a key aspect of how one characterizes a probability distribution; it is one type of location parameter. By contrast, the variance is a measure of dispersion of the possible values of the random variable around the expected value. The variance itself is defined in terms of two expectations: it is the expected value of the squared deviation of the variable's value from the variable's expected value.
The expected value plays important roles in a variety of contexts. In regression analysis, one desires a formula in terms of observed data that will give a "good" estimate of the parameter giving the effect of some explanatory variable upon a dependent variable. The formula will give different estimates using different samples of data, so the estimate it gives is itself a random variable. A formula is typically considered good in this context if it is an unbiased estimator—that is, if the expected value of the estimate (the average value it would give over an arbitrarily large number of separate samples) can be shown to equal the true value of the desired parameter.
In decision theory, and in particular in choice under uncertainty, an agent is described as making an optimal choice in the context of incomplete information. For risk-neutral agents, the choice involves using the expected values of uncertain quantities, while for risk-averse agents it involves maximizing the expected value of some objective function such as a von Neumann-Morgenstern utility function.
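The dice-roll claim in the quoted passage is easy to check numerically. Below is a minimal Python sketch (an illustration, not part of the quoted article): it computes the probability-weighted average for a fair six-sided die, simulates growing numbers of rolls to show the running mean approaching 3.5, and then repeats the simulation with standard Cauchy draws, whose sample mean never settles because the expected value does not exist. The sample sizes and seed are arbitrary choices.
```python
import math
import random

random.seed(0)  # arbitrary seed for reproducibility

# Exact expected value of a fair six-sided die:
# each face value weighted by its probability 1/6.
faces = [1, 2, 3, 4, 5, 6]
exact_ev = sum(x * (1 / 6) for x in faces)
print(f"exact E(X) = {exact_ev}")  # 3.5

# Law of large numbers: the mean of simulated rolls
# approaches 3.5 as the number of rolls grows.
for n in (100, 10_000, 1_000_000):
    rolls = [random.choice(faces) for _ in range(n)]
    print(f"mean of {n:>9,} die rolls    = {sum(rolls) / n:.4f}")

# Counterexample: the standard Cauchy distribution has no expected
# value, so its sample mean never settles.  Draws are generated by
# the inverse-CDF transform tan(pi * (U - 1/2)) for uniform U.
for n in (100, 10_000, 1_000_000):
    draws = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]
    print(f"mean of {n:>9,} Cauchy draws = {sum(draws) / n:.4f}")
```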
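The same machinery illustrates the variance and unbiasedness points in the quoted passage (again a sketch with arbitrary sample counts, not from the article): the variance is computed as the expected squared deviation E[(X − E(X))²], and averaging the sample-mean estimates from many separate samples recovers the true mean, which is what it means for the sample mean to be an unbiased estimator.
```python
import random

random.seed(1)  # arbitrary seed

faces = [1, 2, 3, 4, 5, 6]
mu = sum(x / 6 for x in faces)  # E(X) = 3.5

# Variance as an expectation: Var(X) = E[(X - E(X))^2],
# a probability-weighted sum of squared deviations.
var = sum(((x - mu) ** 2) / 6 for x in faces)
print(f"Var(X) = {var:.4f}")  # 35/12 = 2.9167

# Unbiasedness: the sample mean is an unbiased estimator of mu,
# so the average of the estimates over many separate samples
# should be (close to) the true value 3.5.
estimates = []
for _ in range(20_000):  # 20,000 independent samples of size 10
    sample = [random.choice(faces) for _ in range(10)]
    estimates.append(sum(sample) / len(sample))
print(f"average of sample-mean estimates = {sum(estimates) / len(estimates):.4f}")
```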
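The decision-theoretic distinction can be made concrete with a hypothetical gamble (the payoffs and the logarithmic utility below are illustrative choices, not the article's): a risk-neutral agent ranks options by expected value alone, while a risk-averse agent maximizes expected utility under a concave utility function.
```python
import math

# Hypothetical choice: a sure payoff vs. a 50/50 gamble with the
# same expected value.  Probabilities and payoffs are illustrative.
sure_thing = [(1.0, 100.0)]               # list of (probability, payoff)
gamble = [(0.5, 10.0), (0.5, 190.0)]

def expected(outcomes, utility=lambda x: x):
    """Probability-weighted average of utility(payoff)."""
    return sum(p * utility(x) for p, x in outcomes)

# Risk-neutral agent: compares expected values, which tie at 100.
print(expected(sure_thing), expected(gamble))      # 100.0 100.0

# Risk-averse agent: compares expected log-utility; the concave
# utility penalizes the gamble's spread, so the sure payoff wins.
print(expected(sure_thing, math.log))              # ~4.605
print(expected(gamble, math.log))                  # ~3.775
```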
- http://en.wiktionary.org/wiki/Appendix:Glossary_of_probability_and_statistics#E
- expectation of a random variable is the sum of the probability of each possible outcome of the experiment multiplied by its payoff ("value"). Thus, it represents the average amount one "expects" to win per bet if bets with identical odds are repeated many times. For example, the expected value of a six-sided die roll is 3.5. The concept is similar to the mean. The expected value of random variable X is typically written E(X) or [math]\displaystyle{ \mu }[/math] (mu).
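As a concrete illustration of this payoff-weighted sum (a made-up bet, not part of the glossary entry): a bet that pays $10 when a fair die shows a six and loses $1 otherwise has expected value [math]\displaystyle{ E(X) = 10 \cdot \tfrac{1}{6} + (-1) \cdot \tfrac{5}{6} = \tfrac{5}{6} \approx 0.83 }[/math] per bet.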
2009
- (WordNet, 2009) ⇒ http://wordnetweb.princeton.edu/perl/webwn?s=expected%20value
- S: (n) arithmetic mean, first moment, expectation, expected value (the sum of the values of a random variable divided by the number of values)
- http://en.wiktionary.org/wiki/expected_value
- The weighted average of outcome values, using probability as the weighting function.
2006
- (Dubnicka, 2006c) ⇒ Suzanne R. Dubnicka. (2006). “Random Variables - STAT 510: Handout 3." Kansas State University, Introduction to Probability and Statistics I, STAT 510 - Fall 2006.
- The pmf of a discrete random variable and the pdf of a continuous random variable provide complete information about the probabilistic properties of a random variable. However, it is sometimes useful to employ summary measures. The most basic summary measure is the expectation or mean of a random variable [math]\displaystyle{ X }[/math], denoted [math]\displaystyle{ E(X) }[/math], which can be thought of as an “average” value of a random variable.
- TERMINOLOGY: Let [math]\displaystyle{ X }[/math] be a discrete random variable with pmf [math]\displaystyle{ p_X(x) }[/math] and support [math]\displaystyle{ S_X }[/math]. The expected value of [math]\displaystyle{ X }[/math] is given by [math]\displaystyle{ E(X) = \sum_{x \in S_X} x \, p_X(x) }[/math].
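A minimal Python sketch of this definition (the function name and pmf encoding are illustrative, not from the handout): the pmf is represented as a dict mapping each support point x to p_X(x), and the expected value is the sum of x · p_X(x) over the support.
```python
def expected_value(pmf):
    """Expected value of a discrete random variable, given its pmf
    as a dict mapping each support point x to p_X(x)."""
    return sum(x * p for x, p in pmf.items())

# Fair six-sided die: E(X) = 3.5.
print(expected_value({x: 1 / 6 for x in range(1, 7)}))

# Bernoulli(p) with p = 0.3: E(X) = p = 0.3.
print(expected_value({0: 0.7, 1: 0.3}))
```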