Law of Large Numbers
A Law of Large Numbers is a theorem that proves that, for a random sample of size [math]\displaystyle{ n }[/math], the sample mean approaches the population mean as [math]\displaystyle{ n\to\infty }[/math].
- AKA: LLN.
- Context:
- It can state that a Sample Average of a Sequence of Independent and Identically Distributed Random Variables [math]\displaystyle{ X_k }[/math] converges towards their common expectation [math]\displaystyle{ \mu }[/math], provided that the expectation of [math]\displaystyle{ |X_k| }[/math] is finite.
- It can range from being a Strong Law of Large Numbers to being a Weak Law of Large Numbers.
- Example(s):
- In a die-throwing experiment, the probability of each face [math]\displaystyle{ 1, 2, 3,\dots, 6 }[/math] is [math]\displaystyle{ \frac {1}{6} }[/math], so the expected value is [math]\displaystyle{ E(X)=\frac {1}{6}\cdot 1+\frac {1}{6}\cdot 2+\frac {1}{6}\cdot 3+\dots+\frac {1}{6}\cdot 6=3.5 }[/math].
Let [math]\displaystyle{ X_i }[/math] be the face value of the die on the [math]\displaystyle{ i }[/math]-th throw, and suppose the die is thrown [math]\displaystyle{ n }[/math] times. The sample mean is given by [math]\displaystyle{ \bar{X_n}=\frac{X_1+X_2+X_3+\dots+X_n}{n} }[/math].
When the sample size is large, that is as [math]\displaystyle{ n\to \infty }[/math], the sample mean [math]\displaystyle{ \bar{X_n}\to E(X) }[/math] (in this example, [math]\displaystyle{ \bar{X_n}\to 3.5 }[/math]). This is what is called the Law of Large Numbers.
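The die example can be checked with a short simulation (a minimal sketch using only the Python standard library; the function name `running_mean_of_die` is illustrative, not from the source):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def running_mean_of_die(n_throws):
    """Throw a fair six-sided die n_throws times and return the sample mean."""
    total = 0
    for _ in range(n_throws):
        total += random.randint(1, 6)  # face value of one throw
    return total / n_throws

# As n grows, the sample mean drifts toward E(X) = 3.5.
for n in (10, 1_000, 100_000):
    print(n, running_mean_of_die(n))
```

With 100,000 throws the sample mean typically lands within a few hundredths of 3.5, illustrating the convergence the law describes.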
- Counter-Example(s):
- See: Probability Theory, Expected Value, Roulette, Gambler's Fallacy, Arithmetic Mean, Multinomial Distribution.
References
2014
- (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Law_of_large_numbers Retrieved:2014-5-14.
- In probability theory, the law of large numbers (LLN) is a theorem that describes the result of performing the same experiment a large number of times. According to the law, the average of the results obtained from a large number of trials should be close to the expected value, and will tend to become closer as more trials are performed.
The LLN is important because it "guarantees" stable long-term results for the averages of random events. For example, while a casino may lose money in a single spin of the roulette wheel, its earnings will tend towards a predictable percentage over a large number of spins. Any winning streak by a player will eventually be overcome by the parameters of the game. It is important to remember that the LLN only applies (as the name indicates) when a large number of observations are considered. There is no principle that a small number of observations will coincide with the expected value or that a streak of one value will immediately be "balanced" by the others. See the Gambler's fallacy.
- (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Probability_theory#Law_of_large_numbers Retrieved:2014-5-14.
- Common intuition suggests that if a fair coin is tossed many times, then roughly half of the time it will turn up heads, and the other half it will turn up tails. Furthermore, the more often the coin is tossed, the more likely it should be that the ratio of the number of heads to the number of tails will approach unity. Modern probability provides a formal version of this intuitive idea, known as the law of large numbers. This law is remarkable because it is not assumed in the foundations of probability theory, but instead emerges from these foundations as a theorem. Since it links theoretically derived probabilities to their actual frequency of occurrence in the real world, the law of large numbers is considered as a pillar in the history of statistical theory and has had widespread influence.
The law of large numbers (LLN) states that the sample average :[math]\displaystyle{ \overline{X}_n=\frac1n{\sum_{k=1}^n X_k} }[/math] of a sequence of independent and identically distributed random variables [math]\displaystyle{ X_k }[/math] converges towards their common expectation [math]\displaystyle{ \mu }[/math], provided that the expectation of [math]\displaystyle{ |X_k| }[/math] is finite.
The difference between the weak and the strong law of large numbers lies in the mode of convergence of the random variables: :[math]\displaystyle{ \begin{array}{lll} \text{Weak law:} & \overline{X}_n \, \xrightarrow{P} \, \mu & \text{for } n \to \infty \\ \text{Strong law:} & \overline{X}_n \, \xrightarrow{\mathrm{a.\,s.}} \, \mu & \text{for } n \to \infty . \end{array} }[/math] It follows from the LLN that if an event of probability p is observed repeatedly during independent experiments, the ratio of the observed frequency of that event to the total number of repetitions converges towards p.
For example, if [math]\displaystyle{ Y_1,Y_2,...\, }[/math] are independent Bernoulli random variables taking values 1 with probability p and 0 with probability 1-p, then [math]\displaystyle{ \textrm{E}(Y_i)=p }[/math] for all i, so that [math]\displaystyle{ \bar Y_n }[/math] converges to p almost surely.
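The Bernoulli case can likewise be simulated (a minimal sketch using only the Python standard library; the function name `bernoulli_sample_mean` is illustrative, not from the source):

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

def bernoulli_sample_mean(p, n):
    """Average of n independent Bernoulli(p) draws.

    Each draw is 1 with probability p and 0 with probability 1 - p,
    so E(Y_i) = p and the sample mean converges to p almost surely.
    """
    return sum(1 if random.random() < p else 0 for _ in range(n)) / n

# The observed frequency of the event approaches p = 0.3 as n grows.
for n in (10, 1_000, 100_000):
    print(n, bernoulli_sample_mean(0.3, n))
```

This is exactly the frequency interpretation noted above: the fraction of trials in which the event occurs converges to its probability p.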
2001
- (Breiman, 2001) ⇒ Leo Breiman. (2001). “Random Forests." Machine learning 45, no. 1
1995
- (Vapnik, 1995) ⇒ Vladimir Vapnik. (1995). “The Nature of Statistical Learning Theory." Springer Science & Business Media.
1985
- (Judd, 1985) ⇒ Kenneth L. Judd. (1985). “The Law of Large Numbers with a Continuum of Iid Random Variables.” In: Journal of Economic Theory 35, no. 1
1965
- (Baum & Katz, 1965) ⇒ Leonard E. Baum, and Melvin Katz. (1965). “Convergence Rates in the Law of Large Numbers." Transactions of the American Mathematical Society
1947
- (Hsu & Robbins, 1947) ⇒ Pao-Lu Hsu, and Herbert Robbins. (1947). “Complete Convergence and the Law of Large Numbers." Proceedings of the National Academy of Sciences of the United States of America 33, no. 2