Discrete Probability Distribution Family
A Discrete Probability Distribution Family is a probability distribution family that defines a set of discrete probability functions.
- Context:
- It can range from being a Finite Support Discrete Probability Distribution Family to being an Infinite Support Discrete Probability Distribution Family.
- It can range from being a Binomial Probability Distribution Family to being a Multinomial Probability Distribution Family.
- Example(s):
- Counter-Example(s):
- See: Conditional Probability Distribution, Discrete Probability Model Fitting.
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Probability_distribution#Discrete_probability_distribution Retrieved:2015-6-2.
- A discrete probability distribution is a probability distribution characterized by a probability mass function. Thus, the distribution of a random variable X is discrete, and X is called a discrete random variable, if :[math]\displaystyle{ \sum_u \Pr(X=u) = 1 }[/math] as u runs through the set of all possible values of X. Hence, a random variable can assume only a finite or countably infinite number of values—the random variable is a discrete variable. For the number of potential values to be countably infinite, even though their probabilities sum to 1, the probabilities must decline to zero fast enough. For example, if [math]\displaystyle{ \Pr(X=n) = \tfrac{1}{2^n} }[/math] for n = 1, 2, ..., we have the sum of probabilities 1/2 + 1/4 + 1/8 + … = 1.
Well-known discrete probability distributions used in statistical modeling include the Poisson distribution, the Bernoulli distribution, the binomial distribution, the geometric distribution, and the negative binomial distribution. Additionally, the discrete uniform distribution is commonly used in computer programs that make equal-probability random selections between a number of choices.
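The summation condition above can be checked numerically. The following sketch (illustrative only, not from the source) computes partial sums of the PMF Pr(X = n) = 1/2^n and shows that they approach 1 as more terms are included:

```python
def partial_sum(N):
    """Sum of Pr(X = n) = 2**-n for n = 1..N, a partial sum of the
    geometric series 1/2 + 1/4 + 1/8 + ... whose total is 1."""
    return sum(2.0 ** -n for n in range(1, N + 1))

print(partial_sum(10))  # 1 - 2**-10 = 0.9990234375
print(partial_sum(50))  # numerically indistinguishable from 1
```

Because each term is an exact power of two, the partial sums are computed exactly in binary floating point, so partial_sum(N) equals 1 - 2**-N.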
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/List_of_probability_distributions#With_finite Retrieved:2015-6-2.
- With finite support.
- The Bernoulli distribution, which takes value 1 with probability p and value 0 with probability q = 1 − p.
- The Rademacher distribution, which takes value 1 with probability 1/2 and value −1 with probability 1/2.
- The binomial distribution, which describes the number of successes in a series of independent Yes/No experiments all with the same probability of success.
- The beta-binomial distribution, which describes the number of successes in a series of independent Yes/No experiments with heterogeneity in the success probability.
- The degenerate distribution at x0, where X is certain to take the value x0. This does not look random, but it satisfies the definition of random variable. This is useful because it puts deterministic variables and random variables in the same formalism.
- The discrete uniform distribution, where all elements of a finite set are equally likely. This is the theoretical distribution model for a balanced coin, an unbiased die, a casino roulette, or the first card of a well-shuffled deck.
- The hypergeometric distribution, which describes the number of successes in the first m of a series of n consecutive Yes/No experiments, if the total number of successes is known. This distribution arises when there is no replacement.
- The Poisson binomial distribution, which describes the number of successes in a series of independent Yes/No experiments with different success probabilities.
- Fisher's noncentral hypergeometric distribution.
- Wallenius' noncentral hypergeometric distribution.
- Benford's law, which describes the frequency of the first digit of many naturally occurring data.
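Several of the finite-support distributions above are built from the Bernoulli distribution: the binomial distribution, for instance, is the number of successes in n independent Bernoulli(p) trials. A minimal sampling sketch (an assumption-laden illustration using Python's standard random module, not from the source):

```python
import random

def bernoulli(p, rng):
    """One Yes/No experiment: returns 1 with probability p, else 0."""
    return 1 if rng.random() < p else 0

def binomial(n, p, rng):
    """Number of successes in n independent Bernoulli(p) trials."""
    return sum(bernoulli(p, rng) for _ in range(n))

rng = random.Random(0)  # seeded for reproducibility
draws = [binomial(20, 0.3, rng) for _ in range(10_000)]
print(sum(draws) / len(draws))  # sample mean, close to n * p = 6
```

The same construction with unequal success probabilities per trial yields the Poisson binomial distribution listed above.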
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/List_of_probability_distributions#With_infinite_support Retrieved:2015-6-2.
- With infinite support
- The beta negative binomial distribution.
- The Boltzmann distribution, a discrete distribution important in statistical physics which describes the probabilities of the various discrete energy levels of a system in thermal equilibrium. It has a continuous analogue. Special cases include the Gibbs distribution and the Maxwell–Boltzmann distribution.
- The Borel distribution.
- The extended negative binomial distribution.
- The extended hypergeometric distribution.
- The generalized log-series distribution.
- The geometric distribution, a discrete distribution which describes the number of attempts needed to get the first success in a series of independent Yes/No experiments, or alternatively only the number of losses before the first success (i.e. one less).
- The logarithmic (series) distribution.
- The negative binomial distribution or Pascal distribution, a generalization of the geometric distribution to the nth success.
- The parabolic fractal distribution.
- The Poisson distribution, which describes a very large number of individually unlikely events that happen in a certain time interval. Related to this distribution are a number of other distributions: the displaced Poisson, the hyper-Poisson, the general Poisson binomial and the Poisson type distributions.
- The Conway–Maxwell–Poisson distribution, a two-parameter extension of the Poisson distribution with an adjustable rate of decay.
- The Zero-truncated Poisson distribution, for processes in which zero counts are not observed.
- The Polya–Eggenberger distribution.
- The Skellam distribution, the distribution of the difference between two independent Poisson-distributed random variables.
- The skew elliptical distribution.
- The Yule–Simon distribution.
- The zeta distribution has uses in applied statistics and statistical mechanics, and perhaps may be of interest to number theorists. It is the Zipf distribution for an infinite number of elements.
- Zipf's law or the Zipf distribution. A discrete power-law distribution, the most famous example of which is the description of the frequency of words in the English language.
- The Zipf–Mandelbrot law is a discrete power law distribution which is a generalization of the Zipf distribution.
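The geometric distribution in the list above can be sampled directly from its definition as a sequence of independent Yes/No trials. The following sketch (an illustration using Python's standard random module; the helper name is ours, not from the source) counts attempts until the first success:

```python
import random

def geometric_attempts(p, rng):
    """Number of independent Bernoulli(p) trials needed to obtain the
    first success; support is {1, 2, 3, ...}."""
    attempts = 1
    while rng.random() >= p:  # each iteration is one failed trial
        attempts += 1
    return attempts

rng = random.Random(42)  # seeded for reproducibility
samples = [geometric_attempts(0.25, rng) for _ in range(10_000)]
print(sum(samples) / len(samples))  # sample mean, close to 1/p = 4
```

The alternative parameterization mentioned above (number of failures before the first success) is simply geometric_attempts(p, rng) - 1, with support starting at 0.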