Class Conditional Probability Function
A class conditional probability function is a conditional probability function that gives the probability (or density) of a random variable conditioned on membership in a particular class.
- AKA: Class Conditional Probability, Class-Conditional Density, Class Conditional Density, Class Conditional Density Function, Class Conditional Distribution, Class Conditional Distribution Function.
- Context:
- It can be written as [math]\displaystyle{ P(X \vert c_i) }[/math], where [math]\displaystyle{ X }[/math] is a random variable and [math]\displaystyle{ c_i }[/math] is a class, i.e. a random experiment outcome that is conditioned on.
- It can be interpreted as "what proportion of events within class [math]\displaystyle{ c_i }[/math] exhibit [math]\displaystyle{ X }[/math]?"
- It can be estimated by the relative frequency of [math]\displaystyle{ X }[/math] among the observations that belong to class [math]\displaystyle{ c_i }[/math] (see the sketch after this list).
- Example(s):
- [math]\displaystyle{ p(\text{Dice}=2 \vert \text{Dice} \lt 3) }[/math]
- Counter-Example(s):
- See: Discriminative Classification Model.
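The relative-frequency estimate referenced in the Context section can be illustrated with the page's own example, [math]\displaystyle{ p(\text{Dice}=2 \vert \text{Dice} \lt 3) }[/math]. The following is a minimal sketch (not from the source); the function and parameter names are illustrative.

```python
from fractions import Fraction

def class_conditional_probability(outcomes, in_class, exhibits):
    """Estimate P(X | c_i) by relative frequency: the proportion of
    outcomes in class c_i that exhibit X. (Illustrative helper.)"""
    in_class_outcomes = [o for o in outcomes if in_class(o)]
    if not in_class_outcomes:
        raise ValueError("no outcomes fall in the conditioning class")
    hits = sum(1 for o in in_class_outcomes if exhibits(o))
    return Fraction(hits, len(in_class_outcomes))

# The page's example: p(Dice = 2 | Dice < 3) for one fair six-sided die.
die_outcomes = range(1, 7)
p = class_conditional_probability(
    die_outcomes,
    in_class=lambda d: d < 3,   # conditioning class c_i: Dice < 3
    exhibits=lambda d: d == 2,  # event X: Dice = 2
)
print(p)  # 1/2
```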
References
2011
- (Wikipedia, 2011) ⇒ http://en.wikipedia.org/wiki/Conditional_probability#Example
- Consider the rolling of two fair six-sided dice.
- Let [math]\displaystyle{ A }[/math] be the value rolled on die 1
- Let [math]\displaystyle{ B }[/math] be the value rolled on die 2
- Let [math]\displaystyle{ A_n }[/math] be the event that [math]\displaystyle{ A=n }[/math]
- Let [math]\displaystyle{ \Sigma_m }[/math] be the event that [math]\displaystyle{ A+B \leq m }[/math]
- … Suppose however we roll the dice many times, but ignore cases in which [math]\displaystyle{ A+B\gt 5 }[/math]. In what proportion of the remaining rolls would [math]\displaystyle{ A=2 }[/math]? … [math]\displaystyle{ A=2 }[/math] in 3 of these. The answer is therefore [math]\displaystyle{ \textstyle \frac{3}{10} = 0.3 }[/math]. We say, the probability that [math]\displaystyle{ A=2 }[/math] given that [math]\displaystyle{ A+B \leq 5 }[/math], is 0.3. This is a conditional probability, because it has a condition that limits the sample space. In more compact notation, [math]\displaystyle{ P(A_2 | \Sigma_5) = 0.3 }[/math].
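The quoted computation can be checked by brute-force enumeration of the sample space. A minimal sketch in Python (illustrative, not part of the Wikipedia excerpt):

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely rolls of two fair six-sided dice.
rolls = list(product(range(1, 7), repeat=2))

# Condition on Sigma_5: the event A + B <= 5.
conditioned = [(a, b) for a, b in rolls if a + b <= 5]   # 10 rolls remain
favourable = [(a, b) for a, b in conditioned if a == 2]  # A = 2 in 3 of them

print(Fraction(len(favourable), len(conditioned)))  # 3/10
```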
2002
- (Hill, 2002) ⇒ Paul R. Hill. (2002). “Wavelet Based Texture Analysis and Segmentation for Image Retrieval and Fusion.” PhD Thesis, University of Bristol, Faculty of Engineering, March 2002.
- 2.7 Parametric classifiers: The probability density function within each class is assumed to be of a given form (e.g. Gaussian) completely defined by a small number of parameters. Training therefore reduces to a problem of parameter estimation.
- 2.7.1 Gaussian Classifiers: The Gaussian classifier assumes that the class conditional probabilities are Gaussian.
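A minimal sketch of the parameter-estimation view described above, assuming one feature per observation and a univariate Gaussian class conditional density for each class (the function and variable names are illustrative, not from Hill's thesis):

```python
import math
from collections import defaultdict

def fit_gaussian_class_conditionals(xs, labels):
    """Estimate a univariate Gaussian p(x | c_i) for each class:
    training reduces to estimating each class's mean and variance."""
    by_class = defaultdict(list)
    for x, c in zip(xs, labels):
        by_class[c].append(x)
    params = {}
    for c, values in by_class.items():
        mean = sum(values) / len(values)
        var = sum((v - mean) ** 2 for v in values) / len(values)
        params[c] = (mean, var)
    return params

def gaussian_density(x, mean, var):
    """Class conditional density p(x | c_i) under the Gaussian assumption."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Toy usage: two classes with different means.
xs = [1.0, 1.2, 0.8, 3.9, 4.1, 4.0]
labels = ["a", "a", "a", "b", "b", "b"]
params = fit_gaussian_class_conditionals(xs, labels)
for c, (mean, var) in params.items():
    print(c, round(gaussian_density(1.0, mean, var), 3))
```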