Gaussian Mixture Model Family
A Gaussian Mixture Model Family is a mixture model family of the form [math]\displaystyle{ p(\mathbf{x}) = \sum_{k=1}^K \pi_k \, N(\mathbf{x} \mid \mu_k, \Sigma_k) }[/math], where [math]\displaystyle{ \pi_k }[/math] is the probability that a sample is drawn from the k-th mixture component. Note that [math]\displaystyle{ \sum_{k=1}^K \pi_k = 1 }[/math] and [math]\displaystyle{ \pi_k > 0 }[/math].
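As a concrete illustration, the density above can be evaluated by summing the weighted component densities. The following is a minimal sketch (not from the source; the two components and their parameters are assumed for illustration), using NumPy and SciPy:

```python
# Minimal sketch: evaluate p(x) = sum_k pi_k N(x | mu_k, Sigma_k)
# for an assumed 2-component bivariate Gaussian mixture.
import numpy as np
from scipy.stats import multivariate_normal

pi = np.array([0.6, 0.4])  # mixture weights: each > 0 and summing to 1
mu = [np.zeros(2), np.array([3.0, 3.0])]                 # component means (assumed)
Sigma = [np.eye(2), np.array([[1.0, 0.5], [0.5, 1.0]])]  # component covariances (assumed)

def gmm_density(x):
    """Weighted sum of the K component Gaussian densities at point x."""
    return sum(w * multivariate_normal(m, S).pdf(x)
               for w, m, S in zip(pi, mu, Sigma))

print(gmm_density(np.array([1.0, 1.0])))  # p(x) at a single query point
```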
- Context:
- It can be instantiated in a Gaussian Mixture Function.
- It can range from being a Univariate Gaussian Mixture Model Family to being a Multivariate Gaussian Mixture Model Family.
- It can range from being a Finite Gaussian Mixture Model Family to being an Infinite Gaussian Mixture Model Family.
- It can be instantiated in a Gaussian Mixture Model.
- It can be an input to a Gaussian Mixture Model Fitting Task (see the fitting sketch after this list).
- …
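A Gaussian Mixture Model Fitting Task is commonly solved with the EM Algorithm. A minimal sketch, assuming synthetic two-component data and scikit-learn's EM-based GaussianMixture estimator (the data and hyperparameters are illustrative, not from the source):

```python
# Minimal sketch: fit a K=2 Gaussian mixture to assumed synthetic data
# with scikit-learn's EM-based GaussianMixture estimator.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),   # samples near component 1
               rng.normal(3.0, 1.0, size=(200, 2))])  # samples near component 2

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
print(gmm.weights_)  # estimated mixture weights pi_k
print(gmm.means_)    # estimated component means mu_k
```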
- Counter-Example(s):
- See: Generative Statistical Model, Gaussian Density Function, EM Algorithm.
References
2013
- (Yu et al., 2013) ⇒ Dong Yu, Michael L. Seltzer, Jinyu Li, Jui-Ting Huang, and Frank Seide. (2013). “Feature Learning in Deep Neural Networks-studies on Speech Recognition Tasks.” arXiv preprint arXiv:1301.3605.
- ABSTRACT: Recent studies have shown that deep neural networks (DNNs) perform significantly better than shallow networks and Gaussian mixture models (GMMs) on large vocabulary speech recognition tasks.
2009
- http://www.ics.uci.edu/~smyth/courses/ics274/notes4.pdf (course lecture notes)
- (Reynolds, 2009) ⇒ Douglas Reynolds. (2009). “Gaussian Mixture Models.” In: Encyclopedia of Biometrics.