Bayesian Non-Parametric Learning Algorithm
A Bayesian Non-Parametric Learning Algorithm is a Bayesian algorithm that is a non-parametric learning algorithm.
- Context:
- It can accept a Nonparametric Bayesian Model Family.
- It can range from being an Exact Non-Parametric Bayesian Inference Algorithm to being an Approximate Non-Parametric Bayesian Inference Algorithm.
- …
- Example(s):
- Counter-Example(s):
- See: Dirichlet Process.
References
2017
- https://blog.statsbot.co/bayesian-nonparametrics-9f2ce7074b97
- QUOTE: Bayesian Nonparametrics is a class of models with a potentially infinite number of parameters. High flexibility and expressive power of this approach enables better data modelling compared to parametric methods. Bayesian Nonparametrics is used in problems where a dimension of interest grows with data, for example, in problems where the number of features is not fixed but allowed to vary as we observe more data. Another example is clustering where the number of clusters is automatically inferred from data.
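The clustering behaviour described in the quote, where the number of clusters is inferred from the data rather than fixed in advance, can be sketched with the Chinese Restaurant Process, the partition distribution induced by a Dirichlet process. The function name and parameters below are illustrative, not from the source:

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample a partition of n items from the Chinese Restaurant Process.

    Customer i joins an existing table (cluster) with probability
    proportional to its current size, or opens a new table with
    probability proportional to the concentration parameter alpha.
    The number of clusters is not fixed in advance; it grows with n.
    """
    rng = random.Random(seed)
    tables = []       # tables[k] = number of customers at table k
    assignments = []  # cluster label for each item
    for i in range(n):
        # Total weight is i existing customers plus alpha for a new table.
        r = rng.random() * (i + alpha)
        cum = 0.0
        for k, w in enumerate(tables + [alpha]):
            cum += w
            if r < cum:
                break
        if k == len(tables):
            tables.append(1)  # open a new table
        else:
            tables[k] += 1
        assignments.append(k)
    return assignments

labels = crp_partition(100, alpha=1.0)
print(len(set(labels)))  # inferred number of clusters; grows roughly as alpha * log(n)
```

Larger `alpha` yields more clusters on average, which is how the model lets the "dimension of interest" grow with the data.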
2014
- Peter Orbanz. (2014). “Lecture Notes on Bayesian Nonparametrics.”
- QUOTE: A nonparametric Bayesian model is a Bayesian model whose parameter space has infinite dimension. To define a nonparametric Bayesian model, we have to define a probability distribution (the prior) on an infinite-dimensional space. A distribution on an infinite-dimensional space [math]\displaystyle{ \bf{T} }[/math] is a stochastic process with paths in [math]\displaystyle{ \bf{T} }[/math]. Such distributions are typically harder to define than distributions on [math]\displaystyle{ \Re^d }[/math], but we can draw on a large arsenal of tools from stochastic process theory and applied probability.