Hierarchical Bayesian Metamodel
A Hierarchical Bayesian Metamodel is a directed conditional statistical metamodel that describes a family of hierarchical Bayesian networks.
- AKA: Hierarchical Probabilistic Graphical Metamodel.
- …
- Example(s):
- See: Hierarchical Latent Dirichlet Allocation Metamodel.
References
2011
- http://en.wikipedia.org/wiki/Hierarchical_Bayes_model
- The hierarchical Bayes method is a topic in modern Bayesian analysis. It is a powerful tool for expressing rich statistical models that more fully reflect a given problem than a simpler model could. Given data $x$ and parameters $\vartheta$, a simple Bayesian analysis starts with a prior probability (prior) $p(\vartheta)$ and likelihood $p(x|\vartheta)$ to compute a posterior probability $p(\vartheta|x) \propto p(x|\vartheta)\,p(\vartheta)$. Often the prior on $\vartheta$ depends in turn on other parameters $\varphi$ that are not mentioned in the likelihood. So the prior $p(\vartheta)$ must be replaced by a likelihood $p(\vartheta|\varphi)$, and a prior $p(\varphi)$ on the newly introduced parameters $\varphi$ is required, resulting in a posterior probability $p(\vartheta,\varphi|x) \propto p(x|\vartheta)\,p(\vartheta|\varphi)\,p(\varphi)$. This is the simplest example of a hierarchical Bayes model. The process may be repeated; for example, the parameters $\varphi$ may depend in turn on additional parameters $\psi$, which will require their own prior. Eventually the process must terminate, with priors that do not depend on any other unmentioned parameters.
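The factorization above, $p(\vartheta,\varphi|x) \propto p(x|\vartheta)\,p(\vartheta|\varphi)\,p(\varphi)$, can be sketched as a log-density computation. The concrete model below (a Normal likelihood with a Normal prior and Normal hyperprior) is an illustrative assumption, not something specified in the source; the point is only that the unnormalized log posterior is the sum of the three log terms.

```python
import numpy as np

# Hypothetical two-level hierarchical model (illustrative, not from the source):
#   hyperprior:  phi   ~ Normal(0, 1)
#   prior:       theta ~ Normal(phi, 1)
#   likelihood:  x_i   ~ Normal(theta, 1), i.i.d.

def normal_logpdf(x, mu, sigma):
    """Log density of Normal(mu, sigma) evaluated at x."""
    return -0.5 * np.log(2.0 * np.pi * sigma ** 2) - (x - mu) ** 2 / (2.0 * sigma ** 2)

def log_unnormalized_posterior(theta, phi, x):
    """log p(theta, phi | x) up to an additive constant:
       log p(x|theta) + log p(theta|phi) + log p(phi)."""
    log_lik = normal_logpdf(x, theta, 1.0).sum()      # likelihood term
    log_prior = normal_logpdf(theta, phi, 1.0)        # prior on theta, given phi
    log_hyperprior = normal_logpdf(phi, 0.0, 1.0)     # hyperprior on phi
    return log_lik + log_prior + log_hyperprior

x = np.array([0.9, 1.1, 1.0])
print(log_unnormalized_posterior(1.0, 0.5, x))
```

Repeating the process (a further level $\psi$) would simply add one more log-density term to the sum, mirroring how each new layer of the hierarchy multiplies in one more factor.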
2003
- (Blei, Jordan & Ng, 2003) ⇒ David M. Blei, Michael I. Jordan, and Andrew Y. Ng. (2003). “Hierarchical Bayesian Models for Applications in Information Retrieval.” In: Bayesian Statistics, 7. ISBN:0198526156
- QUOTE: We present a simple hierarchical Bayesian approach to modeling collections of texts and other large-scale data collections. For text collections, we posit that a document is generated by choosing a random set of multinomial probabilities for a set of possible “topics”, and then repeatedly generating words by sampling from the topic mixture. This model is intractable for exact probabilistic inference, but approximate posterior probabilities and marginal likelihoods can be obtained via fast variational methods. We also present extensions to coupled models for joint text/image data and multiresolution models for topic hierarchies.