Approximate Bayesian Inference Algorithm
An approximate Bayesian inference algorithm is a Bayesian inference algorithm that is also an approximation algorithm, i.e. one that estimates the posterior distribution rather than computing it exactly.
- AKA: Approximate Posterior Inference.
- Context:
- It approximates an exact Bayesian Inference Task, which is in general an NP-Hard Task.
- It can range from being a Deterministic Approximate Bayesian Inference Algorithm to being a Sampling-based Approximate Bayesian Inference Algorithm (a minimal sketch contrasting the two follows this list).
- …
- Example(s):
- a Markov Chain Monte Carlo Algorithm.
- a Variational Bayesian Inference Algorithm.
- an Expectation Propagation Algorithm.
- a Likelihood Weighting Algorithm.
- Counter-Example(s):
- an Exact Bayesian Inference Algorithm, such as a Clustering (Junction Tree) Algorithm.
- See: Approximate Reasoning.
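The context above distinguishes deterministic from sampling-based approximation. Below is a minimal sketch (not from any of the quoted sources) contrasting one representative of each on a Beta-Binomial model, where the exact posterior Beta(a+k, b+n-k) is available as a check; the prior hyperparameters, data, and sample count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 1.0, 1.0            # Beta prior hyperparameters (illustrative)
n, k = 20, 14              # observed: k successes in n trials (illustrative)

def log_post(theta):       # unnormalized log posterior, Beta(a + k, b + n - k)
    return (a + k - 1) * np.log(theta) + (b + n - k - 1) * np.log(1 - theta)

# Deterministic approximation: Laplace, i.e. a Gaussian centered at the MAP.
theta_map = (a + k - 1) / (a + b + n - 2)
# Negative second derivative of log_post at the MAP gives the precision.
h = (a + k - 1) / theta_map**2 + (b + n - k - 1) / (1 - theta_map)**2
laplace_mean, laplace_sd = theta_map, 1.0 / np.sqrt(h)

# Sampling-based approximation: self-normalized importance sampling
# with a Uniform(0, 1) proposal.
theta = rng.uniform(1e-6, 1 - 1e-6, size=50_000)
logw = log_post(theta)
w = np.exp(logw - logw.max())
is_mean = np.sum(w * theta) / np.sum(w)

exact_mean = (a + k) / (a + b + n)   # exact Beta posterior mean
print(f"exact {exact_mean:.4f}  Laplace {laplace_mean:.4f}  sampling {is_mean:.4f}")
```

Both approximations land close to the exact posterior mean of 15/22 ≈ 0.682 here; on models with no closed-form posterior, only such approximations would be available.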
References
2009a
- http://en.wikipedia.org/wiki/Bayesian_network#Inferring_unobserved_variables
- … The most common approximate inference algorithms are stochastic MCMC simulation, mini-bucket elimination which generalizes loopy belief propagation, and variational methods.
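As a hedged illustration of the "stochastic MCMC simulation" mentioned in the quote, the sketch below runs a random-walk Metropolis sampler against a one-dimensional unnormalized log posterior; the target, proposal scale, and chain length are illustrative choices, not anything prescribed by the source.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):                     # unnormalized log posterior: N(2, 0.5^2)
    return -0.5 * ((x - 2.0) / 0.5) ** 2

x, samples = 0.0, []
for _ in range(20_000):
    prop = x + rng.normal(scale=0.5)   # symmetric random-walk proposal
    # Metropolis acceptance test on the unnormalized density
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    samples.append(x)

burned = np.array(samples[5_000:])     # discard burn-in
print(f"posterior mean ≈ {burned.mean():.3f}, sd ≈ {burned.std():.3f}")
```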
2009b
- Zoubin Ghahramani. (2009). http://learning.eng.cam.ac.uk/zoubin/approx.html
- Approximate Inference: For all but the simplest statistical models, exact learning and inference are computationally intractable. Approximate inference methods make it possible to learn realistic models from large data sets. Generally, approximate inference methods trade off computation time for accuracy. Some of the major classes of approximate inference methods include Markov chain Monte Carlo methods, variational methods and related algorithms such as Expectation Propagation.
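Ghahramani's point that approximate methods trade computation time for accuracy can be seen directly in Monte Carlo error, which shrinks roughly as 1/√S in the number of samples S. The sketch below makes the trade-off concrete, using an exact Beta(15, 7) sampler as an illustrative stand-in for an approximate posterior sampler.

```python
import numpy as np

rng = np.random.default_rng(4)
exact = 15 / 22                       # mean of a Beta(15, 7) posterior
for s in (10, 100, 1_000, 10_000, 100_000):
    draws = rng.beta(15.0, 7.0, size=s)   # stand-in posterior sampler
    print(f"samples={s:>6}  |error| = {abs(draws.mean() - exact):.4f}")
```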
2006a
- (Mott & Lester, 2006) ⇒ Bradford W. Mott, and James C. Lester. (2006). “U-Director: A Decision-theoretic Narrative Planning Architecture for Storytelling Environments.” In: Proceedings of the Fifth International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS 2006). doi:10.1145/1160633.1160808
- QUOTE: Because exact inference in Bayesian networks is known to be extraordinarily inefficient (in the worst case NP-hard), U-DIRECTOR exploits recent advances in approximate Bayesian inference via stochastic sampling. The accuracy of these methods depends on the number of samples used. Moreover, stochastic sampling methods typically have an “anytime” property which is particularly attractive for real-time applications. … a performance analysis was conducted to measure the network update time using an exact Bayesian inference algorithm (Clustering [17]) and two approximate Bayesian inference algorithms (EPIS-BN [36] and Likelihood weighting [31]).
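A minimal sketch of likelihood weighting, one of the stochastic-sampling algorithms named in the quote, on a made-up two-node network Rain → WetGrass with evidence WetGrass = true; the CPT numbers are illustrative. The "anytime" property the quote mentions shows up here as the running ratio num / den being a usable estimate after any number of iterations.

```python
import numpy as np

rng = np.random.default_rng(2)
p_rain = 0.2                                   # P(Rain = true), made up
p_wet_given_rain = {True: 0.9, False: 0.1}     # P(WetGrass = true | Rain), made up

num = den = 0.0
for _ in range(100_000):
    rain = bool(rng.uniform() < p_rain)        # sample the non-evidence variable
    w = p_wet_given_rain[rain]                 # weight by the evidence likelihood
    num += w * rain
    den += w
print(f"P(Rain | WetGrass) ≈ {num / den:.3f}   (exact: 0.18 / 0.26 ≈ 0.692)")
```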
2006b
- (Bishop, 2006) ⇒ Christopher M. Bishop. (2006). “Pattern Recognition and Machine Learning.” Springer, Information Science and Statistics. ISBN:0387310738
- QUOTE: A central task in the application of probabilistic models is the evaluation of the posterior distribution p(Z|X) of the latent variables Z given the observed (visible) data variables X, and the evaluation of expectations computed with respect to this distribution. The model might also contain some deterministic parameters, which we will leave implicit for the moment, or it may be a fully Bayesian model in which any unknown parameters are given prior distributions and are absorbed into the set of latent variables denoted by the vector Z. For instance, in the EM algorithm we need to evaluate the expectation of the complete-data log likelihood with respect to the posterior distribution of the latent variables. … In such situations, we need to resort to approximation schemes, and these fall broadly into two classes, according to whether they rely on stochastic or deterministic approximations.
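As a hedged illustration of the expectation Bishop refers to, the sketch below approximates the E-step quantity E[log p(X, Z) | X] for a two-component unit-variance Gaussian mixture by Monte Carlo; mixtures admit an exact E-step via the responsibilities, which serves as a check here. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.array([0.2, 1.9, 2.1, -0.3])                  # observed data (illustrative)
pi, mu = np.array([0.5, 0.5]), np.array([0.0, 2.0])  # weights, unit-variance means

# log p(x_i, z) for every data point and component
lj = np.log(pi) - 0.5 * np.log(2 * np.pi) - 0.5 * (x[:, None] - mu) ** 2

# Exact posterior p(z | x_i): the responsibilities (tractable for mixtures).
resp = np.exp(lj - lj.max(axis=1, keepdims=True))
resp /= resp.sum(axis=1, keepdims=True)
exact = float(np.sum(resp * lj))                     # exact E[log p(X, Z) | X]

# Monte Carlo version: sample Z from p(Z | X) and average log p(X, Z).
z = (rng.uniform(size=(20_000, len(x))) < resp[:, 1]).astype(int)
mc = lj[np.arange(len(x)), z].sum(axis=1).mean()
print(f"exact E-step value {exact:.4f}   Monte Carlo {mc:.4f}")
```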
2004
- Zoubin Ghahramani. (2004). “Bayesian methods in machine learning.” Seminar Talk, October 18, 2004, at the University of Birmingham.
- Bayesian methods can be applied to a wide range of probabilistic models commonly used in machine learning and pattern recognition. The challenge is to discover approximate inference methods that can deal with complex models and large-scale data sets in reasonable time. In the past few years Variational Bayesian (VB) approximations have emerged as an alternative to MCMC methods. …
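A sketch of the Variational Bayesian idea the talk refers to: mean-field coordinate-ascent updates for a Gaussian with unknown mean and precision under a Normal-Gamma prior, the textbook example treated in Bishop (2006, Section 10.1.3). The synthetic data and vague hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(loc=5.0, scale=2.0, size=200)   # synthetic data
N, xbar = len(x), x.mean()
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3      # vague Normal-Gamma prior

e_tau = 1.0                                    # initial guess for E[tau]
for _ in range(50):                            # coordinate-ascent (CAVI) updates
    # q(mu) = Normal(mu_n, 1 / lam_n)
    mu_n = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_n = (lam0 + N) * e_tau
    # q(tau) = Gamma(a_n, b_n), using expectations under the current q(mu)
    a_n = a0 + (N + 1) / 2
    b_n = b0 + 0.5 * (np.sum((x - mu_n) ** 2) + N / lam_n
                      + lam0 * ((mu_n - mu0) ** 2 + 1.0 / lam_n))
    e_tau = a_n / b_n

print(f"E[mu] ≈ {mu_n:.3f}, E[tau] ≈ {e_tau:.4f} (true mean 5.0, precision 0.25)")
```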
2003
- (Beal, 2003) ⇒ Matthew J. Beal. (2003). “Variational Algorithms for Approximate Bayesian Inference.” Ph.D. thesis, Gatsby Computational Neuroscience Unit, University College London.
- (Yuan & Druzdzel, 2003) ⇒ Changhe Yuan, and Marek J. Druzdzel. (2003). “An Importance Sampling Algorithm Based on Evidence Pre-propagation.” In: Proceedings of the Nineteenth Annual Conference on Uncertainty in Artificial Intelligence (UAI 2003).
2001
- (Minka, 2001) ⇒ Thomas P. Minka. (2001). “Expectation Propagation for Approximate Bayesian Inference.” In: Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence (UAI 2001).
1993
- (Tzeras & Hartmann, 1993) ⇒ Kostas Tzeras, and Stephan Hartmann. (1993). “Automatic Indexing Based on Bayesian Inference Networks.” In: Proceedings of the ACM SIGIR 1993 Conference. doi:10.1145/160688.160691
1990
- (Shachter & Peot, 1990) ⇒ Ross D. Shachter, and Mark A. Peot. (1990). “Simulation Approaches to General Probabilistic Inference on Belief Networks.” In: Proceedings of the Fifth Annual Conference on Uncertainty in Artificial Intelligence.
1989
- (Kass & Steffey, 1989) ⇒ Robert E. Kass, and Duane Steffey. (1989). “Approximate Bayesian Inference in Conditionally Independent Hierarchical Models (Parametric Empirical Bayes Models).” In: Journal of the American Statistical Association, 84(407).