Belief Propagation (BP) Algorithm
A Belief Propagation (BP) Algorithm is a message passing algorithm for performing inference on graphical models.
- AKA: Sum-Product Message Passing.
- …
- Example(s):
- Counter-Example(s):
- See: Approximate Inference Algorithm, Markov Random Field, Low-Density Parity-Check Codes, Thermodynamic Free Energy.
References
2016
- (Wikipedia, 2016) ⇒ http://wikipedia.org/wiki/belief_propagation Retrieved:2016-2-18.
- Belief propagation, also known as sum-product message passing, is a message passing algorithm for performing inference on graphical models, such as Bayesian networks and Markov random fields. It calculates the marginal distribution for each unobserved node, conditional on any observed nodes. Belief propagation is commonly used in artificial intelligence and information theory and has demonstrated empirical success in numerous applications including low-density parity-check codes, turbo codes, free energy approximation, and satisfiability.
The algorithm was first proposed by Judea Pearl in 1982,[1] who formulated it as an exact inference algorithm on trees; it was later extended to polytrees.[2] It has since been shown to be a useful approximate algorithm on general graphs.[3]
If X={Xi} is a set of discrete random variables with a joint mass function p, the marginal distribution of a single Xi is simply the summation of p over all other variables: : [math]\displaystyle{ p_{X_i}(x_i) = \sum_{\mathbf{x}': x'_i = x_i} p(\mathbf{x}'). }[/math] However, this quickly becomes computationally prohibitive: if there are 100 binary variables, then one needs to sum over 2^99 ≈ 6.338 × 10^29 possible values. By exploiting the polytree structure, belief propagation allows the marginals to be computed much more efficiently.
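The efficiency gain can be illustrated on a tiny chain-structured model. The sketch below (a minimal illustration with made-up pairwise potentials, not code from any cited source) computes the marginal of the middle variable both by brute-force summation over the joint and by sum-product message passing; on a tree the two agree.

```python
import numpy as np

# Illustrative 3-variable chain X1 - X2 - X3 over binary states.
# The potential tables are arbitrary example values.
phi12 = np.array([[1.0, 0.5], [0.5, 2.0]])  # pairwise potential on (x1, x2)
phi23 = np.array([[1.5, 1.0], [0.2, 1.0]])  # pairwise potential on (x2, x3)

# Brute force: build the unnormalized joint p(x1,x2,x3) and sum out x1, x3.
joint = np.einsum('ij,jk->ijk', phi12, phi23)
brute = joint.sum(axis=(0, 2))
brute /= brute.sum()

# Sum-product: each leaf sends a message obtained by summing its
# potential over its own states; the belief at X2 is the product
# of incoming messages, normalized.
m_1to2 = phi12.sum(axis=0)   # message from X1 into X2
m_3to2 = phi23.sum(axis=1)   # message from X3 into X2
belief = m_1to2 * m_3to2
belief /= belief.sum()

print(brute, belief)  # identical marginals, since the graph is a tree
```

For this chain the brute-force sum touches every joint configuration (exponential in the number of variables), while the message-passing pass costs only a handful of local sums and products per edge, which is the point of the algorithm.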
2011
- (Yedidia, 2011) ⇒ Jonathan S. Yedidia. (2011). “Message-passing Algorithms for Inference and Optimization.” Journal of Statistical Physics, 145(4).
- QUOTE: … BP and DC Message-passing algorithms are used to solve inference problems, optimization problems, and constraint satisfaction problems.
2006
- (Malioutov et al., 2006) ⇒ Dmitry M. Malioutov, Jason K. Johnson, and Alan S. Willsky. (2006). “Walk-sums and Belief Propagation in Gaussian Graphical Models." The Journal of Machine Learning Research 7 http://jmlr.csail.mit.edu/papers/v7/malioutov06a.html.
2004
- (Yedidia et al., 2004) ⇒ Jonathan S. Yedidia, William T. Freeman, and Yair Weiss. (2004). “Constructing Free Energy Approximations and Generalized Belief Propagation Algorithms." Technical Report TR2004-040, Mitsubishi Electric Research Laboratories.
2003
- (Beal, 2003) ⇒ Matthew J. Beal. (2003). “Variational Algorithms for Approximate Bayesian Inference." Ph.D. thesis, Gatsby Computational Neuroscience Unit, University College London.
- Chapter 1 presents background material on Bayesian inference, graphical models, and propagation algorithms.
2001
- (Kschischang et al., 2001) ⇒ Frank Kschischang, Brendan Frey, and Hans-Andrea Loeliger. (2001). “Factor Graphs and the Sum-product Algorithm.” In: IEEE Transactions on Information Theory, 47(2).
- (Weiss & Freeman, 2001) ⇒ Yair Weiss, and William T. Freeman. (2001). “Correctness of Belief Propagation in Gaussian Graphical Models of Arbitrary Topology." Neural computation 13, no. 10 doi:10.1162/089976601750541769.
1982
- (Pearl, 1982) ⇒ Judea Pearl. (1982). “Reverend Bayes on Inference Engines: A Distributed Hierarchical Approach.” In: AAAI-1982.