PyMC3 System
A PyMC3 System is a Python-based Bayesian statistical modeling system.
- Example(s):
- PyMC3, v3.9.3 [1] (~2020-08-11).
- …
- Counter-Example(s):
- Stan System.
- …
- See: Statistical Package, PyMC, Theano Software, MCMC System.
References
2020
- (Wikipedia, 2020) ⇒ https://en.wikipedia.org/wiki/PyMC3 Retrieved:2020-11-4.
- PyMC3 is a Python package for Bayesian statistical modeling and probabilistic machine learning that focuses on advanced Markov chain Monte Carlo and variational fitting algorithms. [1] It is a rewrite from scratch of the previous version of the PyMC software. Unlike PyMC2, which used Fortran extensions for performing computations, PyMC3 relies on Theano for automatic differentiation and also for computation optimization and dynamic C compilation. From version 3.8, PyMC3 relies on ArviZ to handle plotting, diagnostics, and statistical checks. PyMC3 and Stan are the two most popular probabilistic programming tools. PyMC3 is an open source project, developed by the community and fiscally sponsored by NumFOCUS. PyMC3 has been used to solve inference problems in several scientific domains, including astronomy, molecular biology, crystallography, chemistry, ecology and psychology. Previous versions of PyMC were also used widely, for example in climate science, public health, neuroscience, and parasitology. After Theano announced plans to discontinue development in 2017, the PyMC3 team decided in 2018 to develop a new version of PyMC named PyMC4 and to pivot to TensorFlow Probability as its computational backend. Until the new version is in beta, PyMC3 will continue to be the primary target of development efforts, and both it and Theano as its backend will be supported by the PyMC3 team for an extended period of time.
- ↑ Salvatier J, Wiecki TV, Fonnesbeck C. (2016) Probabilistic programming in Python using PyMC3. PeerJ Computer Science 2:e55 https://doi.org/10.7717/peerj-cs.55
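- The following is a minimal sketch, not taken from the cited sources, of how a model might be specified and fit in PyMC3 v3.9; the synthetic data and priors are illustrative assumptions. It shows the MCMC workflow described above, with ArviZ handling the plotting and diagnostics that PyMC3 delegates to it from version 3.8 onward.
    import numpy as np
    import pymc3 as pm
    import arviz as az

    # Illustrative synthetic data (an assumption, not from the cited sources).
    data = np.random.default_rng(0).normal(loc=1.0, scale=2.0, size=100)

    with pm.Model() as model:
        # Priors for the unknown mean and standard deviation.
        mu = pm.Normal("mu", mu=0.0, sigma=10.0)
        sigma = pm.HalfNormal("sigma", sigma=5.0)
        # Likelihood of the observed data.
        obs = pm.Normal("obs", mu=mu, sigma=sigma, observed=data)
        # MCMC sampling; PyMC3 assigns NUTS to continuous parameters.
        trace = pm.sample(1000, tune=1000, return_inferencedata=True)

    # Plotting, diagnostics, and statistical checks are handled by ArviZ.
    print(az.summary(trace))
    az.plot_trace(trace)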
2020
- https://github.com/pymc-devs/pymc3
- QUOTE: Features
- Intuitive model specification syntax, for example, x ~ N(0,1) translates to x = Normal('x',0,1)
- Powerful sampling algorithms, such as the No U-Turn Sampler, allow complex models with thousands of parameters with little specialized knowledge of fitting algorithms.
- Variational inference: ADVI for fast approximate posterior estimation as well as mini-batch ADVI for large data sets.
- Relies on Theano which provides:
- Computation optimization and dynamic C compilation
- Numpy broadcasting and advanced indexing
- Linear algebra operators
- Simple extensibility
- Transparent support for missing value imputation
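- As a hedged illustration of the features quoted above (not code from the PyMC3 documentation), the sketch below shows the x = Normal('x', 0, 1) specification style and an approximate posterior obtained with ADVI via pm.fit; the toy data and iteration count are assumptions made for the example.
    import numpy as np
    import pymc3 as pm

    # Assumed toy data for illustration only.
    y = np.random.default_rng(1).normal(size=500)

    with pm.Model() as model:
        # x ~ N(0, 1) written in PyMC3 syntax.
        x = pm.Normal('x', 0, 1)
        # Observed variable whose mean is x.
        pm.Normal('y', mu=x, sigma=1.0, observed=y)
        # For large data sets, wrapping y with pm.Minibatch(y, batch_size=...)
        # enables mini-batch ADVI.

        # ADVI: fast approximate posterior estimation.
        approx = pm.fit(n=20000, method='advi')
        # Draw samples from the fitted approximation.
        vi_trace = approx.sample(1000)

    print(vi_trace['x'].mean())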