Bayesian Information Criterion
A Bayesian Information Criterion is a statistical model selection criterion that is based in part on the likelihood function and penalizes model complexity.
- AKA: BIC, Schwarz Criterion, SBIC.
- …
- Counter-Example(s): Akaike Information Criterion.
- See: Likelihood Function, Overfitting.
References
2014
- (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Bayesian_information_criterion Retrieved:2014-9-24.
- In statistics, the Bayesian information criterion (BIC) or Schwarz criterion (also SBC, SBIC) is a criterion for model selection among a finite set of models. It is based, in part, on the likelihood function and it is closely related to the Akaike information criterion (AIC).
When fitting models, it is possible to increase the likelihood by adding parameters, but doing so may result in overfitting. Both BIC and AIC resolve this problem by introducing a penalty term for the number of parameters in the model; the penalty term is larger in BIC than in AIC.
The BIC was developed by Gideon E. Schwarz, who gave a Bayesian argument for adopting it. Akaike was so impressed with Schwarz's Bayesian formalism that he developed his own Bayesian formalism, now often referred to as the ABIC for "a Bayesian Information Criterion" or more casually "Akaike's Bayesian Information Criterion". [1]
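The penalty comparison in the quote above can be made concrete. For a model with [math]k[/math] estimated parameters, [math]n[/math] observations, and maximized likelihood [math]\hat{L}[/math], the two criteria are [math]\mathrm{BIC} = k\ln n - 2\ln\hat{L}[/math] and [math]\mathrm{AIC} = 2k - 2\ln\hat{L}[/math], so BIC's penalty exceeds AIC's whenever [math]\ln n > 2[/math] (roughly [math]n > 7[/math]). A minimal sketch, using synthetic data and a hand-rolled least-squares fit (all variable names here are illustrative, not from the sources above):

```python
import math
import random

# Synthetic data: y = 2x + 1 plus Gaussian noise (illustrative, not from the sources).
random.seed(0)
n = 50
xs = [i / n for i in range(n)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.1) for x in xs]

# Ordinary least-squares fit of y = a*x + b.
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
b = mean_y - a * mean_x

# Maximized Gaussian log-likelihood, plugging in the MLE of the error variance.
resid = [y - (a * x + b) for x, y in zip(xs, ys)]
sigma2 = sum(r * r for r in resid) / n
log_lik = -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)

k = 3  # slope, intercept, and error variance are all estimated
bic = k * math.log(n) - 2 * log_lik
aic = 2 * k - 2 * log_lik

# BIC - AIC = k * (ln(n) - 2), so BIC penalizes harder here since ln(50) > 2.
print(f"BIC = {bic:.3f}, AIC = {aic:.3f}")
```

Because the [math]-2\ln\hat{L}[/math] term is shared, choosing between the two criteria amounts to choosing the complexity penalty: BIC's grows with sample size, which is why it tends to select smaller models than AIC on large datasets.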
- ↑ Akaike, H., 1977. “On entropy maximization principle". In: Krishnaiah, P.R. (Editor). Applications of Statistics, North-Holland, Amsterdam, pp. 27–41.
2003
- (Myung, 2003) ⇒ In Jae Myung. (2003). “Tutorial on Maximum Likelihood Estimation.” In: Journal of Mathematical Psychology, 47.
- QUOTE: … For example, MLE is a prerequisite for the chi-square test, the G-square test, Bayesian methods, inference with missing data, modeling of random effects, and many model selection criteria such as the Akaike information criterion (Akaike, 1973) and the Bayesian information criteria (Schwarz, 1978).
1978
- (Schwarz, 1978) ⇒ Gideon E. Schwarz. (1978). “Estimating the Dimension of a Model.” In: Annals of Statistics, 6(2). doi:10.1214/aos/1176344136
- ABSTRACT: The problem of selecting one of a number of models of different dimensions is treated by finding its Bayes solution, and evaluating the leading terms of its asymptotic expansion. These terms are a valid large-sample criterion beyond the Bayesian context, since they do not depend on the a priori distribution.