Bayesian Information Criterion


A Bayesian Information Criterion (BIC) is a statistical model selection criterion that is based, in part, on the likelihood function and that penalizes the number of model parameters.
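
In its most common form, the criterion selects, among a set of candidate models, the one that minimizes

    \mathrm{BIC} \;=\; k \ln n \;-\; 2 \ln \hat{L},

where L̂ is the maximized value of the model's likelihood function, k is the number of free parameters, and n is the number of observations.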



References

2014

  • (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Bayesian_information_criterion Retrieved:2014-9-24.
    • In statistics, the Bayesian information criterion (BIC) or Schwarz criterion (also SBC, SBIC) is a criterion for model selection among a finite set of models. It is based, in part, on the likelihood function and it is closely related to the Akaike information criterion (AIC).

      When fitting models, it is possible to increase the likelihood by adding parameters, but doing so may result in overfitting. Both BIC and AIC resolve this problem by introducing a penalty term for the number of parameters in the model; the penalty term is larger in BIC than in AIC.

      The BIC was developed by Gideon E. Schwarz, who gave a Bayesian argument for adopting it. Akaike was so impressed with Schwarz's Bayesian formalism that he developed his own Bayesian formalism, now often referred to as the ABIC for "a Bayesian Information Criterion" or more casually "Akaike's Bayesian Information Criterion". [1]

  1. Akaike, H. (1977). “On entropy maximization principle.” In: Krishnaiah, P.R. (ed.), Applications of Statistics, North-Holland, Amsterdam, pp. 27–41.
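
The penalty comparison in the quoted passage can be made concrete with a small numerical sketch. The following Python snippet (the helper functions and the two-model Gaussian example are illustrative assumptions, not from the source) scores a zero-mean Gaussian fit and a free-mean Gaussian fit of the same data under both criteria:

    import numpy as np

    def bic(log_likelihood, k, n):
        # Schwarz criterion: k * ln(n) - 2 * ln(L-hat); lower is better.
        return k * np.log(n) - 2.0 * log_likelihood

    def aic(log_likelihood, k):
        # Akaike criterion: 2 * k - 2 * ln(L-hat); lower is better.
        return 2.0 * k - 2.0 * log_likelihood

    # Toy data drawn from a Gaussian with nonzero mean.
    rng = np.random.default_rng(0)
    x = rng.normal(loc=1.0, scale=2.0, size=200)
    n = x.size

    # Model A: zero-mean Gaussian, variance estimated by ML (k = 1).
    s2_a = np.mean(x ** 2)
    ll_a = -0.5 * n * (np.log(2 * np.pi * s2_a) + 1)

    # Model B: mean and variance both estimated by ML (k = 2).
    s2_b = np.var(x)
    ll_b = -0.5 * n * (np.log(2 * np.pi * s2_b) + 1)

    print(f"A: BIC={bic(ll_a, 1, n):7.1f}  AIC={aic(ll_a, 1):7.1f}")
    print(f"B: BIC={bic(ll_b, 2, n):7.1f}  AIC={aic(ll_b, 2):7.1f}")
    # The extra parameter in B costs ln(200) ~ 5.3 under BIC but only 2
    # under AIC, so BIC penalizes the larger model more heavily.

Because ln(n) exceeds 2 whenever n ≥ 8, BIC penalizes each additional parameter more heavily than AIC on all but the smallest samples.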

1978

  • (Schwarz, 1978) ⇒ Gideon E. Schwarz. (1978). “Estimating the Dimension of a Model.” In: Annals of Statistics, 6(2), pp. 461–464. doi:10.1214/aos/1176344136
    • ABSTRACT: The problem of selecting one of a number of models of different dimensions is treated by finding its Bayes solution, and evaluating the leading terms of its asymptotic expansion. These terms are a valid large-sample criterion beyond the Bayesian context, since they do not depend on the a priori distribution.
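
      In outline (a standard reconstruction of the derivation, not text from the paper): applying the Laplace approximation to the marginal likelihood of a model M with k parameters gives

          \ln p(x \mid M) \;=\; \ln \int p(x \mid \theta, M)\, \pi(\theta)\, d\theta \;=\; \ln p(x \mid \hat{\theta}, M) \;-\; \frac{k}{2} \ln n \;+\; O(1),

      so, asymptotically, choosing the model with the highest marginal likelihood is equivalent to choosing the model with the lowest BIC. The prior \pi(\theta) affects only the O(1) term, which is why the criterion remains valid beyond the Bayesian context, as the abstract notes.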