Fisher Information
See: Information Variance, Maximum-Likelihood Estimation, Fisher Information Matrix, Wald Test, Mathematical Statistics, Variance, Score (Statistics), Expected Value, Observed Information, Bayesian Statistics, Posterior Distribution, Mode (Statistics), Prior Distribution, Bernstein–Von Mises Theorem.
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Fisher_information#Matrix_form Retrieved:2015-6-25.
- In mathematical statistics, the Fisher information (sometimes simply called information [1]) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ upon which the probability of X depends. Formally, it is the variance of the score, or the expected value of the observed information. In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families). [2]

  The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician R. A. Fisher (following some initial results by F. Y. Edgeworth). The Fisher information is also used in the calculation of the Jeffreys prior, which is used in Bayesian statistics.

  The Fisher information matrix is used to calculate the covariance matrices associated with maximum-likelihood estimates. It can also be used in the formulation of test statistics, such as the Wald test.

  Statistical systems of a scientific nature (physical, biological, etc.) whose likelihood functions obey shift invariance have been shown to obey maximum Fisher information. [3] The level of the maximum depends upon the nature of the system constraints.
- ↑ Lehmann & Casella, p. 115
- ↑ Lucien Le Cam (1986). Asymptotic Methods in Statistical Decision Theory, pp. 336 and 618–621 (von Mises and Bernstein).
- ↑ Frieden & Gatenby (2013)
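The passage above defines the Fisher information as the variance of the score and, equivalently, as the expected observed information, and notes its use in maximum-likelihood covariance estimates and Wald tests. The following is a minimal numerical sketch of those relationships, assuming a Bernoulli(p) model chosen purely for illustration; the model, sample size, and variable names are assumptions, not part of the source.

```python
import numpy as np

# Illustrative Bernoulli(p) model (assumption, not from the source).
# For Bernoulli, the score is d/dp log f(x; p) = x/p - (1-x)/(1-p)
# and the closed-form Fisher information is I(p) = 1 / (p (1 - p)).

rng = np.random.default_rng(0)
p_true = 0.3
n = 200_000
x = rng.binomial(1, p_true, size=n)

# Fisher information as the variance of the score (per observation).
score = x / p_true - (1 - x) / (1 - p_true)
info_from_score_var = score.var()

# Fisher information as the expected observed information,
# i.e. minus the expected second derivative of the log-likelihood.
second_deriv = -x / p_true**2 - (1 - x) / (1 - p_true) ** 2
info_from_observed = -second_deriv.mean()

closed_form = 1.0 / (p_true * (1.0 - p_true))
print(info_from_score_var, info_from_observed, closed_form)  # all ≈ 4.76

# Use in maximum-likelihood estimation: the MLE p_hat = mean(x) has
# asymptotic variance 1 / (n * I(p)), which yields a Wald-type interval.
p_hat = x.mean()
wald_se = np.sqrt(1.0 / (n * closed_form))
print(p_hat, "+/-", 1.96 * wald_se)
```

For this model the closed form I(p) = 1/(p(1−p)) makes it easy to check that the two sample-based estimates agree and that the Wald standard error reduces to the familiar sqrt(p(1−p)/n).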