Maximum Entropy Principle (MEP)
A Maximum Entropy Principle (MEP) is a postulate that, subject to testable information, the probability distribution which best represents the current state of knowledge is the one with the largest information entropy (a worked numerical example follows the list below).
- Context:
- It can be referenced by a Maximum Entropy-based Learning Algorithm.
- …
- Counter-Example(s):
- See: Information Entropy, Maximum Entropy Network, Maximum Entropy-based Learning Algorithm, Logistic Regression.
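As a concrete illustration of the postulate, consider Jaynes' classic dice example: among all distributions over the faces {1,…,6} whose mean equals a prescribed value, the maximum-entropy distribution has an exponential form. The sketch below is illustrative only; the target mean of 4.5 and the use of NumPy/SciPy are assumptions, not part of any cited source.

```python
# Sketch (illustrative): Jaynes' dice example.
# Find the maximum-entropy distribution over faces 1..6 whose mean
# equals a prescribed value. The solution has the exponential-family
# form p_i ∝ exp(lam * i); we solve for the multiplier lam numerically.
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)
target_mean = 4.5  # the "testable information" (an illustrative choice)

def mean_given_lam(lam):
    # Exponential-family form implied by entropy maximization.
    w = np.exp(lam * faces)
    p = w / w.sum()
    return p @ faces

# The constrained mean is monotone in lam, so bracketing root-finding works.
lam = brentq(lambda l: mean_given_lam(l) - target_mean, -10.0, 10.0)
w = np.exp(lam * faces)
p = w / w.sum()

print("lambda  =", round(float(lam), 4))
print("p       =", np.round(p, 4))            # maximum-entropy distribution
print("mean    =", round(float(p @ faces), 4))
print("entropy =", round(float(-(p * np.log(p)).sum()), 4))
```

With no constraint beyond normalization, the same procedure returns the uniform distribution (λ = 0), matching the intuition that the maximum-entropy estimate is maximally noncommittal.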
References
2011
- http://en.wikipedia.org/wiki/Principle_of_maximum_entropy
- In Bayesian probability, the principle of maximum entropy is a postulate which states that, subject to known constraints (called testable information), the probability distribution which best represents the current state of knowledge is the one with largest entropy.
Let some testable information about a probability distribution function be given. Consider the set of all trial probability distributions that encode this information. Then, the probability distribution that maximizes the information entropy is the true probability distribution with respect to the testable information prescribed.
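The constrained optimization described in the quoted passage has a standard closed-form structure (a textbook derivation, not text from the Wikipedia article): maximizing entropy subject to expectation constraints via Lagrange multipliers yields a Gibbs / exponential-family distribution.

```latex
% Maximize entropy subject to expectation ("testable information")
% constraints; stationarity of the Lagrangian gives the Gibbs form.
\begin{aligned}
&\max_{p}\; H(p) = -\sum_{x} p(x)\,\log p(x)
 \quad\text{s.t.}\quad \sum_{x} p(x)\,f_k(x) = F_k \;\;(k=1,\dots,m),
 \;\; \sum_{x} p(x) = 1 \\
&\;\Longrightarrow\;
 p^{*}(x) = \frac{1}{Z(\lambda)}\,
 \exp\!\Big(\sum_{k=1}^{m}\lambda_k f_k(x)\Big),
 \qquad
 Z(\lambda) = \sum_{x}\exp\!\Big(\sum_{k=1}^{m}\lambda_k f_k(x)\Big)
\end{aligned}
```

Here the multipliers λ_k are chosen so that the constraints hold; the dice sketch above is the one-constraint case f(x) = x.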
1998
- (Ratnaparkhi, 1998) ⇒ Adwait Ratnaparkhi. (1998). “Maximum Entropy Models for Natural Language Ambiguity Resolution.” PhD Thesis, University of Pennsylvania.
- ABSTRACT: This thesis demonstrates that several important kinds of natural language ambiguities can be resolved to state-of-the-art accuracies using a single statistical modeling technique based on the principle of maximum entropy.
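The models in this line of work are conditional maximum entropy models, which for classification coincide with multinomial logistic regression (cf. the Logistic Regression link above). Below is a minimal sketch of that correspondence; the toy word-sense data, the feature templates, and the use of scikit-learn are illustrative assumptions, not Ratnaparkhi's implementation.

```python
# Sketch (illustrative): a conditional maximum entropy classifier,
# p(y | x) ∝ exp(sum_k lam_k * f_k(x, y)), fit as multinomial
# logistic regression on toy word-sense disambiguation data.
from sklearn.linear_model import LogisticRegression

# Binary context features per instance; the feature templates and the
# sense labels are hypothetical, chosen only for illustration.
X = [
    [1, 0, 1],  # e.g. near "river", near "money", token capitalized
    [1, 0, 0],
    [0, 1, 0],
    [0, 1, 1],
]
y = ["bank/GEO", "bank/GEO", "bank/FIN", "bank/FIN"]

# Maximizing conditional log-likelihood of this model is equivalent to
# maximizing entropy subject to feature-expectation constraints.
clf = LogisticRegression()
clf.fit(X, y)

print(clf.predict([[1, 0, 0]]))        # expected: ['bank/GEO']
print(clf.predict_proba([[1, 0, 0]]))  # conditional maxent probabilities
```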
1989
- (Kesavan & Kapur, 1989) ⇒ H. K. Kesavan, and J. N. Kapur. (1989). “The Generalized Maximum Entropy Principle.” In: IEEE Transactions on Systems, Man and Cybernetics, 19(5). doi:10.1109/21.44019
- ABSTRACT: Generalizations of the maximum entropy principle (MEP) of E.T. Jaynes (1957) and the minimum discrimination information principle (MDIP) of S. Kullback (1959) are described. The generalizations have been achieved by enunciating the entropy maximization postulate and examining its consequences. The inverse principles which are inherent in the MEP and MDIP are made quite explicit. Several examples are given to illustrate the power and scope of the generalized maximum entropy principle that follows from the entropy maximization postulate.
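For orientation, the MDIP referenced in the abstract minimizes the discrimination information (Kullback–Leibler divergence) from a prior q rather than maximizing entropy. The standard constrained solution (a textbook form, not taken from the paper) exponentially tilts the prior:

```latex
% Minimum discrimination information (sketch of the standard form):
% minimize KL(p || q) subject to expectation constraints.
\begin{aligned}
&\min_{p}\; D(p\,\|\,q) = \sum_{x} p(x)\,\log\frac{p(x)}{q(x)}
 \quad\text{s.t.}\quad \sum_{x} p(x)\,f_k(x) = F_k,
 \;\; \sum_{x} p(x) = 1 \\
&\;\Longrightarrow\;
 p^{*}(x) = \frac{q(x)}{Z(\lambda)}\,
 \exp\!\Big(\sum_{k}\lambda_k f_k(x)\Big)
\end{aligned}
```

The MEP is recovered as the special case where q is uniform, which is one sense in which the two principles admit a common generalization.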
1957
- (Jaynes, 1957) ⇒ Edwin T. Jaynes. (1957). “Information Theory and Statistical Mechanics.” In: Physical Review, 106(4). doi:10.1103/PhysRev.106.620
- ABSTRACT: Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available. It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
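The abstract's statement that the partition function is an immediate consequence of the maximum-entropy principle can be made concrete with a standard sketch (notation assumed here, not quoted from the paper): maximizing entropy subject to normalization and a fixed mean energy yields the canonical (Boltzmann) distribution.

```latex
% Sketch: maximum entropy subject to normalization and a fixed
% mean energy yields the canonical ensemble.
\begin{aligned}
&\max_{p}\; -\sum_i p_i \log p_i
 \quad\text{s.t.}\quad \sum_i p_i = 1,
 \;\; \sum_i p_i E_i = \langle E\rangle \\
&\;\Longrightarrow\;
 p_i = \frac{e^{-\beta E_i}}{Z(\beta)},
 \qquad
 Z(\beta) = \sum_i e^{-\beta E_i}
\end{aligned}
```

Here β is the Lagrange multiplier enforcing the mean-energy constraint, and Z(β) is the partition function from which the usual thermodynamic quantities follow.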