Logit Function
A Logit Function is a mathematical function of the form [math]\displaystyle{ \operatorname{logit}(p)=\ln\left( \frac{p}{1-p} \right) = \ln(p)-\ln(1-p). \!\, }[/math]
- Context:
- It is the Inverse Function of a Logistic Function.
- It can range from being a Binomial Logit/Binary Logit to being a Multinomial Logit.
- Example(s):
- [math]\displaystyle{ f(x)= 5.4 \times \ln\left( \frac{x}{1-x}\right) }[/math] (demonstrated in the sketch after this list)
- Counter-Example(s):
- See: Logistic Regression, Conditional Logit, Nested Logit, Mixed Logit, Exploded Logit, Ordered Logit, Inverse Function, Sigmoid Function, Logarithm, Logarithmic Unit, Bit, Odds Ratio, Additive Function.
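The definition and the scaled example above can be sketched in a few lines of Python (the function names and the printed values are illustrative only):

```python
import math

def logit(p: float) -> float:
    """Log-odds of a probability p in the open interval (0, 1)."""
    return math.log(p / (1.0 - p))   # equivalently: math.log(p) - math.log(1.0 - p)

def scaled_logit(x: float, scale: float = 5.4) -> float:
    """The example f(x) = 5.4 * ln(x / (1 - x)) from the list above."""
    return scale * logit(x)

print(logit(0.5))          # 0.0: even odds
print(logit(0.9))          # ~2.197
print(scaled_logit(0.9))   # ~11.87
```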
References
2019
- (Wikipedia, 2019) ⇒ https://en.wikipedia.org/wiki/logit Retrieved:2019-4-12.
- In statistics, the logit function or the log-odds is the logarithm of the odds p/(1 − p), where p is a probability. It is a type of function that maps probability values from [math]\displaystyle{ [0, 1] }[/math] to [math]\displaystyle{ [-\infty, +\infty] }[/math]. It is the inverse of the sigmoidal "logistic" function or logistic transform used in mathematics, especially in statistics. In deep learning, the term logits layer is popularly used for the last neuron layer of a neural network for a classification task, which produces raw prediction values as real numbers ranging over [math]\displaystyle{ (-\infty, +\infty) }[/math].
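A minimal sketch of that mapping, assuming NumPy; the raw scores are made up and the function names are illustrative:

```python
import numpy as np

def logistic(a):
    """Logistic transform: maps real-valued scores to probabilities in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-a))

def logit(p):
    """Log-odds: maps probabilities in (0, 1) back to the whole real line."""
    return np.log(p / (1.0 - p))

raw_scores = np.array([-3.0, 0.0, 2.5])       # hypothetical raw outputs of a final "logits" layer
probs = logistic(raw_scores)                  # approx. [0.047, 0.5, 0.924]
print(np.allclose(logit(probs), raw_scores))  # True: logit undoes the logistic transform
```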
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/logit#Definition Retrieved:2015-1-28.
- The logit of a number p between 0 and 1 is given by the formula:
:[math]\displaystyle{ \operatorname{logit}(p)=\log\left( \frac{p}{1-p} \right) =\log(p)-\log(1-p)=-\log\left( \frac{1}{p} - 1\right). \!\, }[/math]
The base of the logarithm function used is of little importance in the present article, as long as it is greater than 1, but the natural logarithm with base e is the one most often used. The choice of base corresponds to the choice of logarithmic unit for the value: base 2 corresponds to a bit, base e to a nat, and base 10 to a ban (dit, hartley); these units are particularly used in information-theoretic interpretations.
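A small sketch of this choice of base, assuming Python's math module; base e gives nats, base 2 gives bits, and base 10 gives bans:

```python
import math

def logit(p: float, base: float = math.e) -> float:
    """Log-odds of p, expressed in the logarithmic unit fixed by `base`."""
    return math.log(p / (1.0 - p), base)

p = 0.8                      # odds of 4 to 1
print(logit(p))              # ~1.386 nats (base e)
print(logit(p, base=2))      # 2.0    bits (base 2)
print(logit(p, base=10))     # ~0.602 bans (base 10)
```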
The "logistic" function of any number [math]\displaystyle{ \alpha }[/math] is given by the inverse-logit:
:[math]\displaystyle{ \operatorname{logit}^{-1}(\alpha) = \frac{1}{1 + \operatorname{exp}(-\alpha)} = \frac{\operatorname{exp}(\alpha)}{ \operatorname{exp}(\alpha) + 1} }[/math]
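The two equivalent forms above, and the round trip with the logit, can be checked with a short sketch (function names are illustrative):

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1.0 - p))

def inv_logit(alpha: float) -> float:
    """Logistic function, written in both equivalent forms from the formula above."""
    form_a = 1.0 / (1.0 + math.exp(-alpha))
    form_b = math.exp(alpha) / (math.exp(alpha) + 1.0)
    assert math.isclose(form_a, form_b)
    return form_a

for p in (0.1, 0.5, 0.95):
    assert math.isclose(inv_logit(logit(p)), p)   # logit and inverse-logit undo each other
print("round trip ok")
```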
If p is a probability, then p/(1 − p) is the corresponding odds; the logit of the probability is the logarithm of the odds. Similarly, the difference between the logits of two probabilities is the logarithm of the odds ratio (R), thus providing a shorthand for writing the correct combination of odds ratios only by adding and subtracting:
:[math]\displaystyle{ \operatorname{log}(R)=\log\left( \frac{{p_1}/(1-p_1)}{{p_2}/(1-p_2)} \right) =\log\left( \frac{p_1}{1-p_1} \right) - \log\left(\frac{p_2}{1-p_2}\right)=\operatorname{logit}(p_1)-\operatorname{logit}(p_2). \!\, }[/math]
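This identity can be checked numerically with a brief sketch (the probabilities p1 and p2 are made up):

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1.0 - p))

p1, p2 = 0.75, 0.4                                 # two made-up probabilities
odds_ratio = (p1 / (1 - p1)) / (p2 / (1 - p2))     # R = 4.5
print(math.isclose(math.log(odds_ratio),
                   logit(p1) - logit(p2)))         # True: log(R) = logit(p1) - logit(p2)
```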