Continuous-Variable Mutual Information Metric
A Continuous-Variable Mutual Information Metric is a mutual information metric for continuous random variables ([math]\displaystyle{ X, Y }[/math]) that have a joint probability density function.
- Context:
- It can be calculated as [math]\displaystyle{ I(X;Y) = \int_Y \int_X p(x,y) \log{ \left(\frac{p(x,y)}{p(x)\,p(y)} \right) } \; dx \,dy }[/math], where [math]\displaystyle{ p(x,y) }[/math] is the joint probability density function and [math]\displaystyle{ p(x) }[/math], [math]\displaystyle{ p(y) }[/math] are the marginal density functions (see the numerical sketch after this list).
- Example(s):
- the mutual information between jointly Gaussian [math]\displaystyle{ X, Y }[/math] with correlation [math]\displaystyle{ \rho }[/math], which equals [math]\displaystyle{ -\tfrac{1}{2}\log(1-\rho^2) }[/math].
- …
- Counter-Example(s):
- a Discrete-Variable Mutual Information Metric, where the double integral is replaced by a summation.
- See: Pointwise Mutual Information, Continuous Function, Double Integral.
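The double integral rarely admits a closed form, so in practice it is evaluated numerically or estimated from samples. Below is a minimal Python sketch (not from the cited source; the correlation value, integration bounds, and names are illustrative assumptions) that evaluates the integral by quadrature for a standard bivariate Gaussian, where the closed form [math]\displaystyle{ I(X;Y) = -\tfrac{1}{2}\log(1-\rho^2) }[/math] is available as a check.

```python
# Minimal sketch: numerical evaluation of continuous-variable mutual information
# for a standard bivariate Gaussian (rho is an assumed example value).
import numpy as np
from scipy import integrate
from scipy.stats import multivariate_normal, norm

rho = 0.6  # assumed correlation coefficient for this example
joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

def integrand(y, x):
    """p(x,y) * log( p(x,y) / (p(x) p(y)) ), using the natural log (nats)."""
    pxy = joint.pdf([x, y])
    px = norm.pdf(x)  # standard-normal marginal of X
    py = norm.pdf(y)  # standard-normal marginal of Y
    return pxy * np.log(pxy / (px * py))

# Truncate the infinite double integral to [-8, 8]^2; the Gaussian tails
# beyond that region contribute negligibly.
mi_numeric, _ = integrate.dblquad(integrand, -8, 8, lambda x: -8, lambda x: 8)
mi_closed = -0.5 * np.log(1 - rho**2)

print(f"numerical   I(X;Y) = {mi_numeric:.6f} nats")
print(f"closed-form I(X;Y) = {mi_closed:.6f} nats")
```

Dividing the result by [math]\displaystyle{ \log 2 }[/math] converts from nats to bits, matching the base-2 convention noted in the reference below.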
References
2016
- (Wikipedia, 2016) ⇒ https://en.wikipedia.org/wiki/mutual_information#Definition_of_mutual_information Retrieved:2016-5-17.
- … In the case of continuous random variables, the summation is replaced by a definite double integral: [math]\displaystyle{ I(X;Y) = \int_Y \int_X p(x,y) \log{ \left(\frac{p(x,y)}{p(x)\,p(y)} \right) } \; dx \,dy, }[/math] where p(x,y) is now the joint probability density function of X and Y, and [math]\displaystyle{ p(x) }[/math] and [math]\displaystyle{ p(y) }[/math] are the marginal probability density functions of X and Y respectively.
If the log base 2 is used, the units of mutual information are the bit.