1989 RelativeEntropyMeasuresofMultivariateDependence
- (Joe, 1989) ⇒ Harry Joe. (1989). “Relative Entropy Measures of Multivariate Dependence.” In: Journal of the American Statistical Association, 84(405). doi:10.1080/01621459.1989.10478751
Subject Headings:
Notes
Cited By
Quotes
Author Keywords
- Multiple correlation; Nominal-categorical variables; Nonparametric measures of dependence; Ordinal-categorical variables; Partial correlation; Pearson's phi-square
Abstract
There has been a lot of work on measures of dependence or association for bivariate probability distributions or bivariate data. These measures usually assume that the variables are both continuous or both categorical. In comparison, there is very little work on multivariate or conditional measures of dependence. The purpose of this article is to discuss measures of multivariate dependence and measures of conditional dependence based on relative entropies. These measures are conceptually very general, as they can be used for a set of variables that can be a mixture of continuous, ordinal-categorical, and nominal-categorical variables. For continuous or ordinal-categorical variables, a certain transformation of relative entropy to the interval [0,1] leads to generalizations of the correlation, multiple-correlation, and partial-correlation coefficients. If all variables are nominal categorical, the relative entropies are standardized to take a maximum of 1 and then transformed so that in the bivariate case, there is a relative reduction in variability interpretation like that for the correlation coefficient. The relative entropy measures of dependence are compared with commonly used bivariate measures of association such as Kendall's τb and Goodman and Kruskal's λ and with measures of dependence based on Pearson's ϕ2 distance. Examples suggest that these new measures of dependence should be useful additional summary values for nonmonotonic or nonlinear dependence. Assuming that the multivariate data are a random sample, the statistical measures of dependence with estimated probability density or mass functions can be studied asymptotically. Standard errors are obtained when all variables are categorical, and an outline of what must be done in the case of all continuous variables is given.
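The transformation of relative entropy to the interval [0,1] described above can be sketched in code. The sketch below (an illustration, not the paper's implementation) computes the mutual information of a discrete joint distribution — the relative entropy between the joint and the product of its marginals — and applies the transformation √(1 − e^(−2I)), which for a bivariate normal distribution recovers |ρ|; the function names are my own.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in nats from a joint probability table
    (a list of rows summing to 1). This is the relative entropy between
    the joint distribution and the product of its marginals."""
    row_margins = [sum(row) for row in joint]
    col_margins = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log(p / (row_margins[i] * col_margins[j]))
    return mi

def entropy_dependence(joint):
    """Map relative entropy I to [0,1] via sqrt(1 - exp(-2*I)),
    a transformation that generalizes the correlation coefficient."""
    return math.sqrt(1.0 - math.exp(-2.0 * mutual_information(joint)))
```

For an independent 2×2 table (all cells 0.25) the measure is 0; for a perfectly dependent table ([[0.5, 0], [0, 0.5]]) it is √(1 − 1/4) ≈ 0.866, reflecting that mutual information is ln 2 rather than infinite for discrete variables.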
References
Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year
---|---|---|---|---|---|---|---|---|---
Harry Joe | 84(405) | | Relative Entropy Measures of Multivariate Dependence | | Journal of the American Statistical Association | | 10.1080/01621459.1989.10478751 | | 1989