Linear Discriminant Analysis (LDA) Algorithm
A Linear Discriminant Analysis (LDA) Algorithm is a linear generative classification algorithm that assumes the class-conditional densities are Gaussian with a common covariance matrix.
- Context:
- It can estimate model parameters by maximizing the full Log Likelihood.
- It can estimate parameters for each Target Class independently of the other classes.
- It can make use of Marginal Density information.
- It can be generalized as a Multiclass Linear Discriminant Analysis Algorithm.
- It can be implemented by an LDA System (to solve an LDA task).
- Example(s):
- Counter-Example(s):
- See: Fisher Score, Generative Model, ANOVA, Fisher's Linear Discriminant, Kernel-based Learning Algorithm.
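The Context bullets above (Gaussian class-conditional densities with a shared covariance, parameters estimated by maximizing the full log-likelihood) admit closed-form estimates. A minimal NumPy sketch follows; the function names and toy usage are illustrative, not part of any standard API:

```python
import numpy as np

def fit_lda(X, y):
    """Closed-form MLE for Gaussian class-conditionals with a shared covariance."""
    classes = np.unique(y)
    priors = np.array([np.mean(y == c) for c in classes])
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    # Pooled (shared) covariance: squared deviations from each class's own mean.
    Sigma = sum((X[y == c] - means[i]).T @ (X[y == c] - means[i])
                for i, c in enumerate(classes)) / len(y)
    return classes, priors, means, Sigma

def predict_lda(X, classes, priors, means, Sigma):
    """Score delta_k(x) = x' S^-1 mu_k - 0.5 mu_k' S^-1 mu_k + log pi_k; pick the max."""
    Sinv = np.linalg.inv(Sigma)
    scores = X @ Sinv @ means.T - 0.5 * np.sum(means @ Sinv * means, axis=1) + np.log(priors)
    return classes[np.argmax(scores, axis=1)]

# Toy usage on two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([-3, 0], 1, (50, 2)), rng.normal([3, 0], 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
pred = predict_lda(X, *fit_lda(X, y))
```

Because the covariance is shared across classes, the quadratic terms cancel and the decision boundary is linear in x, which is what makes this a linear classifier.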
References
2020
- (Wikipedia, 2020) ⇒ https://en.wikipedia.org/wiki/Linear_discriminant_analysis Retrieved:2020-3-6.
- Linear discriminant analysis (LDA), normal discriminant analysis (NDA), or discriminant function analysis is a generalization of Fisher's linear discriminant, a method used in statistics, pattern recognition, and machine learning to find a linear combination of features that characterizes or separates two or more classes of objects or events. The resulting combination may be used as a linear classifier, or, more commonly, for dimensionality reduction before later classification.
LDA is closely related to analysis of variance (ANOVA) and regression analysis, which also attempt to express one dependent variable as a linear combination of other features or measurements. However, ANOVA uses categorical independent variables and a continuous dependent variable, whereas discriminant analysis has continuous independent variables and a categorical dependent variable (i.e. the class label). [1] Logistic regression and probit regression are more similar to LDA than ANOVA is, as they also explain a categorical variable by the values of continuous independent variables. These other methods are preferable in applications where it is not reasonable to assume that the independent variables are normally distributed, which is a fundamental assumption of the LDA method.
LDA is also closely related to principal component analysis (PCA) and factor analysis in that they both look for linear combinations of variables which best explain the data. LDA explicitly attempts to model the difference between the classes of data. PCA, in contrast, does not take into account any difference in class, and factor analysis builds the feature combinations based on differences rather than similarities. Discriminant analysis is also different from factor analysis in that it is not an interdependence technique: a distinction between independent variables and dependent variables (also called criterion variables) must be made.
LDA works when the measurements made on independent variables for each observation are continuous quantities. When dealing with categorical independent variables, the equivalent technique is discriminant correspondence analysis.[2] Discriminant analysis is used when groups are known a priori (unlike in cluster analysis).
Each case must have a score on one or more quantitative predictor measures, and a score on a group measure.[3] In simple terms, discriminant function analysis is classification - the act of distributing things into groups, classes or categories of the same type.
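The LDA-vs-PCA contrast described in the quoted passage can be seen directly in code. A minimal sketch, assuming scikit-learn is available (neither class name comes from the quoted text):

```python
# LDA is supervised and uses class labels; PCA is unsupervised and ignores them.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# LDA models between-class differences, so it can keep at most
# (n_classes - 1) components; with 3 Iris classes that is 2.
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

# PCA only maximizes total variance and never looks at y.
X_pca = PCA(n_components=2).fit_transform(X)
```

Both reduce the 4-dimensional Iris data to 2 dimensions, but the LDA projection is chosen to separate the classes, while the PCA projection is class-blind.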
- ↑ Analyzing Quantitative Data: An Introduction for Social Researchers, Debra Wetcher-Hendricks, p.288
- ↑ Abdi, H. (2007) "Discriminant correspondence analysis." In: N.J. Salkind (Ed.): Encyclopedia of Measurement and Statistic. Thousand Oaks (CA): Sage. pp. 270–275.
- ↑ BÖKEOĞLU ÇOKLUK, Ö, & BÜYÜKÖZTÜRK, Ş. (2008). Discriminant function analysis: Concept and application. Eğitim araştırmaları dergisi, (33), 73-92.
2011
- http://en.wikipedia.org/wiki/Linear_classifier#Generative_models_vs._discriminative_models
- … Methods of the first class model conditional density functions [math]\displaystyle{ P(\vec x|{\rm class}) }[/math]. Examples of such algorithms include:
- Linear Discriminant Analysis (or Fisher's linear discriminant) (LDA) — assumes Gaussian conditional density models
2009
- (SciPy, 2009) ⇒ http://www.scipy.org/Cookbook/LinearClassification
- The first example shows an implementation of Fisher's Linear Classifier for a 2-class problem; the algorithm is described in detail in the book "Pattern Recognition and Machine Learning" by Christopher M. Bishop (p. 186, Section 4.1). The main idea of this algorithm is to reduce the dimensionality of the input vector X by projecting it onto a 1D space using the equation y = W.T X, where W.T is a row vector of weights; the weight vector W is adjusted to choose the projection that maximizes the class separation. The following program uses the famous Iris data set, with 150 instances and 4 attributes (a 4D space), and a target vector containing the labels "Iris-setosa", "Iris-virginica", and "Iris-versicolor". There are therefore 3 classes, but in this case we may treat "Iris-setosa" as class 1 and all other instances as class 2. The Iris data set is available here: http://archive.ics.uci.edu/ml/datasets/Iris/ or here (comma separated format) - bezdekIris.data.txt
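The projection described in the quoted cookbook entry has a closed form for two classes: w is proportional to the inverse within-class scatter matrix times the difference of class means (Bishop, Section 4.1). A minimal NumPy sketch on synthetic data (the function name and toy blobs are illustrative, not from the cookbook):

```python
import numpy as np

def fisher_direction(X1, X2):
    """Fisher's linear discriminant for two classes: w ~ S_W^{-1} (m2 - m1),
    where S_W is the within-class scatter matrix."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S_w = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S_w, m2 - m1)
    return w / np.linalg.norm(w)

# Toy usage: two well-separated blobs, classified by thresholding the
# 1D projection y = w . x at the midpoint of the projected class means.
rng = np.random.default_rng(1)
X1 = rng.normal([0, 0], 0.5, (40, 2))
X2 = rng.normal([4, 4], 0.5, (40, 2))
w = fisher_direction(X1, X2)
threshold = 0.5 * (X1.mean(axis=0) + X2.mean(axis=0)) @ w
```

Maximizing Fisher's criterion (between-class separation over within-class scatter) is what distinguishes this direction from simply projecting onto the difference of the means.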
2004
- (Bouchard & Triggs, 2004) ⇒ Guillaume Bouchard, and Bill Triggs. (2004). “The Trade-off Between Generative and Discriminative Classifiers.” In: Proceedings of COMPSTAT 2004.
- QUOTE: Well known generative-discriminative pairs include Linear Discriminant Analysis (LDA) vs. Linear logistic regression … . Under the assumption that the underlying distributions are Gaussian with equal covariances, it is known that LDA requires less data than its discriminative counterpart, linear logistic regression [3].
1978
- (Press & Wilson, 1978) ⇒ S. James Press, and Sandra Wilson. (1978). “Choosing Between Logistic Regression and Discriminant Analysis.” In: Journal of the American Statistical Association, 73(364). http://www.jstor.org/stable/2286261
- ABSTRACT: Classifying an observation into one of several populations is discriminant analysis, or classification. Relating qualitative variables to other variables through a logistic cdf functional form is logistic regression. Estimators generated for one of these problems are often used in the other. If the populations are normal with identical covariance matrices, discriminant analysis estimators are preferred to logistic regression estimators for the discriminant analysis problem. In most discriminant analysis applications, however, at least one variable is qualitative (ruling out multivariate normality). Under nonnormality, we prefer the logistic regression model with maximum likelihood estimators for solving both problems. In this article we summarize the related arguments, and report on our own supportive empirical studies.
- KEYWORDS: Logistic regression; Discriminant analysis; Qualitative variables; Classification.
- QUOTE: … The linear discriminant analysis approach, by contrast, is strictly applicable only when the underlying variables are jointly normal with equal covariance matrices
1977
- (Eisenbeis, 1977) ⇒ Robert A. Eisenbeis. (1977). “Pitfalls in the Application of Discriminant Analysis in Business, Finance, and Economics.” In: The Journal of Finance, 32(3).
1975
- (Lachenbruch, 1975) ⇒ P. A. Lachenbruch. (1975). “Discriminant Analysis.” Hafner Press.