Eigenvector Estimation Task
See: Estimation Task, Eigenvector, Eigenvector Estimation Algorithm, PCA Algorithm.
References
2012
- (Anandkumar et al., 2012) ⇒ Anima Anandkumar, Rong Ge, Daniel Hsu, Sham M. Kakade, and Matus Telgarsky. (2012) "Tensor decompositions for learning latent variable models." arXiv preprint arXiv:1210.7559.
- This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models -- including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation -- which exploits a certain tensor structure in their low-order observable moments (typically, of second- and [[third-order moment|third-order]]). Specifically, parameter estimation is reduced to the problem of extracting a certain (orthogonal) decomposition of a symmetric tensor derived from the moments; this decomposition can be viewed as a natural generalization of the singular value decomposition for matrices. Although tensor decompositions are generally intractable to compute, the decomposition of these specially structured tensors can be efficiently obtained by a variety of approaches, including power iterations and maximization approaches (similar to the case of matrices). A detailed analysis of a robust tensor power method is provided, establishing an analogue of Wedin's perturbation theorem for the [[singular vectors of matrices]]. This implies a robust and computationally tractable estimation approach for several popular latent variable models.
- …
- Algorithm 1 Robust tensor power method.
- input: symmetric tensor <math>\tilde{T} \in \mathbb{R}^{k \times k \times k}</math>, number of iterations <math>L</math>, <math>N</math>.
- output: the estimated eigenvector/eigenvalue pair; the deflated tensor.
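The quoted abstract refers to an orthogonal decomposition of a symmetric moment tensor. In the paper's notation, the structured tensor is assumed to admit a decomposition <math>T = \sum_{i=1}^k \lambda_i \, v_i \otimes v_i \otimes v_i</math> with orthonormal vectors <math>v_i</math>; each <math>v_i</math> is a tensor eigenvector in the sense that <math>T(I, v_i, v_i) = \lambda_i v_i</math>, which is the fixed point that the power iterations in Algorithm 1 converge to.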
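The sketch below is a minimal, illustrative NumPy implementation of the power-iteration and deflation steps that Algorithm 1 describes. The names (robust_tensor_power_method, tensor_apply) and the default values of L and N are placeholders introduced here for illustration, not anything from the paper or an existing library; the paper's perturbation analysis and stopping criteria are omitted.

```python
import numpy as np

def tensor_apply(T, theta):
    """Compute T(I, theta, theta): contract a symmetric k x k x k tensor
    with theta along its last two modes, returning a k-vector."""
    return np.einsum('ijk,j,k->i', T, theta, theta)

def robust_tensor_power_method(T, L=20, N=30, rng=None):
    """Sketch of one round of the robust tensor power method:
    estimate one eigenvector/eigenvalue pair of a symmetric tensor T
    and return the deflated tensor T - lambda * theta (x) theta (x) theta."""
    rng = np.random.default_rng(rng)
    k = T.shape[0]

    # L random restarts, each followed by N power iterations;
    # keep the candidate with the largest value of T(theta, theta, theta).
    best_theta, best_val = None, -np.inf
    for _ in range(L):
        theta = rng.standard_normal(k)
        theta /= np.linalg.norm(theta)
        for _ in range(N):
            v = tensor_apply(T, theta)
            theta = v / np.linalg.norm(v)
        val = np.einsum('ijk,i,j,k->', T, theta, theta, theta)
        if val > best_val:
            best_val, best_theta = val, theta

    # N extra power iterations from the best candidate, then read off the eigenvalue.
    theta = best_theta
    for _ in range(N):
        v = tensor_apply(T, theta)
        theta = v / np.linalg.norm(v)
    lam = np.einsum('ijk,i,j,k->', T, theta, theta, theta)

    # Deflation: subtract the recovered rank-one component.
    deflated = T - lam * np.einsum('i,j,k->ijk', theta, theta, theta)
    return lam, theta, deflated
```

Under the paper's assumption that the input tensor has an orthogonal decomposition, calling this function repeatedly on its own deflated output would recover the k eigenvector/eigenvalue pairs one at a time.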