Singular Value Decomposition Structure
A Singular Value Decomposition Structure is a matrix decomposition [math]\displaystyle{ U\Sigma V^T }[/math] of an [math]\displaystyle{ m \times n }[/math] matrix [math]\displaystyle{ M }[/math], where [math]\displaystyle{ U }[/math] is an [math]\displaystyle{ m \times k }[/math] orthonormal matrix (whose columns are the left singular vectors), [math]\displaystyle{ \Sigma }[/math] is a [math]\displaystyle{ k \times k }[/math] nonnegative diagonal matrix (of singular values), and [math]\displaystyle{ V^T }[/math] is a [math]\displaystyle{ k \times n }[/math] orthonormal matrix (whose rows are the right singular vectors).
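The compact (rank-[math]\displaystyle{ k }[/math]) structure above can be illustrated numerically; the following is a sketch using NumPy (an assumption, not part of the definition), truncating the reduced SVD to the nonzero singular values.

```python
import numpy as np

# A rank-1 example matrix (m = n = 2, rank k = 1).
M = np.array([[4.0, 3.0],
              [8.0, 6.0]])

# Reduced SVD: U is m x p, s has p entries, Vt is p x n, with p = min(m, n).
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Keep only the k nonzero singular values to obtain the compact structure.
k = int(np.sum(s > 1e-10))
U_k, S_k, Vt_k = U[:, :k], np.diag(s[:k]), Vt[:k, :]

print(U_k.shape, S_k.shape, Vt_k.shape)  # (2, 1) (1, 1) (1, 2)
print(np.allclose(U_k @ S_k @ Vt_k, M))  # True
```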
- Context:
- It can be produced by an SVD Decomposition System (that solves an SVD decomposition task).
- Example(s):
- [math]\displaystyle{ \begin{bmatrix}\frac{1}{\sqrt{5}} & \frac{2}{\sqrt{5}}\\ \frac{2}{\sqrt{5}} & \frac{-1}{\sqrt{5}}\end{bmatrix} \begin{bmatrix}\sqrt{125} & 0\\0 & 0\end{bmatrix} \begin{bmatrix}0.8 & 0.6\\0.6 & -0.8\end{bmatrix} }[/math], for [math]\displaystyle{ \operatorname{SVD}\left(\begin{bmatrix}4 & 3\\8 & 6\end{bmatrix}\right) }[/math].
- [math]\displaystyle{ \begin{bmatrix} 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & -1 \\ 1 & 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 4 & 0 & 0 & 0 & 0 \\ 0 & 3 & 0 & 0 & 0 \\ 0 & 0 & \sqrt{5} & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ \sqrt{0.2} & 0 & 0 & 0 & \sqrt{0.8} \\ 0 & 0 & 0 & 1 & 0 \\ -\sqrt{0.8} & 0 & 0 & 0 & \sqrt{0.2} \end{bmatrix} \ , \text{for} \ \operatorname{SVD}\left(\begin{bmatrix} 1 & 0 & 0 & 0 & 2 \\ 0 & 0 & 3 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 4 & 0 & 0 & 0 \end{bmatrix} \right) }[/math]
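The 2-by-2 example can be sanity-checked by multiplying the three factors back together (a sketch assuming NumPy, not part of the original examples):

```python
import numpy as np

s5 = np.sqrt(5.0)
U = np.array([[1 / s5,  2 / s5],
              [2 / s5, -1 / s5]])
Sigma = np.array([[np.sqrt(125.0), 0.0],
                  [0.0,            0.0]])
Vt = np.array([[0.8,  0.6],
               [0.6, -0.8]])

# The product recovers the original matrix.
M = U @ Sigma @ Vt
print(M)                    # [[4. 3.] [8. 6.]]
print(np.linalg.svd(M)[1])  # singular values: sqrt(125) ~ 11.18, and 0
```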
- Counter-Example(s):
- See: Singular Value Decomposition Task.
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/singular_value_decomposition Retrieved:2015-3-1.
- In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix. It has many useful applications in signal processing and statistics.
Formally, the singular value decomposition of an m × n real or complex matrix M is a factorization of the form M = UΣV∗, where U is an m × m real or complex unitary matrix, Σ is an m × n rectangular diagonal matrix with non-negative real numbers on the diagonal, and V∗ (the conjugate transpose of V, or simply the transpose of V if V is real) is an n × n real or complex unitary matrix. The diagonal entries Σi,i of Σ are known as the singular values of M. The m columns of U and the n columns of V are called the left-singular vectors and right-singular vectors of M, respectively.
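The full factorization described above (U m × m, Σ an m × n rectangular diagonal matrix, V∗ n × n) can be reproduced for the 4 × 5 matrix from the example list; this sketch assumes NumPy, whose `svd` returns V∗ directly as `Vh`.

```python
import numpy as np

# The 4 x 5 real matrix from the example above.
M = np.array([[1, 0, 0, 0, 2],
              [0, 0, 3, 0, 0],
              [0, 0, 0, 0, 0],
              [0, 4, 0, 0, 0]], dtype=float)

# full_matrices=True gives the full form: U is m x m, Vh (= V*) is n x n.
U, s, Vh = np.linalg.svd(M, full_matrices=True)

# Embed the singular values into the m x n rectangular diagonal matrix.
Sigma = np.zeros(M.shape)
np.fill_diagonal(Sigma, s)

print(U.shape, Sigma.shape, Vh.shape)  # (4, 4) (4, 5) (5, 5)
print(np.allclose(U @ Sigma @ Vh, M))  # True
print(s)                               # [4. 3. 2.236... 0.] (sqrt(5) ~ 2.236)
```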
The singular value decomposition and the eigendecomposition are closely related. Namely:
- The left-singular vectors of M are eigenvectors of MM∗.
- The right-singular vectors of M are eigenvectors of M∗M.
- The non-zero singular values of M (found on the diagonal entries of Σ) are the square roots of the non-zero eigenvalues of both M∗M and MM∗.
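The relationship above can be checked numerically: the squared singular values coincide with the eigenvalues of both MM∗ and M∗M. This is a sketch assuming NumPy, reusing the rank-1 example matrix from earlier in the article.

```python
import numpy as np

M = np.array([[4.0, 3.0],
              [8.0, 6.0]])

s = np.linalg.svd(M, compute_uv=False)  # singular values of M
ev_MMt = np.linalg.eigvalsh(M @ M.T)    # eigenvalues of M M* (here M M^T)
ev_MtM = np.linalg.eigvalsh(M.T @ M)    # eigenvalues of M* M

# Squared singular values equal the eigenvalues of both Gram matrices.
print(np.allclose(np.sort(s**2), np.sort(ev_MMt)))  # True
print(np.allclose(np.sort(s**2), np.sort(ev_MtM)))  # True
```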
- Applications that employ the SVD include computing the pseudoinverse, least squares fitting of data, multivariable control, matrix approximation, and determining the rank, range and null space of a matrix.
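Two of the listed applications follow directly from the factors: the Moore–Penrose pseudoinverse is [math]\displaystyle{ V\Sigma^{+}U^T }[/math] (inverting only the nonzero singular values), and the rank is the count of nonzero singular values. A minimal sketch, assuming NumPy:

```python
import numpy as np

M = np.array([[4.0, 3.0],
              [8.0, 6.0]])

U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Invert only the nonzero singular values; zero ones stay zero.
s_inv = np.array([1.0 / x if x > 1e-10 else 0.0 for x in s])
M_pinv = Vt.T @ np.diag(s_inv) @ U.T

# Matches NumPy's built-in pseudoinverse.
print(np.allclose(M_pinv, np.linalg.pinv(M)))  # True

# The rank is the number of nonzero singular values.
print(int(np.sum(s > 1e-10)))                  # 1
```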
2012
- (Golub & Van Loan, 2012) ⇒ Gene H. Golub, and Charles F. Van Loan. (2012). “Matrix Computations (4th Ed.)." Johns Hopkins University Press. ISBN:1421408597
- QUOTE: If [math]\displaystyle{ A }[/math] is a real m-by-n matrix, then there exist orthogonal matrices :[math]\displaystyle{ U = \bigl [ u_1,...,u_m \bigr] \in \mathbb{R}^{m \times m} \ \text{ and } \ V = \bigl [ v_1,...,v_n \bigr] \in \mathbb{R}^{n \times n} }[/math] such that :[math]\displaystyle{ U^TAV = \Sigma = \operatorname{diag}(\sigma_1,...,\sigma_p) \in \mathbb{R}^{m \times n}, \quad p = \operatorname{min}\{m,n\} }[/math] where [math]\displaystyle{ \sigma_1 \ge \sigma_2 \ge \dots \ge \sigma_p \ge 0 }[/math]. ...
... The [math]\displaystyle{ σ_i }[/math] are the singular values of [math]\displaystyle{ A }[/math], the [math]\displaystyle{ u_i }[/math] are the left singular vectors of [math]\displaystyle{ A }[/math], and the [math]\displaystyle{ v_i }[/math] are right singular vectors of [math]\displaystyle{ A }[/math]. Separate visualizations of the SVD are required depending upon whether [math]\displaystyle{ A }[/math] has more rows or columns. Here are the 3-by-2 and 2-by-3 examples: :[math]\displaystyle{ \begin{bmatrix} u_{11} & u_{12} & u_{13} \\ u_{21} & u_{22} & u_{23} \\ u_{31} & u_{32} & u_{33} \end{bmatrix}^T \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\ a_{31} & a_{32} \end{bmatrix} \begin{bmatrix} v_{11} & v_{12} \\ v_{21} & v_{22} \end{bmatrix} = \begin{bmatrix} \sigma_{1} & 0 \\ 0 & \sigma_{2} \\ 0 & 0 \end{bmatrix}. }[/math] :[math]\displaystyle{ \begin{bmatrix} u_{11} & u_{12} \\ u_{21} & u_{22} \end{bmatrix}^T \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \end{bmatrix} \begin{bmatrix} v_{11} & v_{12} & v_{13} \\ v_{21} & v_{22} & v_{23} \\ v_{31} & v_{32} & v_{33} \end{bmatrix} = \begin{bmatrix} \sigma_{1} & 0 & 0 \\ 0 & \sigma_{2} & 0 \end{bmatrix} . }[/math]
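Both shapes in the quoted visualizations, the tall 3-by-2 case and the wide 2-by-3 case, satisfy [math]\displaystyle{ U^TAV = \Sigma }[/math]; the following sketch checks this with NumPy on random matrices (an assumption for illustration, not from the quoted text).

```python
import numpy as np

rng = np.random.default_rng(0)

for m, n in [(3, 2), (2, 3)]:  # tall and wide, as in the quoted examples
    A = rng.standard_normal((m, n))
    U, s, Vt = np.linalg.svd(A, full_matrices=True)

    # Sigma is the m x n rectangular diagonal matrix of singular values.
    Sigma = np.zeros((m, n))
    np.fill_diagonal(Sigma, s)

    # U^T A V recovers Sigma (V is Vt.T in NumPy's convention).
    print(np.allclose(U.T @ A @ Vt.T, Sigma))  # True
```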