Covariance-based Principal Components Analysis (PCA) Algorithm

A Covariance-based Principal Components Analysis (PCA) Algorithm is a PCA algorithm that computes principal components by means of the covariance matrix of the data (rather than its correlation matrix).

----

== References ==

=== 2019 ===
* (Wikipedia, 2019) ⇒ https://en.wikipedia.org/wiki/Principal_component_analysis#Computing_PCA_using_the_covariance_method Retrieved:2019-10-14.
** The following is a detailed description of PCA using the covariance method (see also [http://www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf here]) as opposed to the correlation method. <P> The goal is to transform a given data set '''X''' of dimension ''p'' to an alternative data set '''Y''' of smaller dimension ''L''. Equivalently, we are seeking to find the matrix '''Y''', where '''Y''' is the [[Karhunen–Loève theorem|Karhunen–Loève]] transform (KLT) of matrix '''X''': <math> \mathbf{Y} = \mathbb{KLT} \{ \mathbf{X} \} </math>
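The covariance-method recipe quoted above can be illustrated with a minimal sketch (not part of the cited source): center '''X''', form its covariance matrix, take the top-''L'' eigenvectors, and project. The function name <code>covariance_pca</code> and the use of NumPy are illustrative assumptions, not part of the referenced description.
<pre>
# Illustrative sketch of covariance-based PCA (assumes NumPy; not from the cited source).
import numpy as np

def covariance_pca(X, L):
    """Map an (n, p) data set X to an (n, L) data set Y via the covariance method."""
    X_centered = X - X.mean(axis=0)          # subtract the empirical mean of each dimension
    C = np.cov(X_centered, rowvar=False)     # p x p covariance matrix
    eigvals, eigvecs = np.linalg.eigh(C)     # eigendecomposition (C is symmetric)
    order = np.argsort(eigvals)[::-1]        # largest eigenvalues first
    W = eigvecs[:, order[:L]]                # p x L matrix of the top-L principal directions
    Y = X_centered @ W                       # projected, lower-dimensional data set
    return Y, W, eigvals[order]

# Example: reduce 100 five-dimensional points to L = 2 components.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Y, W, variances = covariance_pca(X, L=2)
print(Y.shape)  # (100, 2)
</pre>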


