PCA

From Ufldl

== Introduction ==

Principal Components Analysis (PCA) is a dimensionality reduction algorithm that can be used to significantly speed up your unsupervised feature learning algorithm.
== Example and Mathematical Background ==
the top (principal) eigenvector of <math>\textstyle \Sigma</math>, and <math>\textstyle u_2</math> is
the second eigenvector. (For a mathematical derivation/formal justification
of this, see the CS229 lecture notes on PCA.<ref>http://cs229.stanford.edu</ref>) You can use standard numerical linear algebra
software to find these (see Implementation Notes).

Concretely, let us compute the eigenvectors of <math>\textstyle \Sigma</math>, and stack
the eigenvectors in columns to form the matrix <math>\textstyle U</math>:
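As a rough sketch of this step, the following Python/NumPy snippet computes <math>\textstyle \Sigma</math> from zero-mean data and stacks its eigenvectors into <math>\textstyle U</math> with the top (principal) eigenvector first. The data here is made up for illustration, and the variable names (`x`, `sigma`, `U`) are just one possible convention:

```python
import numpy as np

# Synthetic example data: each of the m = 100 columns of x is one
# 2-dimensional example, assumed already centered to zero mean.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 100))
x = x - x.mean(axis=1, keepdims=True)

# Covariance matrix: Sigma = (1/m) * x * x^T
m = x.shape[1]
sigma = (x @ x.T) / m

# Sigma is symmetric, so use the symmetric eigensolver.
eigvals, eigvecs = np.linalg.eigh(sigma)

# eigh returns eigenvalues in ascending order; reorder so that
# column 0 of U is u_1 (the principal eigenvector), column 1 is u_2.
order = np.argsort(eigvals)[::-1]
U = eigvecs[:, order]

print(U.shape)  # (2, 2)
```

In practice any standard numerical linear algebra routine (e.g. Matlab's `eig`/`svd`) does the same job; the only detail to watch is the ordering of the eigenvectors, since different libraries sort eigenvalues differently.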

Revision as of 05:07, 2 April 2011
