Exercise:PCA and Whitening
== PCA, PCA whitening and ZCA implementation ==

=== Step 0: Load data ===

[[File:raw_images.png|240px|alt=Raw images|Raw images]]

=== Step 1: Implement PCA ===

==== Step 1a: Implement PCA ====

Implement PCA to obtain xRot, the matrix in which the data is expressed with respect to the eigenbasis of sigma, which is the matrix U.

==== Step 1b: Check covariance ====

The covariance matrix of the data expressed with respect to the basis U should be diagonal, with non-zero entries only along the main diagonal. We will verify this here. Write code to compute the covariance matrix, covar. When visualised as an image, you should see a straight line across the diagonal (non-zero entries) against a blue background (zero entries).

[[File:pca_covar.png|240px]]

=== Step 2: Find number of components to retain ===

Write code to determine k, the smallest number of components needed to retain at least 99% of the variance.

=== Step 3: PCA with dimension reduction ===

Now that you have found k, you can reduce the dimension of the data by discarding the remaining components. This lets you represent the data in k dimensions instead of the original 196, which will save computational time when running learning algorithms on the reduced representation. Following the dimension reduction, invert the PCA transformation to produce the matrix xHat, the dimension-reduced data expressed with respect to the original basis. Visualise xHat and compare it to the raw data: you should observe that the raw and processed images are of comparable quality, with little loss from discarding the principal components that correspond to dimensions of low variation. For comparison, you may also wish to generate a PCA-reduced image that retains only 50% of the variance.
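The exercise is written for MATLAB/Octave; as a language-agnostic illustration, Steps 1a and 1b can be sketched in NumPy as follows. The variable names (sigma, U, xRot, covar) follow the exercise; the synthetic data and the pca helper are assumptions for the demo, not part of the starter code.

```python
import numpy as np

def pca(x):
    """Rotate zero-mean data x (features x examples) into its PCA eigenbasis."""
    m = x.shape[1]
    sigma = x @ x.T / m              # covariance matrix of the data
    U, S, _ = np.linalg.svd(sigma)   # columns of U: eigenvectors; S: eigenvalues
    xRot = U.T @ x                   # data expressed with respect to the basis U
    return xRot, U, S

# Synthetic stand-in for the image patches (4 features, 1000 examples).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 1000))
x -= x.mean(axis=1, keepdims=True)   # PCA assumes zero-mean data

xRot, U, S = pca(x)

# Step 1b: the covariance of xRot should be diagonal.
covar = xRot @ xRot.T / xRot.shape[1]
off_diag = covar - np.diag(np.diag(covar))
print(np.allclose(off_diag, 0.0, atol=1e-8))   # off-diagonal entries vanish
```

Since sigma = U diag(S) Uᵀ, the rotated covariance Uᵀ sigma U is exactly diag(S), which is what the blue-background/diagonal-line visualisation shows.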
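Steps 2 and 3 (choosing k and reducing the dimension) can likewise be sketched in NumPy. Again this is an illustrative sketch on synthetic data, not the exercise's MATLAB starter code; xHat follows the exercise's naming, while xTilde is a hypothetical name for the k-dimensional representation.

```python
import numpy as np

# Synthetic zero-mean data with decaying per-feature variances,
# so that a few components carry most of the variance.
rng = np.random.default_rng(1)
x = rng.normal(size=(8, 500)) * np.arange(8, 0, -1)[:, None]
x -= x.mean(axis=1, keepdims=True)

sigma = x @ x.T / x.shape[1]
U, S, _ = np.linalg.svd(sigma)

# Step 2: smallest k whose leading eigenvalues retain >= 99% of the variance.
variance_ratio = np.cumsum(S) / np.sum(S)
k = int(np.searchsorted(variance_ratio, 0.99) + 1)

# Step 3: keep the top-k components, then invert the rotation
# to express the reduced data in the original basis.
xTilde = U[:, :k].T @ x          # k-dimensional representation
xHat = U[:, :k] @ xTilde         # dimension-reduced data, original basis
print(k, xHat.shape)
```

Because the discarded components have small eigenvalues, the reconstruction xHat stays close to x, which is why the 99%-variance images look nearly identical to the raw ones.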
[[File:pca_images.png|240px|alt=PCA dimension-reduced images (99% variance)|PCA dimension-reduced images (99% variance)]] [[File:raw_images.png|240px|alt=Raw images|Raw images]] [[File:pca_images_50.png|240px|alt=PCA dimension-reduced images (50% variance)|PCA dimension-reduced images (50% variance)]]

=== Step 4: PCA with whitening and regularisation ===

==== Step 4a: Implement PCA with whitening and regularisation ====

Implement PCA with whitening and regularisation to produce the matrix xPCAWhite.

==== Step 4b: Check covariance ====

PCA with whitening results in a covariance matrix that is (approximately) equal to the identity matrix. We will verify this here. Write code to compute the covariance matrix, covar. When visualised as an image, you should see a red line across the diagonal (entries equal to one) against a blue background (entries equal to zero).

[[File:pca_whitened_covar.png|240px]]

=== Step 5: ZCA whitening ===

Now implement ZCA whitening to produce the matrix xZCAWhite. Visualise the data and compare it to the raw data: you should observe that whitening enhances, among other things, the edges in the images.

[[File:zca_whitened_images.png|240px|alt=ZCA whitened images|ZCA whitened images]] [[File:raw_images.png|240px|alt=Raw images|Raw images]]
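Steps 4 and 5 can be sketched in NumPy as below (the exercise itself expects MATLAB/Octave). The names xPCAWhite, xZCAWhite, and epsilon follow the exercise; the synthetic data and the epsilon value are assumptions for the demo.

```python
import numpy as np

# Synthetic zero-mean data with unequal per-feature variances.
rng = np.random.default_rng(2)
x = rng.normal(size=(6, 2000)) * np.arange(1, 7)[:, None]
x -= x.mean(axis=1, keepdims=True)

sigma = x @ x.T / x.shape[1]
U, S, _ = np.linalg.svd(sigma)

# Step 4a: PCA whitening with regularisation. epsilon keeps the rescaling
# from blowing up directions with near-zero eigenvalues (value is illustrative).
epsilon = 1e-5
xPCAWhite = np.diag(1.0 / np.sqrt(S + epsilon)) @ U.T @ x

# Step 5: ZCA whitening rotates the whitened data back to the original basis.
xZCAWhite = U @ xPCAWhite

# Step 4b: the covariance of the whitened data is approximately the identity.
covar = xPCAWhite @ xPCAWhite.T / x.shape[1]
print(np.allclose(covar, np.eye(6), atol=1e-2))
```

The whitened covariance is diag(S / (S + epsilon)), so it approaches the identity whenever the eigenvalues dominate epsilon; ZCA whitening shares this property because it only applies a further rotation by U.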