Exercise:PCA and Whitening
In this exercise set, you will implement PCA, PCA whitening and ZCA whitening. You will build on the MATLAB starter code which we have provided in <tt>[http://ufldl.stanford.edu/wiki/resources/pca_exercise.zip pca_exercise.zip]</tt>. You need only write code at the places indicated by "YOUR CODE HERE" in the files. The only file you need to modify is <tt>pca_gen.m</tt>.

=== Step 0: Prepare data ===

==== Step 0a: Load data ====

The starter code contains code to load some natural images and sample 12x12 patches from them. The raw patches will look something like this:

[[File:raw_images.png|240px|alt=Raw patches|Raw patches]]

These patches are stored as column vectors <math>x^{(i)} \in \mathbb{R}^{144}</math> in the <math>144 \times 10000</math> matrix <math>x</math>.

==== Step 0b: Zero-mean the data ====

You should then zero-mean the data by subtracting the per-example mean: for each example (column), compute its mean and subtract that value from every entry of the example.

=== Step 1: Implement PCA ===

==== Step 1a: Implement PCA ====

In this step, you will implement PCA to obtain <math>x_{rot}</math>, the matrix in which the data is "rotated" into the basis comprising the principal components (i.e. the eigenbasis of the covariance matrix <math>\Sigma</math>). Note that in this part of the exercise, you should ''not'' whiten the data.

==== Step 1b: Check covariance ====

To verify that your implementation of PCA is correct, check the covariance matrix of the rotated data. PCA guarantees that this covariance matrix is diagonal (non-zero entries only along the main diagonal). Implement code to compute the covariance matrix and verify this property. One way to do this is to visualise it using the MATLAB command <tt>imagesc</tt>: the image should show a coloured diagonal line against a blue background. (For this dataset, because of the range of the diagonal entries, the diagonal line may not be clearly visible.)
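The exercise itself is written in MATLAB, but the computation in Steps 0b–1b can be sketched in NumPy. This is only an illustrative sketch: it uses random data as a stand-in for the sampled image patches, which are available only with the exercise files.

```python
import numpy as np

# Stand-in for the 144 x 10000 patch matrix the starter code loads
# (each column is one flattened 12x12 patch); random data here, since
# the natural-image patches ship with the exercise zip.
rng = np.random.default_rng(0)
x = rng.normal(size=(144, 10000))

# Step 0b: zero-mean each example (per-column mean, as the exercise asks).
x = x - x.mean(axis=0, keepdims=True)

# Step 1a: PCA. sigma is the covariance matrix; the columns of U are
# the principal components (its eigenvectors).
sigma = x @ x.T / x.shape[1]
U, S, _ = np.linalg.svd(sigma)
x_rot = U.T @ x  # data rotated into the eigenbasis

# Step 1b: the covariance of x_rot should be (numerically) diagonal,
# with the eigenvalues in S along the diagonal.
covar = x_rot @ x_rot.T / x.shape[1]
off_diag = covar - np.diag(np.diag(covar))
```

In the exercise's MATLAB code the same steps come down to a call to <tt>svd</tt> on the covariance matrix followed by a couple of matrix products.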
[[File:pca_covar.png|360px]]

=== Step 2: Find number of components to retain ===

In the next step, you will find <math>k</math>, the smallest number of principal components needed to retain at least 99% of the variance. In the step after this, you will discard all but the top <math>k</math> principal components, reducing the dimension of the original data to <math>k</math>.

=== Step 3: PCA with dimension reduction ===

Now that you have found <math>k</math>, you can reduce the dimension of the data by discarding the remaining components. This lets you represent the data in <math>k</math> dimensions instead of the original 144, which saves computational time when running learning algorithms on the reduced representation.

To see the effect of dimension reduction, invert the PCA transformation to produce the matrix <math>\hat{x}</math>, the dimension-reduced data expressed in the original basis. Visualise <math>\hat{x}</math> and compare it to the raw data, <math>x</math>. You will observe that there is little loss from discarding the principal components that correspond to dimensions with low variation. For comparison, you may also wish to generate and visualise <math>\hat{x}</math> when only 90% of the variance is retained.
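Steps 2 and 3 can be sketched the same way. Again this is a NumPy sketch on stand-in random data, not the exercise's MATLAB solution; since <tt>svd</tt> returns the eigenvalues in decreasing order, the cumulative sum of <math>S</math> gives the retained-variance fraction directly.

```python
import numpy as np

# Same random stand-in data and PCA setup as in the earlier sketch.
rng = np.random.default_rng(0)
x = rng.normal(size=(144, 10000))
x = x - x.mean(axis=0, keepdims=True)
sigma = x @ x.T / x.shape[1]
U, S, _ = np.linalg.svd(sigma)

# Step 2: smallest k such that the top-k eigenvalues hold at least 99%
# of the total variance (S is sorted in decreasing order).
cum_var = np.cumsum(S) / np.sum(S)
k = int(np.searchsorted(cum_var, 0.99)) + 1

# Step 3: keep only the top-k components, then map back to the original
# basis to obtain the dimension-reduced approximation x_hat.
x_tilde = U[:, :k].T @ x    # k-dimensional representation
x_hat = U[:, :k] @ x_tilde  # back in R^144; variance beyond k is lost
```

Because at least 99% of the variance is retained, the relative reconstruction error <math>\|x - \hat{x}\|_F / \|x\|_F</math> is at most <math>\sqrt{0.01} = 0.1</math>.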
<table>
<tr>
<td>[[File:pca_images.png|240px|alt=PCA dimension-reduced images (99% variance)|PCA dimension-reduced images (99% variance)]]</td>
<td>[[File:raw_images.png|240px|alt=Raw images|Raw images]]</td>
<td>[[File:pca_images_90.png|240px|alt=PCA dimension-reduced images (90% variance)|PCA dimension-reduced images (90% variance)]]</td>
</tr>
<tr>
<td>PCA dimension-reduced images<br /> (99% variance)</td>
<td>Raw images <br /></td>
<td>PCA dimension-reduced images<br /> (90% variance)</td>
</tr>
</table>

=== Step 4: PCA with whitening and regularisation ===

==== Step 4a: Implement PCA with whitening and regularisation ====

Now implement PCA with whitening and regularisation to produce the matrix <math>x_{PCAWhite}</math>, using the following parameter:

 epsilon = 0.1

==== Step 4b: Check covariance ====

As with PCA alone, PCA with whitening results in processed data with a diagonal covariance matrix. Unlike PCA alone, however, whitening additionally ensures that the diagonal entries are equal to 1, i.e. that the covariance matrix is the identity matrix. That would be the case if you were whitening alone, with no regularisation. Here, however, you are whitening with regularisation, in order to avoid amplifying high-frequency noise in the data. As a result, the variances of the high-frequency noise components, which appear as the corresponding diagonal entries of the covariance matrix, will be much less than 1.

To verify that your implementation of PCA whitening, with and without regularisation, is correct, implement code to compute the covariance matrix and check these properties. (How should you check your PCA whitening code without regularisation? Simply set epsilon to 0, or close to 0.) As earlier, you can visualise the covariance matrix with <tt>imagesc</tt>.
When visualised as an image, for PCA whitening without regularisation you should see a red line along the diagonal (corresponding to the entries equal to 1) against a blue background (corresponding to the zero entries); for PCA whitening with regularisation you should see a red line that gradually turns blue along the diagonal (corresponding to the entries of 1 gradually becoming smaller).

<table>
<tr>
<td>[[File:pca_whitened_covar.png|360px|alt=Covariance for PCA whitening with regularisation|Covariance for PCA whitening with regularisation]]</td>
<td>[[File:pca_whitened_unregularised_covar.png|360px|alt=Covariance for PCA whitening without regularisation|Covariance for PCA whitening without regularisation]]</td>
</tr>
<tr>
<td><center>Covariance for PCA whitening with regularisation</center></td>
<td><center>Covariance for PCA whitening without regularisation</center></td>
</tr>
</table>

=== Step 5: ZCA whitening ===

Now implement ZCA whitening to produce the matrix <math>x_{ZCAWhite}</math>. Visualise <math>x_{ZCAWhite}</math> and compare it to the raw data, <math>x</math>. You should observe that whitening results in, among other things, enhanced edges. Try varying epsilon <tt>(1, 0.1, 0.01)</tt> and see what you obtain.

<table>
<tr>
<td>[[File:zca_whitened_images.png|240px|alt=ZCA whitened images|ZCA whitened images]]</td>
<td>[[File:raw_images.png|240px|alt=Raw images|Raw images]]</td>
</tr>
<tr>
<td>ZCA whitened images</td>
<td>Raw images</td>
</tr>
</table>

[[Category:Exercises]]
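Step 5 differs from Step 4 only by a final rotation back into the original basis. A NumPy sketch on the same stand-in random data (the exercise itself uses MATLAB and the sampled image patches):

```python
import numpy as np

# Same random stand-in data and PCA setup as in the earlier sketches.
rng = np.random.default_rng(0)
x = rng.normal(size=(144, 10000))
x = x - x.mean(axis=0, keepdims=True)
sigma = x @ x.T / x.shape[1]
U, S, _ = np.linalg.svd(sigma)

# Step 5: ZCA whitening is PCA whitening followed by a rotation back into
# the original basis, so x_ZCAWhite lives in the same pixel space as x,
# which is what makes the whitened patches visualisable as images.
epsilon = 0.1
x_zca_white = U @ np.diag(1.0 / np.sqrt(S + epsilon)) @ U.T @ x

# Sanity check: the resulting covariance equals U diag(S/(S+epsilon)) U^T,
# which approaches the identity as epsilon approaches 0.
covar = x_zca_white @ x_zca_white.T / x.shape[1]
```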