UFLDL Tutorial

Description: This tutorial will teach you the main ideas of Unsupervised Feature Learning and Deep Learning. By working through it, you will also implement several feature learning/deep learning algorithms, see them work for yourself, and learn how to apply and adapt these ideas to new problems.

This tutorial assumes a basic knowledge of machine learning (specifically, familiarity with the ideas of supervised learning, logistic regression, and gradient descent). If you are not familiar with these ideas, we suggest you go to this Machine Learning course and complete sections II, III, and IV (up to Logistic Regression) first.

'''Sparse Autoencoder'''
* [[Neural Networks]]
* [[Backpropagation Algorithm]]

'''Vectorized implementation'''
* [[Vectorization]]
* [[Logistic Regression Vectorization Example]]

'''Preprocessing: PCA and Whitening'''
* [[PCA]]
* [[Whitening]]

'''Softmax Regression'''
* [[Softmax Regression]]
* [[Exercise:Softmax Regression]]

'''Self-Taught Learning and Unsupervised Feature Learning'''
* [[Self-Taught Learning]]
* [[Exercise:Self-Taught Learning]]

'''Building Deep Networks for Classification'''
* [[Self-Taught Learning to Deep Networks | From Self-Taught Learning to Deep Networks]]
* [[Deep Networks: Overview]]
* [[Stacked Autoencoders]]

'''Linear Decoders with Autoencoders'''
* [[Linear Decoders]]
* [[Exercise:Learning color features with Sparse Autoencoders]]

'''Working with Large Images'''
* [[Feature extraction using convolution]]
* [[Pooling]]
* [[Exercise:Convolution and Pooling]]

----
'''Note''': The sections above this line are stable. The sections below are still under construction, and may change without notice. Feel free to browse around however, and feedback/suggestions are welcome.

'''Miscellaneous'''
* [[MATLAB Modules]]
* [[Style Guide]]
* [[Useful Links]]

'''Miscellaneous Topics'''
* [[Data Preprocessing]]
* [[Deriving gradients using the backpropagation idea]]

'''Advanced Topics''':

'''Sparse Coding'''
* [[Sparse Coding]]
* [[Sparse Coding: Autoencoder Interpretation]]
* [[Exercise:Sparse Coding]]

'''ICA Style Models'''
* [[Independent Component Analysis]]
* [[Exercise:Independent Component Analysis]]

'''Others'''
* [[Convolutional training]]
* [[Restricted Boltzmann Machines]]
* [[Deep Belief Networks]]
* [[Denoising Autoencoders]]
* [[K-means]]
* [[Spatial pyramids / Multiscale]]
* [[Slow Feature Analysis]]
* [[Tiled Convolution Networks]]

----
Material contributed by: Andrew Ng, Jiquan Ngiam, Chuan Yu Foo, Yifan Mai, Caroline Suen

{{Languages|UFLDL教程|中文}}
