# UFLDL Tutorial


*Revision as of 01:23, 21 April 2011*

**Description:** This tutorial will teach you the main ideas of Unsupervised Feature Learning and Deep Learning. By working through it, you will implement several feature learning/deep learning algorithms, see them work for yourself, and learn how to apply and adapt these ideas to new problems.

This tutorial assumes a basic knowledge of machine learning (specifically, familiarity with the ideas of supervised learning, logistic regression, and gradient descent). If you are not familiar with these ideas, we suggest you first go to this Machine Learning course and complete sections II, III, and IV (up to Logistic Regression).

**Sparse Autoencoder**

- Neural Networks
- Backpropagation Algorithm
- Gradient checking and advanced optimization
- Autoencoders and Sparsity
- Visualizing a Trained Autoencoder
- Sparse Autoencoder Notation Summary
- Exercise:Sparse Autoencoder

**Vectorized implementation**

- Vectorization
- Logistic Regression Vectorization Example
- Neural Network Vectorization
- Using the MNIST Dataset
- Exercise:Vectorization

**Preprocessing: PCA and Whitening**

**Softmax Regression**

**Self-Taught Learning and Unsupervised Feature Learning**

**Building Deep Networks for Classification**

- Deep Networks: Overview
- Stacked Autoencoders
- Fine-tuning Stacked AEs
- Exercise: Implement deep networks for digit classification

**Working with Large Images**

**Advanced Topics**:

**ICA Style Models**: