Exercise: Implement deep networks for digit classification

The code you have already implemented will allow you to stack various layers and perform layer-wise training. However, to perform fine-tuning, you will need to implement back-propagation as well. We will see that fine-tuning significantly improves the model's performance.
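For orientation, the following is a minimal sketch of what the fine-tuning backward pass looks like, assuming sigmoid hidden layers stored in a cell array <tt>stack</tt> and a softmax classifier on top. The variable names (<tt>a</tt>, <tt>delta</tt>, <tt>stackgrad</tt>, <tt>softmaxTheta</tt>, <tt>groundTruth</tt>) are illustrative and may not match the starter code's exact interface.

<pre>
% Illustrative sketch of the stacked-autoencoder fine-tuning backward pass.
% "stack{d}.w" and "stack{d}.b" are the weights and biases of layer d,
% "data" is inputSize x m, "groundTruth" is numClasses x m.
sigmoid = @(z) 1 ./ (1 + exp(-z));

% Forward pass: a{1} is the input data, a{d+1} the activation of layer d.
depth = numel(stack);
a = cell(depth + 1, 1);
a{1} = data;
for d = 1:depth
    a{d+1} = sigmoid(stack{d}.w * a{d} + repmat(stack{d}.b, 1, size(a{d}, 2)));
end

% Softmax class probabilities and the error term at the top of the stack.
M = softmaxTheta * a{depth+1};
M = bsxfun(@minus, M, max(M, [], 1));          % subtract max for numerical stability
p = bsxfun(@rdivide, exp(M), sum(exp(M), 1));  % numClasses x m probabilities
delta = cell(depth + 1, 1);
delta{depth+1} = -(softmaxTheta' * (groundTruth - p)) .* a{depth+1} .* (1 - a{depth+1});

% Back-propagate the error through the stack.
for d = depth:-1:2
    delta{d} = (stack{d}.w' * delta{d+1}) .* a{d} .* (1 - a{d});
end

% Accumulate the gradients for each layer's weights and biases.
for d = 1:depth
    stackgrad{d}.w = delta{d+1} * a{d}' / size(data, 2);
    stackgrad{d}.b = mean(delta{d+1}, 2);
end
</pre>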
In the file <tt>stacked_ae_exercise.zip</tt>, we have provided some starter code [http://ufldl.stanford.edu/wiki/resources/sparseae_exercise.zip]. You will need to edit <tt>stackedAECost.m</tt>. You should also read <tt>stackedAETrain.m</tt> and ensure that you understand the steps.
=== Step 0: Initialize constants and parameters ===
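As a point of reference, the constants for this exercise might look like the following for the MNIST digits; the exact values used are set in the starter code and may differ.

<pre>
% Illustrative constants for the MNIST stacked-autoencoder exercise;
% the actual values are defined in the starter code and may differ.
inputSize = 28 * 28;   % number of pixels in each MNIST image
numClasses = 10;       % digits 0-9
hiddenSizeL1 = 200;    % hidden layer size of the first sparse autoencoder
hiddenSizeL2 = 200;    % hidden layer size of the second sparse autoencoder
sparsityParam = 0.1;   % desired average activation of the hidden units
lambda = 3e-3;         % weight decay parameter
beta = 3;              % weight of the sparsity penalty term
</pre>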
