Exercise:Softmax Regression

From Ufldl

=== Step 3: Gradient checking ===
Once you have written the softmax cost function, you should check your gradients numerically. In general, whenever implementing any learning algorithm, you should always check your gradients numerically before proceeding to train the model. The norm of the difference between the numerical gradient and your analytical gradient should be small, on the order of <math>10^{-9}</math>.  
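The exercise's starter code is in MATLAB; as a language-agnostic illustration, here is a minimal pure-Python sketch of the same idea: compute the analytic softmax-regression gradient, approximate it with central differences, and compare the norm of the difference. The tiny dataset, parameter layout, and function names are illustrative assumptions, not the exercise's code.

```python
import math

# Tiny synthetic problem: m examples, n features, k classes.
# theta is a flat list of k*n parameters, laid out class-major.
X = [[1.0, 0.5], [0.2, 1.0], [0.7, 0.3]]  # m x n inputs
y = [0, 1, 0]                             # labels in {0, ..., k-1}
k, n = 2, 2

def softmax_cost_grad(theta):
    """Return (cost, gradient) for softmax regression (no weight decay)."""
    m = len(X)
    grad = [0.0] * (k * n)
    cost = 0.0
    for i, x in enumerate(X):
        # Class scores, stabilized by subtracting the max before exponentiating.
        scores = [sum(theta[c * n + j] * x[j] for j in range(n)) for c in range(k)]
        mx = max(scores)
        exps = [math.exp(s - mx) for s in scores]
        Z = sum(exps)
        probs = [e / Z for e in exps]
        cost -= math.log(probs[y[i]])
        for c in range(k):
            indicator = 1.0 if y[i] == c else 0.0
            for j in range(n):
                # d(cost)/d(theta_{c,j}) = -(1{y_i = c} - P(c | x_i)) * x_{i,j}
                grad[c * n + j] -= (indicator - probs[c]) * x[j]
    return cost / m, [g / m for g in grad]

def numerical_gradient(f, theta, eps=1e-4):
    """Central-difference approximation of the gradient of f at theta."""
    num = []
    for j in range(len(theta)):
        tp = list(theta); tp[j] += eps
        tm = list(theta); tm[j] -= eps
        num.append((f(tp)[0] - f(tm)[0]) / (2 * eps))
    return num

theta0 = [0.1, -0.2, 0.05, 0.3]
_, analytic = softmax_cost_grad(theta0)
numeric = numerical_gradient(softmax_cost_grad, theta0)
diff = math.sqrt(sum((a - b) ** 2 for a, b in zip(analytic, numeric)))
print(diff)  # should be a very small number; a large value signals a bug
```

If the analytic gradient is implemented correctly, the printed norm is tiny (limited only by the finite-difference step); if it is orders of magnitude larger, the analytic gradient has a bug.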
'''Implementation tip (faster gradient checking):''' When debugging, you can speed up gradient checking by reducing the number of parameters your model uses. In this case, we have included code for reducing the size of the input data, using the first 8 pixels of the images instead of the full 28x28 images. This code can be enabled by setting the variable <tt>DEBUG</tt> to true, as described in step 1 of the code.
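The debug shortcut described above can be sketched as follows; this is a hypothetical illustration in Python, and the names (<tt>DEBUG</tt>, <tt>input_size</tt>, <tt>images</tt>) mirror the idea rather than the exercise's actual MATLAB starter code.

```python
# Hypothetical sketch of the DEBUG shortcut: shrink the input so the model
# has far fewer parameters, making numerical gradient checking fast.
DEBUG = True

full_size = 28 * 28
images = [[0.0] * full_size for _ in range(5)]  # placeholder image data

# Keep only the first 8 pixels of each image when debugging.
input_size = 8 if DEBUG else full_size
images = [img[:input_size] for img in images]
print(len(images[0]))  # 8 when DEBUG is true
```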
=== Step 4: Learning parameters ===

Revision as of 04:33, 4 May 2011
