Exercise:Softmax Regression

=== Step 4: Learning parameters ===
Now that you've verified that your gradients are correct, you can train your softmax model using the function <tt>softmaxTrain</tt> in <tt>softmaxTrain.m</tt>. <tt>softmaxTrain</tt> uses the L-BFGS algorithm, implemented in the function <tt>minFunc</tt>, to optimize the parameters. Training the model on the entire MNIST training set of 60000 28x28 images should be rather quick, taking less than 5 minutes for 100 iterations.
Factoring <tt>softmaxTrain</tt> out as a function means that you will be able to easily reuse it to train softmax models on other data sets in the future by invoking the function with different parameters.
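
As a point of reference, a training call might look like the sketch below. The argument order of <tt>softmaxTrain</tt>, the loader functions <tt>loadMNISTImages</tt> and <tt>loadMNISTLabels</tt>, and the value of <tt>lambda</tt> are assumptions based on the exercise starter code rather than something fixed by this section:

<pre>
% Sketch of a training run; names and argument order assumed from the starter code.
inputSize  = 28 * 28;   % each MNIST image is 28x28 pixels
numClasses = 10;        % digits 0-9
lambda     = 1e-4;      % weight decay parameter (illustrative value)

images = loadMNISTImages('train-images-idx3-ubyte');   % inputSize x 60000 matrix
labels = loadMNISTLabels('train-labels-idx1-ubyte');   % 60000 x 1 vector
labels(labels == 0) = 10;   % remap digit 0 to class 10 so labels run 1..numClasses

options.maxIter = 100;      % number of L-BFGS iterations run by minFunc

softmaxModel = softmaxTrain(inputSize, numClasses, lambda, images, labels, options);
</pre>

Reusing <tt>softmaxTrain</tt> on a different data set then amounts to calling it again with the appropriate <tt>inputSize</tt>, <tt>numClasses</tt>, data matrix, and label vector.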
