Softmax Regression

Optimizing Softmax Regression

Unfortunately, there is no closed-form solution to this optimization problem (although it is concave), so we usually use an off-the-shelf optimization method (e.g., L-BFGS, stochastic gradient descent) to find the optimal parameters. Using these optimization methods requires computing the gradient of <math>\ell(\theta)</math> with respect to <math>\theta_k</math>, which can be derived as follows:
<math>
\nabla_{\theta_k} \ell(\theta) = \sum_{i=1}^{m} \left[ x^{(i)} \left( 1\{ y^{(i)} = k \} - P(y^{(i)} = k \mid x^{(i)}; \theta) \right) \right]
</math>
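As a concrete illustration (not part of the original page), a minimal NumPy sketch of this gradient computation might look as follows; the function name, the <math>K \times n</math> layout of <math>\theta</math>, and the 0-indexed labels are assumptions made for the example.

<pre>
import numpy as np

def softmax_log_likelihood_grad(theta, X, y):
    """Gradient of the softmax log-likelihood l(theta) w.r.t. each theta_k.

    theta : (K, n) array, row k is theta_k (hypothetical layout for this sketch)
    X     : (m, n) array, one example x^(i) per row
    y     : (m,)   integer labels in {0, ..., K-1}
    Returns a (K, n) array whose k-th row is grad_{theta_k} l(theta).
    """
    scores = X @ theta.T                              # theta_k^T x^(i), shape (m, K)
    scores -= scores.max(axis=1, keepdims=True)       # shift for numerical stability
    probs = np.exp(scores)
    probs /= probs.sum(axis=1, keepdims=True)         # P(y^(i) = k | x^(i); theta)

    indicator = np.zeros_like(probs)
    indicator[np.arange(X.shape[0]), y] = 1.0         # 1{y^(i) = k}

    # sum_i x^(i) * (1{y^(i) = k} - P(y^(i) = k | x^(i); theta)) for every k
    return (indicator - probs).T @ X
</pre>

When handing the problem to an off-the-shelf minimizer such as L-BFGS, one would pass the negated log-likelihood and the negated gradient, since such routines minimize rather than maximize.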
