Gradient checking and advanced optimization

Hessian matrix, so that it can take more rapid steps towards a local optimum (similar to Newton's method).  A full discussion of these algorithms is beyond the scope of these notes, but one example is the '''L-BFGS''' algorithm.  (Another example is the '''conjugate gradient''' algorithm.)  You will use one of these algorithms in the programming exercise.
The main thing you need to provide to these advanced optimization algorithms is the ability, for any <math>\textstyle \theta</math>, to compute the cost <math>\textstyle J(\theta)</math> and its gradient <math>\textstyle \nabla_\theta J(\theta)</math>.
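To make this interface concrete, here is a minimal sketch (not part of the original notes), assuming Python with SciPy's <code>scipy.optimize.minimize</code> as the L-BFGS implementation. The quadratic cost below is a hypothetical stand-in for a network's cost function; the only thing the optimizer sees is a function that maps <math>\textstyle \theta</math> to <math>\textstyle (J(\theta), \nabla_\theta J(\theta))</math>.

<pre>
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in cost: J(theta) = 0.5 * ||A theta - b||^2.
# The optimizer never looks inside J; it only calls this function
# to get the cost and gradient at whatever theta it is considering.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def cost_and_grad(theta):
    residual = A @ theta - b
    cost = 0.5 * residual @ residual   # J(theta)
    grad = A.T @ residual              # gradient of J with respect to theta
    return cost, grad

theta0 = np.zeros(5)                   # initial parameter vector
# jac=True tells minimize that cost_and_grad returns (cost, gradient).
result = minimize(cost_and_grad, theta0, jac=True, method='L-BFGS-B')
print(result.x, result.fun)            # minimizing theta, and J at that theta
</pre>

Note that no learning rate appears anywhere: L-BFGS chooses its own step sizes internally via a line search, which is part of what makes these methods convenient compared to plain gradient descent.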
