Backpropagation Algorithm

From Ufldl

compute all the activations throughout the network, including the output value
of the hypothesis <math>h_{W,b}(x)</math>. Then, for each node <math>i</math> in layer <math>l</math>, we would like
to compute an "error term" <math>\delta^{(l)}_i</math> that measures how much that node was
"responsible" for any errors in our output.
For an output node, we can directly measure the difference between the
network's activation and the true target value, and use that to define
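The two steps just described — a forward pass to obtain all activations, followed by an output-layer error term measured against the true target — can be sketched in code. This is a minimal illustration, not the tutorial's own implementation: it assumes sigmoid activations <math>f(z) = 1/(1+e^{-z})</math> and a squared-error cost, in which case the output-layer error term is <math>\delta^{(n_l)}_i = -(y_i - a^{(n_l)}_i)\,f'(z^{(n_l)}_i)</math>, and the function names (`forward`, `output_error`) are hypothetical.

```python
import numpy as np

def sigmoid(z):
    """Elementwise logistic activation f(z) = 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    """Forward pass: compute all activations throughout the network.

    Returns the list of pre-activations z^(l) and the list of
    activations a^(l), with a^(1) = x (the input layer).
    """
    a = x
    zs, activations = [], [a]
    for W, b in zip(weights, biases):
        z = W @ a + b          # z^(l+1) = W^(l) a^(l) + b^(l)
        a = sigmoid(z)         # a^(l+1) = f(z^(l+1))
        zs.append(z)
        activations.append(a)
    return zs, activations

def output_error(y, z_out, a_out):
    """Error term delta for the output layer under squared-error cost:
    delta = -(y - a) * f'(z), where f'(z) = a * (1 - a) for the sigmoid.
    (Hypothetical helper; assumes sigmoid activations.)
    """
    return -(y - a_out) * a_out * (1.0 - a_out)
```

For a hidden layer, the same `a * (1 - a)` derivative would be reused, but the `(y - a)` factor is replaced by a weighted sum of the next layer's error terms — which is what the backpropagation recurrence that follows computes.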

Revision as of 01:18, 22 April 2011
