Neural Networks

From Ufldl

to apply to vectors in an element-wise fashion (i.e.,
<math>f([z_1, z_2, z_3]) = [f(z_1), f(z_2), f(z_3)]</math>), then we can write
the equations above more compactly as:

:<math>\begin{align}
z^{(2)} &= W^{(1)} x + b^{(1)} \\
a^{(2)} &= f(z^{(2)}) \\
z^{(3)} &= W^{(2)} a^{(2)} + b^{(2)} \\
h_{W,b}(x) &= a^{(3)} = f(z^{(3)})
\end{align}</math>
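As an illustration, the compact three-layer computation above can be sketched in NumPy. This is a minimal sketch, not part of the original text: the layer sizes, the random parameter initialization, and the choice of the sigmoid for <math>f</math> are all illustrative assumptions.

```python
import numpy as np

def f(z):
    # Sigmoid activation, applied element-wise to the vector z.
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 4, 1              # assumed layer sizes

W1 = rng.standard_normal((n_hid, n_in))   # W^{(1)}
b1 = rng.standard_normal(n_hid)           # b^{(1)}
W2 = rng.standard_normal((n_out, n_hid))  # W^{(2)}
b2 = rng.standard_normal(n_out)           # b^{(2)}

x = rng.standard_normal(n_in)  # input, a^{(1)} = x

z2 = W1 @ x + b1    # z^{(2)} = W^{(1)} x + b^{(1)}
a2 = f(z2)          # a^{(2)} = f(z^{(2)})
z3 = W2 @ a2 + b2   # z^{(3)} = W^{(2)} a^{(2)} + b^{(2)}
h = f(z3)           # h_{W,b}(x) = a^{(3)} = f(z^{(3)})
```

Because the sigmoid squashes its input into <math>(0, 1)</math>, the output <math>h</math> here is a single value in that range.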
We call this step '''forward propagation.'''  More generally, recalling that we also use <math>a^{(1)} = x</math> to denote the values from the input layer, then given layer <math>l</math>'s activations <math>a^{(l)}</math>, we can compute layer <math>l+1</math>'s activations <math>a^{(l+1)}</math> as:

:<math>\begin{align}
z^{(l+1)} &= W^{(l)} a^{(l)} + b^{(l)} \\
a^{(l+1)} &= f(z^{(l+1)})
\end{align}</math>
Each layer <math>\textstyle l</math> is densely connected to layer <math>\textstyle l+1</math>.  In this setting, to compute the output of the network, we can successively compute all the activations in layer <math>\textstyle L_2</math>, then layer <math>\textstyle L_3</math>, and so on, up to layer <math>\textstyle L_{n_l}</math>, using the equations above that describe the forward propagation step.  This is one example of a '''feedforward''' neural network, since the connectivity graph does not have any directed loops or cycles.
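The layer-by-layer computation just described can be sketched as a single loop in NumPy. This is a hedged sketch rather than a definitive implementation: the layer sizes, random parameters, and sigmoid <math>f</math> are assumptions for illustration.

```python
import numpy as np

def f(z):
    # Sigmoid activation, applied element-wise.
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagate(x, Ws, bs):
    """Successively compute the activations of layers L_2, ..., L_{n_l}."""
    a = x  # a^{(1)} = x
    for W, b in zip(Ws, bs):
        z = W @ a + b  # z^{(l+1)} = W^{(l)} a^{(l)} + b^{(l)}
        a = f(z)       # a^{(l+1)} = f(z^{(l+1)})
    return a           # the network's output, h_{W,b}(x)

rng = np.random.default_rng(1)
sizes = [3, 5, 5, 2]  # assumed layer sizes, so n_l = 4 layers
Ws = [rng.standard_normal((sizes[l + 1], sizes[l]))
      for l in range(len(sizes) - 1)]
bs = [rng.standard_normal(sizes[l + 1]) for l in range(len(sizes) - 1)]

out = forward_propagate(rng.standard_normal(sizes[0]), Ws, bs)
```

Because the connectivity graph has no cycles, a single forward pass over the layers suffices; nothing needs to be revisited once its activations are computed.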

Revision as of 00:37, 27 February 2011
