# Neural Networks

to apply to vectors in an element-wise fashion (i.e., $f([z_1, z_2, z_3]) = [f(z_1), f(z_2), f(z_3)]$), then we can write the equations above more compactly as:

:[itex]\begin{align}
z^{(2)} &= W^{(1)} x + b^{(1)} \\
a^{(2)} &= f(z^{(2)}) \\
z^{(3)} &= W^{(2)} a^{(2)} + b^{(2)} \\
h_{W,b}(x) &= a^{(3)} = f(z^{(3)})
\end{align}[/itex]

We call this step '''forward propagation.'''  More generally, recalling that we also use $a^{(1)} = x$ to denote the values from the input layer, then given layer $l$'s activations $a^{(l)}$, we can compute layer $l+1$'s activations $a^{(l+1)}$ as:

:[itex]\begin{align}
z^{(l+1)} &= W^{(l)} a^{(l)} + b^{(l)} \\
a^{(l+1)} &= f(z^{(l+1)})
\end{align}[/itex]

In an $\textstyle n_l$-layer network in which each layer $\textstyle l$ is densely connected to layer $\textstyle l+1$, to compute the output of the network, we can successively compute all the activations in layer $\textstyle L_2$, then layer $\textstyle L_3$, and so on, up to layer $\textstyle L_{n_l}$, using the equations above that describe the forward propagation step.
This is one example of a '''feedforward''' neural network, since the connectivity graph does not have any directed loops or cycles.
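The layer-by-layer recursion above can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the tutorial: it assumes $f$ is the sigmoid function, and the function names and layer sizes are invented for the example.

```python
import numpy as np

def sigmoid(z):
    # f applied element-wise: f([z1, z2, z3]) = [f(z1), f(z2), f(z3)]
    return 1.0 / (1.0 + np.exp(-z))

def forward_propagation(x, weights, biases):
    """Apply the recursion z^(l+1) = W^(l) a^(l) + b^(l),
    a^(l+1) = f(z^(l+1)) for each layer, returning h_{W,b}(x)."""
    a = x  # a^(1) = x: activations of the input layer
    for W, b in zip(weights, biases):
        z = W @ a + b      # matrix-vector product plus bias
        a = sigmoid(z)     # element-wise activation
    return a

# Hypothetical 3-3-1 network (sizes chosen only for illustration)
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((3, 3)), np.zeros(3)
W2, b2 = rng.standard_normal((1, 3)), np.zeros(1)
x = np.array([1.0, 0.5, -0.5])
h = forward_propagation(x, [W1, W2], [b1, b2])  # output a^(3)
```

Because each step is a matrix-vector operation, the same loop covers networks with any number of densely connected layers, and fast linear algebra routines do the heavy lifting.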