Neural Networks

Revision as of 23:20, 26 February 2011 by Ang

By organizing our parameters in matrices and using matrix-vector operations, we can take advantage of fast linear algebra routines to quickly perform calculations in our network.

We have so far focused on one example neural network, but one can also build neural networks with other '''architectures''' (meaning patterns of connectivity between neurons), including ones with multiple hidden layers. The most common choice is an $\textstyle n_l$-layered network where layer $\textstyle 1$ is the input layer, layer $\textstyle n_l$ is the output layer, and each layer $\textstyle l$ is densely connected to layer $\textstyle l+1$. In this setting, to compute the output of the network, we can successively compute all the activations in layer $\textstyle L_2$, then layer $\textstyle L_3$, and so on, up to layer $\textstyle L_{n_l}$, using Equations~(\ref{eqn-forwardprop1}-\ref{eqn-forwardprop2}). This is one example of a '''feedforward''' neural network, since the connectivity graph does not have any directed loops or cycles.

Neural networks can also have multiple output units. For example, here is a network
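The layer-by-layer forward computation described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the tutorial's own code: it assumes a sigmoid activation and hypothetical parameter shapes (one weight matrix $W^{(l)}$ and bias vector $b^{(l)}$ per layer, stored in Python lists).

```python
import numpy as np

def sigmoid(z):
    # Elementwise activation f(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def forward_prop(x, weights, biases):
    # Successively compute the activations of layers 2, 3, ..., n_l:
    #   a^{(l+1)} = f(W^{(l)} a^{(l)} + b^{(l)})
    # using fast matrix-vector products instead of per-unit loops.
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

# Hypothetical 3-layer network: 3 inputs, 4 hidden units, 1 output.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((1, 4))]
biases = [rng.standard_normal(4), rng.standard_normal(1)]
output = forward_prop(np.array([1.0, 0.5, -0.2]), weights, biases)
print(output.shape)  # (1,)
```

Because each layer is densely connected to the next, one matrix-vector product per layer computes all of that layer's activations at once, which is what lets fast linear algebra routines do the heavy lifting.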