Neural Networks

From Ufldl

By organizing our parameters in matrices and using matrix-vector operations, we can take advantage of fast linear algebra routines to quickly perform calculations in our network.
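As a minimal sketch of this vectorized computation, here is one forward-propagation step for a single layer using NumPy. The layer sizes (3 inputs, 4 hidden units) and the random initialization are illustrative assumptions, not values from the text; the sigmoid activation matches the one used in these notes.

```python
import numpy as np

def sigmoid(z):
    # f(z) = 1 / (1 + e^{-z}), applied elementwise
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))  # W^{(1)}: weights mapping layer 1 -> layer 2
b1 = rng.standard_normal(4)       # b^{(1)}: bias terms for layer 2
x = rng.standard_normal(3)        # input activations a^{(1)} = x

z2 = W1 @ x + b1                  # z^{(2)} = W^{(1)} a^{(1)} + b^{(1)}
a2 = sigmoid(z2)                  # a^{(2)} = f(z^{(2)})
print(a2.shape)                   # (4,)
```

A single matrix-vector product replaces the per-unit sums, which is what lets optimized BLAS routines do the work.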
We have so far focused on one example neural network, but one can also build neural networks with other '''architectures''' (meaning patterns of connectivity between neurons), including ones with multiple hidden layers.  The most common choice is an <math>\textstyle n_l</math>-layered network where layer <math>\textstyle 1</math> is the input layer, layer <math>\textstyle n_l</math> is the output layer, and each layer <math>\textstyle l</math> is densely connected to layer <math>\textstyle l+1</math>.  In this setting, to compute the output of the network, we can successively compute all the activations in layer <math>\textstyle L_2</math>, then layer <math>\textstyle L_3</math>, and so on, up to layer <math>\textstyle L_{n_l}</math>, using Equations (\ref{eqn-forwardprop1})-(\ref{eqn-forwardprop2}).  This is one example of a '''feedforward''' neural network, since the connectivity graph does not have any directed loops or cycles.
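The layer-by-layer computation described above can be sketched as a short loop, again in NumPy. The network shape (3 → 5 → 5 → 2, i.e. <math>\textstyle n_l = 4</math>) and the random parameters are hypothetical choices for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(weights, biases, x):
    """Successively compute activations in layers L_2, L_3, ..., L_{n_l}."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)  # a^{(l+1)} = f(W^{(l)} a^{(l)} + b^{(l)})
    return a

# Hypothetical densely connected 4-layer network: 3 -> 5 -> 5 -> 2.
rng = np.random.default_rng(1)
sizes = [3, 5, 5, 2]
weights = [rng.standard_normal((m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal(m) for m in sizes[1:]]
out = forward(weights, biases, rng.standard_normal(3))
print(out.shape)  # (2,)
```

Because each layer feeds only the next one, a single pass through the loop visits every unit once; no cycles means no iteration to a fixed point is needed.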
Neural networks can also have multiple output units.  For example, here is a network

Revision as of 23:20, 26 February 2011
