Neural Networks

<div align=center>
[[Image:Sigmoid_Function.png|400px|top|Sigmoid activation function.]]
[[Image:Tanh_Function.png|400px|top|Tanh activation function.]]
</div>
The <math>\tanh(z)</math> function is a rescaled version of the sigmoid, and its output range is <math>[-1,1]</math> instead of <math>[0,1]</math>.
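
To make the rescaling explicit, the two functions are related by the identity below (added here as an illustration; it is a standard algebraic fact, not part of the surrounding text):

<math>
\tanh(z) = \frac{e^{z} - e^{-z}}{e^{z} + e^{-z}} = \frac{2}{1 + e^{-2z}} - 1,
</math>

that is, <math>\tanh(z) = 2\,\sigma(2z) - 1</math> where <math>\sigma(z) = 1/(1+e^{-z})</math> is the sigmoid; as the sigmoid ranges over <math>(0,1)</math>, <math>\tanh</math> ranges over <math>(-1,1)</math>.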
node).  The middle layer of nodes is called the '''hidden layer''', because its
values are not observed in the training set.  We also say that our example
neural network has 3 '''input units''' (not counting the bias unit), 3
'''hidden units''', and 1 '''output unit'''.
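
As a concrete illustration of this 3-3-1 architecture, here is a minimal forward-pass sketch in Python/NumPy, assuming sigmoid activations in both layers; the parameter names (<code>W1</code>, <code>b1</code>, <code>W2</code>, <code>b2</code>) and the random initialization are illustrative assumptions, not taken from the text.

<pre>
import numpy as np

def sigmoid(z):
    # Logistic activation: maps any real input into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative parameters for a 3-3-1 network (names are hypothetical).
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 3)) * 0.01  # 3 input units -> 3 hidden units
b1 = np.zeros(3)                         # hidden-layer bias terms
W2 = rng.standard_normal((1, 3)) * 0.01  # 3 hidden units -> 1 output unit
b2 = np.zeros(1)                         # output-unit bias term

def forward(x):
    a2 = sigmoid(W1 @ x + b1)   # hidden-layer activations (3 values)
    a3 = sigmoid(W2 @ a2 + b2)  # output-unit activation (1 value)
    return a3

x = np.array([0.5, -1.0, 2.0])  # one 3-dimensional input example
print(forward(x))               # single output in (0, 1)
</pre>

The bias vectors <code>b1</code> and <code>b2</code> play the role of the weights leaving the "+1" bias units, which is why the bias unit is not counted among the 3 input units.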
