Softmax Regression

From Ufldl

== Relationship to Logistic Regression ==
In the special case where <math>k = 2</math>, one can show that softmax regression reduces to logistic regression. This shows that softmax regression is a generalization of logistic regression. Concretely, when <math>k = 2</math>, the softmax regression hypothesis outputs
<math>
\begin{align}
h_\theta(x) &=
\frac{1}{ e^{\theta_1^T x} + e^{\theta_2^T x} }
\begin{bmatrix}
e^{\theta_1^T x} \\
e^{\theta_2^T x}
\end{bmatrix}
\end{align}
</math>
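As a quick numerical check of the two-class hypothesis above, here is a minimal sketch in Python with NumPy (the parameter matrix and input vector are arbitrary illustrative values, not from the text):

```python
import numpy as np

def softmax_hypothesis(theta, x):
    """h_theta(x) for softmax regression: theta has one row of parameters per class."""
    scores = theta @ x
    e = np.exp(scores - scores.max())   # subtract the max score for numerical stability
    return e / e.sum()

rng = np.random.default_rng(0)
theta = rng.normal(size=(2, 3))         # k = 2 classes, 3 input features
x = rng.normal(size=3)

h = softmax_hypothesis(theta, x)
print(h)                                # the two estimated class probabilities
assert np.isclose(h.sum(), 1.0)         # probabilities sum to one
```

Subtracting the maximum score before exponentiating does not change the output (the common factor cancels in the ratio); it only avoids overflow for large scores.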
Taking advantage of the fact that this hypothesis is overparameterized and setting <math>\psi = \theta_1</math>, we can subtract <math>\theta_1</math> from each of the two parameters, giving us

<math>
\begin{align}
h_\theta(x) &=
\frac{1}{ e^{\vec{0}^T x} + e^{(\theta_2 - \theta_1)^T x} }
\begin{bmatrix}
e^{\vec{0}^T x} \\
e^{(\theta_2 - \theta_1)^T x}
\end{bmatrix}
\end{align}
</math>

Thus, writing <math>\theta' = \theta_2 - \theta_1</math>, softmax regression predicts the probability of one of the classes as <math>\frac{1}{ 1 + e^{ (\theta')^T x } }</math>, and the probability of the other class as
<math>1 - \frac{1}{ 1 + e^{ (\theta')^T x } }</math>,
the same as logistic regression.
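The reduction can be verified numerically. The sketch below (Python with NumPy; the random parameters are illustrative, and `sigmoid` is the standard logistic function) checks both steps: that subtracting <math>\theta_1</math> from each parameter vector leaves the hypothesis unchanged, and that the resulting two-class probabilities match the logistic sigmoid of <math>(\theta')^T x</math>:

```python
import numpy as np

def softmax_hypothesis(theta, x):
    """h_theta(x) for softmax regression: theta has one row of parameters per class."""
    scores = theta @ x
    e = np.exp(scores - scores.max())
    return e / e.sum()

def sigmoid(z):
    """Standard logistic function 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
theta = rng.normal(size=(2, 3))              # k = 2 classes, 3 features
x = rng.normal(size=3)

# Overparameterization: subtracting theta_1 from both rows leaves h(x) unchanged.
assert np.allclose(softmax_hypothesis(theta, x),
                   softmax_hypothesis(theta - theta[0], x))

theta_prime = theta[1] - theta[0]            # theta' = theta_2 - theta_1
p = softmax_hypothesis(theta, x)
# One class probability is 1 / (1 + e^{theta'^T x}) = 1 - sigmoid(theta'^T x);
# the other is sigmoid(theta'^T x): exactly the logistic regression hypothesis.
assert np.isclose(p[0], 1.0 - sigmoid(theta_prime @ x))
assert np.isclose(p[1], sigmoid(theta_prime @ x))
```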
 
== Softmax Regression vs. k Binary Classifiers ==

Revision as of 19:09, 10 May 2011
