# Exercise: Softmax Regression

### From Ufldl

### Step 2: Implement softmaxCost

Line 85:

```diff
  % M is the matrix as described in the text
- M = bsxfun(@minus, M, max(M));
+ M = bsxfun(@minus, M, max(M, [], 1));
```

<tt>max(M, [], 1)</tt> yields a row vector in which each element is the maximum of the corresponding column of <tt>M</tt>. <tt>bsxfun</tt> (short for binary singleton expansion function) applies <tt>@minus</tt> along each row of <tt>M</tt>, subtracting each column's maximum from every element of that column. This shift leaves the resulting softmax probabilities unchanged, since the common factor cancels in the normalization, but it keeps the subsequent <tt>exp</tt> from overflowing. The explicit dimension argument in <tt>max(M, [], 1)</tt> forces a column-wise maximum even when <tt>M</tt> has a single row, whereas <tt>max(M)</tt> would then reduce along that row instead.
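For reference, the same shift-and-normalize step can be sketched in NumPy, where broadcasting plays the role of <tt>bsxfun</tt>. The <tt>softmax_columns</tt> helper below is a hypothetical name for illustration, not part of the exercise code:

```python
import numpy as np

def softmax_columns(M):
    """Column-wise softmax of M (numClasses x numCases).

    Mirrors the MATLAB idiom bsxfun(@minus, M, max(M, [], 1)):
    subtracting each column's maximum before exponentiating leaves
    the softmax values unchanged but prevents overflow in np.exp.
    """
    # keepdims=True keeps the maxima as a 1 x numCases row vector,
    # so broadcasting subtracts it from every row of M.
    M = M - M.max(axis=0, keepdims=True)
    E = np.exp(M)
    return E / E.sum(axis=0, keepdims=True)

# Logits this large would overflow exp() without the shift:
M = np.array([[1000.0, 3.0],
              [1001.0, 1.0]])
P = softmax_columns(M)
```

Each column of `P` is a valid probability distribution (non-negative, summing to 1), even though `np.exp(1000)` alone would overflow to infinity.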