Neural Networks: Cost Function

Notation:

\[\left\{\left(x^{(1)},y^{(1)}\right),\left(x^{(2)},y^{(2)}\right),\ldots,\left(x^{(m)},y^{(m)}\right)\right\}\]

L = total number of layers in the network

s_l = number of units in layer l (not counting the bias unit)


Binary classification:
y = 0 or 1
1 output unit

Multi-class classification (K classes):
y ∈ R^K, e.g. [1 0 0 0], [0 1 0 0]
K output units
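For the multi-class case, an integer class label is represented as a one-hot vector in R^K, as in the examples above. A minimal sketch (the helper name `to_one_hot` is my own, not from the course):

```python
import numpy as np

def to_one_hot(label, K):
    """Map an integer class label (0-based) to its one-hot vector in R^K."""
    y = np.zeros(K)
    y[label] = 1.0
    return y

# Class 2 of 4 (0-based index 1):
print(to_one_hot(1, 4))  # -> [0. 1. 0. 0.]
```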


\[\left(h_\Theta(x)\right)_i = \text{the $i$-th output of the network}\]

Cost function for a neural network:

\[J\left(\Theta\right) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{k=1}^{K} y_k^{(i)}\log\left(h_\Theta\left(x^{(i)}\right)\right)_k + \left(1-y_k^{(i)}\right)\log\left(1-\left(h_\Theta\left(x^{(i)}\right)\right)_k\right)\right] + \frac{\lambda}{2m}\sum_{l=1}^{L-1}\sum_{i=1}^{s_l}\sum_{j=1}^{s_{l+1}}\left(\Theta_{ji}^{(l)}\right)^2\]
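The formula above can be sketched directly in NumPy. This is a minimal illustration, not the course's reference code: it assumes the forward pass has already produced the output matrix `h`, that labels `Y` are one-hot encoded, and that in each weight matrix `Theta^(l)` column 0 holds the bias weights (which are excluded from regularization, since the inner regularization sum starts at i = 1):

```python
import numpy as np

def nn_cost(h, Y, Thetas, lam):
    """Regularized cross-entropy cost J(Theta).

    h      : (m, K) matrix of network outputs h_Theta(x^(i))
    Y      : (m, K) one-hot label matrix
    Thetas : list of weight matrices Theta^(l); column 0 is the bias column
    lam    : regularization strength lambda
    """
    m = h.shape[0]
    eps = 1e-12  # numerical guard against log(0)
    # Double sum over examples i and output units k.
    cross_entropy = -np.sum(Y * np.log(h + eps)
                            + (1 - Y) * np.log(1 - h + eps)) / m
    # Regularization: sum of squared weights, skipping the bias column j = 0.
    reg = (lam / (2 * m)) * sum(np.sum(T[:, 1:] ** 2) for T in Thetas)
    return cross_entropy + reg
```

For example, with two training examples, outputs `h = [[0.9, 0.1], [0.2, 0.8]]`, labels `Y = [[1, 0], [0, 1]]`, and no regularization, the cost is −(log 0.9 + log 0.8) ≈ 0.3285.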
