Backpropagation Algorithm (2)

 When neuron j is located in a hidden layer of the network, there is no specified desired response for that neuron. For the derivative φ'(·) to exist, the activation function φ(·) must be differentiable (and therefore continuous). In basic terms, differentiability is the only requirement an activation function has to satisfy. An example of a continuously differentiable nonlinear function commonly used in multilayer perceptrons is the sigmoidal nonlinearity, which takes two common forms: 1) the logistic function, and 2) the hyperbolic tangent function.
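As a minimal sketch of the two sigmoidal forms mentioned above, the snippet below implements the logistic function φ(v) = 1/(1 + exp(−a·v)) and the hyperbolic tangent φ(v) = a·tanh(b·v), together with their closed-form derivatives. The slope parameters (a, b) and their default values are illustrative choices, not something specified in the text:

```python
import math

def logistic(v, a=1.0):
    # Logistic function: phi(v) = 1 / (1 + exp(-a*v)); output in (0, 1).
    return 1.0 / (1.0 + math.exp(-a * v))

def logistic_deriv(v, a=1.0):
    # phi'(v) = a * phi(v) * (1 - phi(v)); continuous for all v,
    # so the derivative required by backpropagation always exists.
    y = logistic(v, a)
    return a * y * (1.0 - y)

def tanh_act(v, a=1.7159, b=2.0 / 3.0):
    # Hyperbolic tangent: phi(v) = a * tanh(b*v); output in (-a, a).
    # a = 1.7159, b = 2/3 is one commonly cited parameter choice.
    return a * math.tanh(b * v)

def tanh_deriv(v, a=1.7159, b=2.0 / 3.0):
    # phi'(v) = (b/a) * (a - phi(v)) * (a + phi(v)).
    y = tanh_act(v, a, b)
    return (b / a) * (a - y) * (a + y)
```

Because both derivatives can be written in terms of the function's own output, a backpropagation implementation can reuse the forward-pass activations when computing the local gradients, which is one practical reason these two nonlinearities are popular.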

posted @ 2018-11-09 10:31  东宫得臣