Two terms in machine learning: "BP neural network" & "softmax loss"
BP neural network
I have heard people talk about the "BP neural network", and I have also heard about the "feedforward neural network". My impression is that they refer to the same thing: the multi-layer, fully-connected neural network, i.e. the multilayer perceptron (MLP). Yet between the two names, one says "forward" and the other says "back", which is confusing. So I searched for "back propagation neural network". Judging from the results, in the English-speaking world "BP neural network" is, at the very least, not a widely used term.
My view: "BP neural network" is a non-standard term. At the very least, every architecture I know of (MLP, CNN, RNN) is trained with back propagation, so "BP" does not single out any particular network structure.
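To make the point concrete that back propagation is a training algorithm rather than an architecture, here is a minimal sketch of my own (not from any of the sources above) of one gradient step for a one-hidden-layer MLP in NumPy; the same backward chain-rule pass works for CNNs and RNNs:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny one-hidden-layer MLP: x -> h = tanh(x @ W1) -> y_hat = h @ W2
x = rng.normal(size=(4, 3))   # batch of 4 inputs, 3 features each
y = rng.normal(size=(4, 2))   # regression targets
W1 = rng.normal(size=(3, 5))
W2 = rng.normal(size=(5, 2))

# Forward pass
h = np.tanh(x @ W1)
y_hat = h @ W2
loss = 0.5 * np.mean((y_hat - y) ** 2)

# Backward pass: apply the chain rule layer by layer, output to input.
# This pass is all that "back propagation" means.
d_y_hat = (y_hat - y) / y.size        # dLoss / dy_hat
d_W2 = h.T @ d_y_hat                  # gradient for the output layer
d_h = d_y_hat @ W2.T                  # propagate the error backwards
d_W1 = x.T @ (d_h * (1 - h ** 2))     # through the tanh non-linearity

# One gradient-descent step
lr = 0.1
W1 -= lr * d_W1
W2 -= lr * d_W2
```

With a small enough learning rate, one such step reduces the loss; frameworks like PyTorch automate exactly this backward pass.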
My view: "feedforward neural network" is a concept that contrasts with "recurrent neural network" and the like. A feedforward neural network is one whose computation from input to output forms a directed acyclic graph.
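As a sketch of the DAG point (my own illustration, with made-up weights): a feedforward network, even one with a skip connection, is evaluated by one pass in topological order, whereas a recurrent network feeds its hidden state back into itself across time steps, which is a cycle when drawn as a graph:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))
W1 = rng.normal(size=(4, 4))
W2 = rng.normal(size=(4, 4))

# Feedforward with a skip connection: still a DAG, so a single
# topologically ordered pass from input to output suffices.
h1 = np.tanh(x @ W1)         # node 1 depends only on the input
h2 = np.tanh(h1 @ W2) + x    # node 2 depends on node 1 AND the input (skip edge)
out_ff = h2

# Recurrent: the same weights are reused and the hidden state feeds
# back into itself, here unrolled over 3 time steps.
h = np.zeros((1, 4))
for t in range(3):
    h = np.tanh(x @ W1 + h @ W2)
out_rnn = h
```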
My view (take this one with a grain of salt; I have not seen anyone else put it this way): the convolutional neural network is a kind of feedforward neural network.
The following statement fits well with my view:
A multilayer perceptron (MLP) is a class of feedforward artificial neural network.
reference:
Multilayer perceptron - Wikipedia
https://en.wikipedia.org/wiki/Multilayer_perceptron
softmax loss
- Technically no because “softmax loss” isn’t really a correct term, and “cross-entropy loss” is.
- However, people use the term “softmax loss” when referring to “cross-entropy loss” and because you know what they mean, there’s no reason to annoyingly correct them. Because they are used interchangeably, the two terms are effectively the same.
- Possibly confusing naming conventions. To be precise, the SVM classifier uses the hinge loss, or also sometimes called the max-margin loss. The Softmax classifier uses the cross-entropy loss. The Softmax classifier gets its name from the softmax function, which is used to squash the raw class scores into normalized positive values that sum to one, so that the cross-entropy loss can be applied. In particular, note that technically it doesn’t make sense to talk about the “softmax loss”, since softmax is just the squashing function, but it is a relatively commonly used shorthand.
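A quick numerical check of the point above, in a sketch of my own (the example scores are the ones used in the CS231n notes): the "softmax loss" is just the cross-entropy loss applied to softmax-normalized scores, and it has an equivalent log-sum-exp form, while the SVM's hinge loss is a different quantity on the same scores:

```python
import numpy as np

def softmax(scores):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(scores - np.max(scores))
    return e / e.sum()

# Raw class scores (logits) and the index of the correct class
scores = np.array([3.2, 5.1, -1.7])
correct = 0

# "Softmax loss" = cross-entropy loss: the negative log of the
# softmax probability assigned to the correct class.
probs = softmax(scores)
softmax_loss = -np.log(probs[correct])

# Equivalent log-sum-exp form: -s_correct + log(sum_j exp(s_j))
lse_loss = -scores[correct] + np.log(np.sum(np.exp(scores)))

# The SVM hinge (max-margin) loss on the same scores, for contrast.
margins = np.maximum(0, scores - scores[correct] + 1.0)
margins[correct] = 0.0
hinge_loss = margins.sum()
```

The two cross-entropy expressions agree to floating-point precision, which is why the terms are used interchangeably even though, strictly speaking, softmax is only the squashing function.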
reference:
Is the softmax loss the same as the cross-entropy loss? - Quora
https://www.quora.com/Is-the-softmax-loss-the-same-as-the-cross-entropy-loss#
CS231n Convolutional Neural Networks for Visual Recognition
http://cs231n.github.io/linear-classify/#softmax