How to choose cross-entropy loss in tensorflow?
https://stackoverflow.com/questions/47034888/how-to-choose-cross-entropy-loss-in-tensorflow
Classification problems, such as logistic regression or multinomial logistic regression, optimize a cross-entropy loss. Normally, the cross-entropy layer follows the softmax layer, which produces a probability distribution. In TensorFlow, there are at least a dozen different cross-entropy loss functions.
Which of them work only for binary classification, and which are suitable for multi-class problems? When should you use one rather than another? Related (more math-oriented) discussion: cross-entropy jungle.

Tags: machine-learning, tensorflow, neural-network, logistic-regression, cross-entropy
1 Answer
Preliminary facts
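One preliminary fact that the sections below rely on: the tf.nn losses whose names end in _with_logits take raw, unscaled scores (logits) rather than probabilities, and apply the sigmoid or softmax internally, which is more numerically stable than doing it yourself. A minimal sketch of that equivalence, assuming TensorFlow 2.x with eager execution:

```python
import tensorflow as tf

# Raw, unscaled scores ("logits") for 2 examples and 3 independent outputs.
logits = tf.constant([[ 2.0, -1.0, 0.5],
                      [-0.3,  1.2, 0.0]])
labels = tf.constant([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 0.0]])

# The *_with_logits functions apply the sigmoid themselves.
loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)

# Equivalent, but less numerically stable, manual computation.
p = tf.sigmoid(logits)
manual = -(labels * tf.math.log(p) + (1.0 - labels) * tf.math.log(1.0 - p))

print(loss.numpy())
print(manual.numpy())  # same values up to floating-point error
```

The built-in version avoids log(0) problems when the sigmoid saturates, which is why these functions should be fed logits rather than probabilities.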
Sigmoid functions family
As stated earlier, the sigmoid losses are for binary decisions: each output is treated as an independent yes/no question, so these functions also handle multi-label classification, where one example may belong to several classes at once. Labels are floats of the same shape as the logits and may be hard 0/1 targets or soft probabilities. A multi-label sketch follows below.
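A minimal multi-label sketch, assuming TensorFlow 2.x; the batch size and tag layout are made up for illustration:

```python
import tensorflow as tf

# 3 examples, 4 independent tags; an example may carry any number of tags.
logits = tf.constant([[ 3.0, -2.0,  0.1, -1.0],
                      [-1.5,  2.2, -0.3,  0.4],
                      [ 0.0,  0.0,  4.0, -3.0]])
labels = tf.constant([[1.0, 0.0, 1.0, 0.0],   # tags 0 and 2
                      [0.0, 1.0, 0.0, 1.0],   # tags 1 and 3
                      [0.0, 0.0, 1.0, 0.0]])  # tag 2 only

# One independent binary cross-entropy per (example, tag) pair.
per_tag = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)
loss = tf.reduce_mean(per_tag)  # scalar training loss
print(per_tag.shape, float(loss))
```

When positive labels are rare, tf.nn.weighted_cross_entropy_with_logits is the same loss with an extra pos_weight argument that up-weights errors on positive targets.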
Softmax functions family
These loss functions should be used for multinomial, mutually exclusive classification, i.e. picking exactly one out of N classes. The labels must be one-hot encoded or can contain soft class probabilities: a particular example can belong to class A with 50% probability and class B with 50% probability. Strictly speaking this doesn't mean it belongs to both classes, but the probabilities can be interpreted that way. Both label styles are sketched below.
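A minimal sketch of hard one-hot targets versus soft targets, assuming TensorFlow 2.x (where tf.nn.softmax_cross_entropy_with_logits has the _v2 behaviour of the older API):

```python
import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 0.5, 3.0]])

# Hard (one-hot) targets: each example belongs to exactly one class.
one_hot = tf.constant([[1.0, 0.0, 0.0],
                       [0.0, 0.0, 1.0]])
hard_loss = tf.nn.softmax_cross_entropy_with_logits(labels=one_hot, logits=logits)

# Soft targets: 50% class A, 50% class B for the first example.
soft = tf.constant([[0.5, 0.5, 0.0],
                    [0.0, 0.0, 1.0]])
soft_loss = tf.nn.softmax_cross_entropy_with_logits(labels=soft, logits=logits)

print(hard_loss.numpy(), soft_loss.numpy())  # one loss value per example
```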
Sparse functions family

Like ordinary softmax above, these loss functions are for multinomial, mutually exclusive classification; the difference is in the label encoding: classes are specified as integer indices rather than one-hot vectors, so soft class probabilities cannot be expressed. See the sketch after the next section.

Sampled softmax functions family

These functions provide another alternative for dealing with a huge number of classes. Instead of computing and comparing an exact probability distribution, they compute a loss estimate from a random sample of classes. The arguments weights and biases specify a separate fully-connected layer that is used to compute the logits from the given inputs. As in the sparse family, labels are integer class indices, here with shape [batch_size, num_true]. Sampled functions are only suitable for training; at test time it's recommended to use a standard softmax loss to get the exact distribution. Another alternative loss in this family is tf.nn.nce_loss, which performs noise-contrastive estimation.
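A minimal sketch, assuming TensorFlow 2.x; it shows that integer (sparse) labels give the same per-example loss as one-hot labels, and the argument shapes for tf.nn.sampled_softmax_loss (the class count, embedding size, and label values are made-up illustration numbers):

```python
import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1],
                      [0.5, 0.5, 3.0]])

# Sparse labels: plain integer class indices, one per example.
sparse_labels = tf.constant([0, 2])
sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)

# Same result as the one-hot softmax loss on identical targets.
one_hot_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.one_hot(sparse_labels, depth=3), logits=logits)
print(sparse_loss.numpy(), one_hot_loss.numpy())

# Sampled softmax (training only): `weights` and `biases` act as the final
# fully-connected layer mapping `inputs` (last hidden activations) to the
# full set of classes; only `num_sampled` negatives are scored per step.
num_classes, dim, batch = 10000, 64, 2             # illustration values
weights = tf.random.normal([num_classes, dim])
biases = tf.zeros([num_classes])
inputs = tf.random.normal([batch, dim])            # hidden-layer outputs
labels = tf.constant([[42], [7]], dtype=tf.int64)  # shape [batch, num_true]

sampled_loss = tf.nn.sampled_softmax_loss(
    weights=weights, biases=biases, labels=labels, inputs=inputs,
    num_sampled=50, num_classes=num_classes)
print(sampled_loss.shape)  # one estimated loss per example
```

At inference time you would typically drop the sampled loss and score all classes exactly, e.g. tf.nn.softmax(tf.matmul(inputs, weights, transpose_b=True) + biases).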
Further reading:
python - Keras: binary_crossentropy & categorical_crossentropy confusion - Stack Overflow
https://stackoverflow.com/questions/47877083/keras-binary-crossentropy-categorical-crossentropy-confusion