The difference between softmax_cross_entropy_with_logits and sparse_softmax_cross_entropy_with_logits in TensorFlow
http://stackoverflow.com/questions/37312421/tensorflow-whats-the-difference-between-sparse-softmax-cross-entropy-with-logi
Having two different functions is a convenience, as they produce the same result.
The difference is simple:
- For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and the dtype int32 or int64. Each label is an int in the range [0, num_classes-1].
- For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64.
Labels used in softmax_cross_entropy_with_logits are the one-hot version of labels used in sparse_softmax_cross_entropy_with_logits.
Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label to have loss 0 on this label.
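A minimal sketch of the equivalence described above, assuming TensorFlow 2.x with eager execution (the logits values are made up for illustration; under TF 1.x the same tf.nn calls would be evaluated inside a session):

import tensorflow as tf

# Logits for a batch of 3 examples over 4 classes (arbitrary example values).
logits = tf.constant([[2.0, 1.0, 0.1, 0.5],
                      [0.2, 3.0, 0.5, 1.0],
                      [1.5, 0.3, 2.2, 0.9]])

# Integer class indices, shape [batch_size], for the sparse variant.
sparse_labels = tf.constant([0, 1, 2])

# One-hot labels, shape [batch_size, num_classes], for the dense variant.
onehot_labels = tf.one_hot(sparse_labels, depth=4)

sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)
dense_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=onehot_labels, logits=logits)

# Both variants produce the same per-example losses.
print(sparse_loss.numpy())
print(dense_loss.numpy())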