Abstract: Knowledge distillation (知识蒸馏) is just Knowledge Distillation. Knowledge Distillation: https://arxiv.org/abs/1503.02531 Do Deep Nets Really Need to be Deep?: https://arxiv.org/abs/1312. Read more
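As a pointer to what the linked Knowledge Distillation paper proposes, here is a minimal sketch of its soft-target loss: the student is trained to match the teacher's temperature-softened output distribution, with the KL term scaled by T². The function names and toy logits below are illustrative, not from the paper's code.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as suggested in Hinton et al. (2015) so gradient
    # magnitudes stay comparable when T changes.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl

# The loss is zero when the student already matches the teacher,
# and positive otherwise.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([5.0, 1.0, 0.0], [0.0, 1.0, 5.0]))
```

In the full method this term is combined with the ordinary cross-entropy on hard labels; the sketch shows only the distillation component.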
posted @ 2021-05-04 09:52 臭咸鱼 Views (234) Comments (0) Likes (0)