Abstract: Knowledge distillation becomes a de facto standard to improve the performance of small neural networks. Most of the previo
posted @ 2023-10-18 19:38 信海