Learning rate adjustment
- Use different learning rates in different layers: a base rate for the pretrained feature extractor and a 10x rate for the newly initialized classifier head (a name-agnostic variant is sketched after the link below):
```python
import torch as t

# Two parameter groups: the pretrained feature extractor keeps the
# base learning rate, while the classifier head trains at 10x.
optimizer = t.optim.Adam(
    [
        {'params': model.model.features.parameters()},
        {'params': model.model.classifier.parameters(), 'lr': opt.lr * 10},
    ],
    lr=opt.lr,
)
```
https://blog.csdn.net/wangbin12122224/article/details/79949824
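A minimal sketch of the same idea that filters parameters by identity instead of relying on `features`/`classifier` attributes. `model` and `opt.lr` are carried over from the snippet above; the id-based filtering is an assumption, not from the source:

```python
import torch as t

# Collect the classifier parameters, then treat everything else as the base group.
# Filtering by id() works because each parameter tensor is a unique object.
classifier_params = list(model.model.classifier.parameters())
classifier_ids = {id(p) for p in classifier_params}
base_params = [p for p in model.model.parameters() if id(p) not in classifier_ids]

optimizer = t.optim.Adam(
    [
        {'params': base_params},                           # base learning rate
        {'params': classifier_params, 'lr': opt.lr * 10},  # 10x for the head
    ],
    lr=opt.lr,
)
```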
- Dynamically decay the learning rate during training, here halving both groups every 10 epochs so their 10x ratio is preserved (a scheduler-based variant follows the links below):
```python
if epoch % 10 == 0 and epoch > 0:
    for kk, param_group in enumerate(optimizer.param_groups):
        if kk == 0:    # feature-extractor group: decay the base rate
            param_group['lr'] = opt.lr * (0.5 ** (epoch // 10))
        elif kk == 1:  # classifier group: decay the 10x rate
            param_group['lr'] = opt.lr * 10 * (0.5 ** (epoch // 10))
        print(epoch, param_group['lr'])
```
https://blog.csdn.net/guihuo2889/article/details/84767291
https://blog.csdn.net/qq_34914551/article/details/87699317
http://www.spytensor.com/index.php/archives/32/
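The same halve-every-10-epochs schedule can also be expressed with PyTorch's built-in `StepLR` scheduler instead of mutating `param_groups` by hand; each group decays from its own initial rate, so the 10x ratio between the two groups is preserved. A minimal sketch, where `num_epochs` and `train_one_epoch` are hypothetical placeholders:

```python
from torch.optim import lr_scheduler

# StepLR multiplies each group's initial lr by gamma ** (epoch // step_size),
# which reproduces the manual halve-every-10-epochs schedule above.
scheduler = lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

for epoch in range(num_epochs):
    train_one_epoch()  # hypothetical placeholder for one pass over the data
    scheduler.step()
    print(epoch, [g['lr'] for g in optimizer.param_groups])
```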