Setting up learning rate decay

Method 1: use PyTorch's built-in learning-rate schedulers, as covered in https://www.pytorchtutorial.com/pytorch-learning-rate-decay/
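
A minimal sketch of that approach (the model and training loop here are placeholders, not from the original): torch.optim.lr_scheduler.MultiStepLR reproduces the same multiplicative schedule as the manual code in Method 2 below.

import torch
from torch import nn, optim

model = nn.Linear(10, 2)                       # placeholder model for illustration
optimizer = optim.SGD(model.parameters(), lr=0.1)

# Multiply the LR by gamma=0.1 at epochs 30 and 80, matching Method 2's lr_step.
scheduler = optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)

for epoch in range(100):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()     # dummy forward/backward pass
    loss.backward()
    optimizer.step()
    scheduler.step()                           # advance the schedule once per epoch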

Method 2: decay the learning rate manually inside the training loop:

import os

# lr_step = [30, 80]  (milestone epochs at which the LR is dropped)

if epoch in opt.lr_step:
    # Save a checkpoint at each milestone before dropping the LR.
    save_model(os.path.join(opt.save_dir, 'model_{}.pth'.format(epoch)),
               epoch, model, optimizer)
    # Multiply the base LR by 0.1 for each milestone reached so far.
    lr = opt.lr * (0.1 ** (opt.lr_step.index(epoch) + 1))  # decay the learning rate
    print('Drop LR to', lr)
    # Write the new LR into every parameter group of the optimizer.
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
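
Compared with Method 1, this manual version decays the LR in place by overwriting optimizer.param_groups, which keeps the schedule explicit in the training loop and conveniently couples checkpoint saving to each milestone; MultiStepLR achieves the same 0.1-per-milestone schedule with less code.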