Adjusting the Learning Rate in PyTorch

1. Adjust the learning rate every fixed number of epochs
def adjust_learning_rate(optimizer, epoch):
    """Sets the learning rate to the initial LR decayed by 10 every 30 epochs."""
    lr = args.lr * (0.1 ** (epoch // 30))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

for epoch in range(epochs):
    train(...)
    validate(...)
    adjust_learning_rate(optimizer, epoch)
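
For reference, here is a self-contained sketch of the same step decay (the toy model, the initial LR of 0.1, and the base_lr parameter standing in for args.lr are assumptions for illustration); it prints the LR at each 30-epoch boundary so the decay can be verified:

import torch
import torch.nn as nn

model = nn.Linear(10, 2)                                 # toy model, assumed
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # initial LR assumed

def adjust_learning_rate(optimizer, epoch, base_lr=0.1):
    """Decay the LR by a factor of 10 every 30 epochs (base_lr stands in for args.lr)."""
    lr = base_lr * (0.1 ** (epoch // 30))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

for epoch in range(90):
    # train(...) and validate(...) would run here
    adjust_learning_rate(optimizer, epoch)
    if epoch % 30 == 0:
        # prints 0.1 at epoch 0, 0.01 at epoch 30, 0.001 at epoch 60
        print(epoch, optimizer.param_groups[0]['lr'])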
  

Alternatively, use the built-in scheduler from torch.optim:

from torch.optim import lr_scheduler

adjust_lr_scheduler = lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)


for epoch in range(epochs):
    train(...)
    validate(...)
    adjust_lr_scheduler.step()
 

Note that the learning-rate update should be performed after training and after evaluating on the validation set for that epoch.
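
A quick way to verify what a scheduler is doing is to read the LR back from the optimizer's param_groups after each step. A minimal sketch of this check (the toy model and initial LR of 0.1 are assumptions for illustration; since PyTorch 1.1, scheduler.step() should be called after optimizer.step()):

import torch
import torch.nn as nn
from torch.optim import lr_scheduler

model = nn.Linear(10, 2)                                 # toy model, assumed
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # initial LR assumed
scheduler = lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(90):
    # train(...) and validate(...) would run here
    optimizer.step()   # in PyTorch 1.1+, optimizer.step() comes before scheduler.step()
    scheduler.step()
    if epoch % 30 == 29:
        # LR after epochs 29/59/89: roughly 0.01, 0.001, 0.0001 (up to float rounding)
        print(epoch, optimizer.param_groups[0]['lr'])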

2. Adjust the learning rate according to a custom policy

scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lambda epoch: (1.0 - epoch / epochs) if epoch <= epochs else 0.0,
    last_epoch=-1)

for epoch in range(epochs):
    train(...)
    validate(...)
    scheduler.step()
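
The lambda passed to LambdaLR returns a multiplicative factor applied to the initial LR, so the schedule above decays the LR linearly from its starting value down to 0 over the course of training. A small sketch that prints the resulting LRs (the toy model, epochs=5, and initial LR of 0.1 are assumptions for illustration):

import torch
import torch.nn as nn

epochs = 5
model = nn.Linear(10, 2)                                 # toy model, assumed
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # initial LR assumed
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lambda epoch: (1.0 - epoch / epochs) if epoch <= epochs else 0.0,
    last_epoch=-1)

for epoch in range(epochs):
    optimizer.step()
    scheduler.step()
    # LR decays linearly from 0.1: roughly 0.08, 0.06, 0.04, 0.02, then 0.0
    print(epoch, optimizer.param_groups[0]['lr'])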

References:

https://www.jianshu.com/p/a20d5a7ed6f3

https://pytorch.org/docs/master/optim.html#how-to-adjust-learning-rate

https://blog.csdn.net/qq_34914551/article/details/87699317

https://blog.csdn.net/shanglianlm/article/details/85143614

posted @ 2019-08-22 18:39  嶙羽