[tensorflow] Decaying the learning rate

Learning rate decay (decaying the learning rate)

Operation and description:

tf.train.exponential_decay(learning_rate, global_step,
                           decay_steps, decay_rate, staircase=False, name=None)
Applies exponential decay to the learning rate.

▷ tf.train.exponential_decay

# The function returns the following result:
decayed_learning_rate = learning_rate *
        decay_rate ^ (global_step / decay_steps)
# Example: decay the learning rate with base 0.96 every 100000 steps
global_step = tf.Variable(0, trainable=False)
starter_learning_rate = 0.1
learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step,
                                           100000, 0.96, staircase=True)
# Passing global_step to minimize() will increment it at each step.
learning_step = (
    tf.train.GradientDescentOptimizer(learning_rate)
    .minimize(...my loss..., global_step=global_step)
)
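To make the formula above concrete, here is a minimal pure-Python sketch of the same decay rule (the helper name `decayed_learning_rate` is just illustrative, not part of the TensorFlow API). It also shows what `staircase=True` changes: the exponent is truncated to an integer, so the rate drops in discrete jumps instead of shrinking at every step.

```python
import math

def decayed_learning_rate(learning_rate, global_step, decay_steps,
                          decay_rate, staircase=False):
    """Pure-Python version of the exponential-decay formula above."""
    exponent = global_step / decay_steps
    if staircase:
        # staircase=True truncates the exponent, so the rate only
        # changes once every `decay_steps` steps.
        exponent = math.floor(exponent)
    return learning_rate * decay_rate ** exponent

# Continuous decay: the rate shrinks smoothly at every step.
print(decayed_learning_rate(0.1, 50000, 100000, 0.96))         # ≈ 0.098
# Staircase decay: unchanged until a full 100000 steps have passed.
print(decayed_learning_rate(0.1, 50000, 100000, 0.96, True))   # ≈ 0.1
print(decayed_learning_rate(0.1, 100000, 100000, 0.96, True))  # ≈ 0.096
```

With the blog's settings (base 0.96, 100000 decay steps), the staircase variant keeps the starter rate of 0.1 for the first 100000 steps, then multiplies it by 0.96 at each boundary.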


posted @ 2020-10-17 00:10 Sunshine168