II. Deep Learning Basics - Lecture 3: Regularization and Optimization


https://cs231n.github.io/optimization-1/

Regularization
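The notes list regularization as a topic without code. As a minimal sketch of the common L2 (weight decay) penalty discussed in CS231n, assuming a hypothetical regularization strength `reg`:

```python
import numpy as np

def l2_regularized_loss(data_loss, W, reg=1e-3):
    # Total loss = data loss + 0.5 * reg * ||W||^2.
    # The 0.5 factor makes the gradient of the penalty simply reg * W.
    return data_loss + 0.5 * reg * np.sum(W * W)
```

The penalty discourages large weights, trading a slightly higher training loss for better generalization.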

Stochastic Gradient Descent
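A vanilla SGD update, as a one-line sketch (the learning rate `lr` is a hypothetical value, and `grad` is assumed to be the mini-batch gradient):

```python
import numpy as np

def sgd_step(w, grad, lr=1e-2):
    # Vanilla SGD: step opposite the mini-batch gradient.
    return w - lr * grad
```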

Momentum, AdaGrad, Adam
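The three optimizers named above can be sketched as per-step update rules. This is a rough illustration, not the course's reference implementation; hyperparameter values (`lr`, `rho`, `beta1`, `beta2`) are placeholders:

```python
import numpy as np

def momentum_step(w, grad, v, lr=1e-2, rho=0.9):
    # Momentum: accumulate a velocity that smooths the gradient direction.
    v = rho * v + grad
    return w - lr * v, v

def adagrad_step(w, grad, cache, lr=1e-2, eps=1e-8):
    # AdaGrad: per-parameter step size shrinks with accumulated squared gradients.
    cache = cache + grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: momentum on the first moment plus a second-moment scale,
    # with bias correction for early steps (t starts at 1).
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```

Momentum helps escape shallow local structure, AdaGrad adapts the step size per parameter, and Adam combines both ideas.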

Learning rate schedules
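One common schedule covered in this lecture is step decay. A minimal sketch, assuming hypothetical values for the decay factor `drop` and the interval `every`:

```python
def step_decay(lr0, epoch, drop=0.5, every=10):
    # Step decay: multiply the initial rate lr0 by `drop`
    # once every `every` epochs.
    return lr0 * (drop ** (epoch // every))
```

Cosine annealing and linear warmup are other schedules in the same spirit: the rate is a fixed function of the epoch or iteration count.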

posted @ 2022-07-29 14:26  JaxonYe