Post Category - Neural Networks

Abstract: He K, Zhang X, Ren S, et al. Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification[C]. international conference…
posted @ 2020-04-23 13:19 馒头and花卷 Views (407) Comments (0) Recommended (0)
Abstract: Glorot X, Bengio Y. Understanding the difficulty of training deep feedforward neural networks[C]. international conference on artificial intelligence…
posted @ 2020-04-23 10:51 馒头and花卷 Views (420) Comments (0) Recommended (0)
Abstract: Nakkiran P, Kaplun G, Bansal Y, et al. Deep Double Descent: Where Bigger Models and More Data Hurt[J]. arXiv: Learning, 2019. @article{nakkiran2019dee…
posted @ 2020-04-09 22:31 馒头and花卷 Views (478) Comments (0) Recommended (0)
Abstract: Maclaurin D, Duvenaud D, Adams R P, et al. Gradient-based Hyperparameter Optimization through Reversible Learning[J]. arXiv: Machine Learning, 2015. @…
posted @ 2020-04-08 16:56 馒头and花卷 Views (376) Comments (0) Recommended (0)
Abstract: [TOC] Augmentation. Original image; Flipping; Grayscale; Equalize (histogram equalization); Posterize (reduce color-channel bit depth); Cropping; Rotation; Translation; Noise injection; Hue; Brightness; Saturat…
posted @ 2020-03-23 21:18 馒头and花卷 Views (487) Comments (0) Recommended (0)
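The augmentation list in the entry above maps closely onto standard torchvision transforms. Below is a minimal sketch, assuming torchvision ≥ 0.9 and PIL RGB inputs of at least 32×32; the crop size, probabilities, and noise scale are illustrative choices, not the post's exact settings:

```python
import torch
from torchvision import transforms

# One illustrative pipeline covering the operations named in the post.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),                          # Flipping
    transforms.RandomGrayscale(p=0.1),                               # Grayscale
    transforms.RandomEqualize(p=0.5),                                # histogram equalization
    transforms.RandomPosterize(bits=4, p=0.5),                       # fewer bits per color channel
    transforms.RandomCrop(32, padding=4),                            # Cropping
    transforms.RandomRotation(degrees=15),                           # Rotation
    transforms.RandomAffine(degrees=0, translate=(0.1, 0.1)),        # Translation
    transforms.ColorJitter(brightness=0.2, saturation=0.2, hue=0.1), # Hue / Brightness / Saturation
    transforms.ToTensor(),
    transforms.Lambda(lambda x: (x + 0.05 * torch.randn_like(x)).clamp(0, 1)),  # Noise injection
])
```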
Abstract: A Deep Neural Network's Loss Surface Contains Every Low-dimensional Pattern. Overview: the author gives a theoretical analysis of the loss surface, proving that a sufficiently large neural network can approximate every low-dimensional loss pattern. Related work: loss l…
posted @ 2020-02-25 22:14 馒头and花卷 Views (248) Comments (0) Recommended (0)
Abstract: Skorokhodov I, Burtsev M. Loss Landscape Sightseeing with Multi-Point Optimization[J]. arXiv: Learning, 2019. @article{skorokhodov2019loss, title={Lo…
posted @ 2020-02-25 22:09 馒头and花卷 Views (292) Comments (0) Recommended (0)
Abstract: Lu Z, Pu H, Wang F, et al. The expressive power of neural networks: a view from the width[C]. neural information processing systems, 2017: 6232-6240.…
posted @ 2020-02-24 13:51 馒头and花卷 Views (392) Comments (0) Recommended (0)
Abstract: Accelerating Deep Learning by Focusing on the Biggest Losers. Overview: the idea is simple: during training, each sample produces a loss $\mathcal{L}(f(x_i), y_i)$, and the total loss is $\sum_i \mathcal{L}(f(x_i), y_i)$…
posted @ 2020-02-16 21:24 馒头and花卷 Views (372) Comments (0) Recommended (0)
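The excerpt above describes the core of the "biggest losers" idea: rank samples by their per-sample loss and spend the update only on the hardest ones. A minimal PyTorch-style sketch of that selection rule, where `model`, `optimizer`, `x`, `y`, and `k` are placeholder names rather than the paper's code:

```python
import torch
import torch.nn.functional as F

def selective_backprop_step(model, optimizer, x, y, k):
    """One update that backpropagates only the k largest per-sample losses."""
    logits = model(x)
    # Per-sample losses L(f(x_i), y_i), no reduction over the batch.
    losses = F.cross_entropy(logits, y, reduction="none")
    # Keep only the k "biggest losers" for the gradient step.
    topk_losses, _ = torch.topk(losses, k)
    optimizer.zero_grad()
    topk_losses.mean().backward()
    optimizer.step()
    return losses.detach()
```

The actual method also saves compute by re-running the forward/backward pass on only the selected samples; the sketch above just illustrates the selection rule.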
Abstract: Katharopoulos A, Fleuret F. Not All Samples Are Created Equal: Deep Learning with Importance Sampling[J]. arXiv: Learning, 2018. @article{katharopoulo…
posted @ 2020-02-16 20:42 馒头and花卷 Views (711) Comments (2) Recommended (0)
Abstract: Szegedy C, Liu W, Jia Y, et al. Going deeper with convolutions[C]. computer vision and pattern recognition, 2015: 1-9. @article{szegedy2015going, titl…
posted @ 2020-01-13 21:06 馒头and花卷 Views (427) Comments (0) Recommended (0)
Abstract: He K, Zhang X, Ren S, et al. Deep Residual Learning for Image Recognition[C]. computer vision and pattern recognition, 2016: 770-778. @article{he2016d…
posted @ 2020-01-11 23:35 馒头and花卷 Views (275) Comments (0) Recommended (0)
Abstract: Safran I, Shamir O. Spurious Local Minima are Common in Two-Layer ReLU Neural Networks[J]. arXiv: Learning, 2017. @article{safran2017spurious, title={…
posted @ 2019-12-13 22:49 馒头and花卷 Views (273) Comments (0) Recommended (0)
Abstract: [TOC] Cho Y, Saul L K. Kernel Methods for Deep Learning[C]. neural information processing systems, 2009: 342-350. @article{cho2009kernel, title={Ker…
posted @ 2019-12-12 22:34 馒头and花卷 Views (456) Comments (0) Recommended (0)
Abstract: Nguyen Q C, Hein M. Optimization Landscape and Expressivity of Deep CNNs[J]. arXiv: Learning, 2017. BibTeX: @article{nguyen2017optimization, title={Opt…
posted @ 2019-11-22 12:50 馒头and花卷 Views (500) Comments (0) Recommended (0)
Abstract: [TOC] A simple implementation of the BP algorithm. First create a parent class Fun, which mainly defines: forward, the forward method, to be overridden by subclasses; Momentum, a gradient-descent method; step, the method that updates the parameters; zero_grad, which clears the recorded gradients; load, which loads parameters. Linear, the fully-connected layer: note that for the fully-connected layer $… (see the sketch after this entry)
posted @ 2019-10-27 15:37 馒头and花卷 Views (551) Comments (0) Recommended (0)
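For concreteness, here is a minimal NumPy sketch of the class layout the excerpt describes. The method names (forward, step, zero_grad, load) follow the post, but the bodies are illustrative assumptions: plain gradient descent is shown instead of the post's Momentum, and Linear simply caches its input for the backward pass.

```python
import numpy as np

class Fun:
    """Parent class: subclasses define forward and record gradients in self.grads."""
    def __init__(self):
        self.grads = {}                      # gradients keyed by parameter name

    def forward(self, x):                    # to be overridden by subclasses
        raise NotImplementedError

    def step(self, lr=0.1):                  # update each parameter by gradient descent
        for name, grad in self.grads.items():
            setattr(self, name, getattr(self, name) - lr * grad)

    def zero_grad(self):                     # clear the recorded gradients
        self.grads = {k: np.zeros_like(v) for k, v in self.grads.items()}

    def load(self, params):                  # load parameters from a dict
        for name, value in params.items():
            setattr(self, name, value)

class Linear(Fun):
    """Fully-connected layer: y = x W + b."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.W = 0.01 * np.random.randn(in_features, out_features)
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x                           # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.grads["W"] = self.x.T @ grad_out
        self.grads["b"] = grad_out.sum(axis=0)
        return grad_out @ self.W.T           # gradient w.r.t. the input
```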
Abstract: Arora S, Cohen N, Hazan E, et al. On the Optimization of Deep Networks: Implicit Acceleration by Overparameterization[J]. arXiv: Learning, 2018. Intro: I really like…
posted @ 2019-10-18 22:10 馒头and花卷 Views (588) Comments (0) Recommended (0)
Abstract: Draxler F, Veschgini K, Salmhofer M, et al. Essentially No Barriers in Neural Network Energy Landscape[C]. international conference on machine learnin…
posted @ 2019-09-14 21:24 馒头and花卷 Views (361) Comments (0) Recommended (0)
Abstract: Laurent T, Von Brecht J H. Deep linear networks with arbitrary loss: All local minima are global[C]. international conference on machine learning, 201…
posted @ 2019-09-11 21:47 馒头and花卷 Views (318) Comments (0) Recommended (0)
Abstract: [TOC] [Deep Inside Convolutional Networks: Visualising Image Classification Models and Saliency Maps](https://arxiv.org/abs/1312.6034). Problem: this paper is similar to ZFNet…
posted @ 2019-08-14 22:40 馒头and花卷 Views (677) Comments (0) Recommended (0)
