ResNet / Batch Normalization

[Deep Learning] A Deep Dive into Batch Normalization

https://www.zhihu.com/topic/20084849/hot
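As a quick companion to the article above, here is a minimal NumPy sketch of what BN computes in the forward pass at training time (the function name and epsilon value are illustrative, not taken from any particular implementation):

```python
import numpy as np

def batch_norm_train(x, gamma, beta, eps=1e-5):
    """Training-time Batch Normalization over a (batch, features) array.

    gamma and beta are the learned per-feature scale and shift.
    """
    mu = x.mean(axis=0)                    # per-feature mean over the mini-batch
    var = x.var(axis=0)                    # per-feature variance over the mini-batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize to zero mean, unit variance
    return gamma * x_hat + beta            # learned scale and shift restore capacity
```

At inference time the batch statistics are replaced by running averages accumulated during training, so the layer becomes a fixed affine transform.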

What does F(x) in a ResNet (residual network) actually look like?

https://www.zhihu.com/question/53224378
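To make the question concrete: in the basic two-layer block of the original paper, F(x) is just a small conv/BN/ReLU stack and the block outputs ReLU(F(x) + x). A minimal Keras sketch, assuming the shortcut dimensions already match (stride 1, equal channel count):

```python
from tensorflow.keras import layers

def basic_residual_block(x, filters):
    """Post-activation ResNet block: returns ReLU(F(x) + x)."""
    shortcut = x                                                   # identity branch
    y = layers.Conv2D(filters, 3, padding="same", use_bias=False)(x)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv2D(filters, 3, padding="same", use_bias=False)(y)
    y = layers.BatchNormalization()(y)                             # y is now F(x)
    y = layers.Add()([y, shortcut])                                # F(x) + x
    return layers.ReLU()(y)
```

So the weight layers only have to learn the residual F(x) = H(x) - x; if the desired mapping H(x) is close to the identity, F(x) can simply be pushed toward zero.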

How should we understand Microsoft's deep residual learning?

https://www.zhihu.com/question/38499534?sort=created

 

Skip Connections Eliminate Singularities

https://arxiv.org/pdf/1701.09175.pdf

Residual Networks Explained in Detail

https://zhuanlan.zhihu.com/p/42706477

 

The Principles of Residual Networks

https://blog.csdn.net/qq_30478885/article/details/78828734

https://www.coursera.org/lecture/convolutional-neural-networks/why-resnets-work-XAKNO

https://arxiv.org/pdf/1512.03385.pdf

https://www.quora.com/How-does-deep-residual-learning-work

https://arxiv.org/pdf/1603.05027.pdf

The residual block in ResNet is said to perform an identity mapping. What is the point of such an identity mapping, and what role does it play in the network?

https://www.zhihu.com/question/293243905
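A short way to see why the identity shortcut matters, following the analysis in the identity-mappings paper (arXiv:1603.05027, linked above): with pure identity shortcuts, stacking units from layer $l$ to layer $L$ gives

$$x_L = x_l + \sum_{i=l}^{L-1} F(x_i), \qquad \frac{\partial \mathcal{L}}{\partial x_l} = \frac{\partial \mathcal{L}}{\partial x_L}\Bigl(1 + \frac{\partial}{\partial x_l}\sum_{i=l}^{L-1} F(x_i)\Bigr).$$

The constant 1 means the gradient from any deeper layer reaches layer $l$ directly through the shortcut, so it does not vanish even when the residual terms are small; this is one concrete role the identity mapping plays.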

https://zhuanlan.zhihu.com/p/28124810 

https://arxiv.org/pdf/1502.03167v3.pdf

https://zhuanlan.zhihu.com/p/31645196


https://arxiv.org/pdf/1506.01497v3.pdf

https://arxiv.org/pdf/1504.08083.pdf

https://arxiv.org/pdf/1311.2524v5.pdf

https://arxiv.org/pdf/1702.08591.pdf

https://arxiv.org/pdf/1611.05431.pdf

https://arxiv.org/pdf/1607.07032.pdf

 

https://arxiv.org/abs/1605.06431

Understanding Residual Networks

Covariance

https://www.zhihu.com/question/20852004
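For reference, the standard definition of the quantity named above: the covariance of two random variables X and Y is

$$\operatorname{Cov}(X, Y) = \mathbb{E}\bigl[(X - \mathbb{E}[X])(Y - \mathbb{E}[Y])\bigr] = \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y],$$

and the per-feature variance that Batch Normalization divides by is the special case Cov(X, X).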

The ResNet Architecture Is Invertible! University of Toronto and Collaborators Propose a High-Performing Invertible Residual Network

A Brief Overview of ResNet and Its Many Variants

 

A Code Walkthrough of the TensorFlow Implementation of ResNet V2

Identity Mapping in ResNet

1. Work through the residual network material in Andrew Ng's "Deep Learning" course on Coursera.
2. Read the original paper, Deep Residual Learning for Image Recognition; if it is hard to follow, online translations can help, and the author of this post has written one for reference.
3. Register a GitHub account so you can browse and download the open-source ResNet code. Registration link.
4. Clone the source code to your local machine. The source link is here.
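Related to the "Identity Mapping in ResNet" and ResNet V2 notes above: the V2 ("pre-activation") unit moves BN and ReLU in front of each convolution and leaves the shortcut path completely clean. A minimal Keras sketch, again assuming the shortcut dimensions match (an illustration of the unit structure, not the referenced TensorFlow source):

```python
from tensorflow.keras import layers

def preact_residual_block(x, filters):
    """ResNet V2 unit: BN-ReLU-Conv, BN-ReLU-Conv, then add the untouched input."""
    shortcut = x                                                   # pure identity path
    y = layers.BatchNormalization()(x)
    y = layers.ReLU()(y)
    y = layers.Conv2D(filters, 3, padding="same", use_bias=False)(y)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv2D(filters, 3, padding="same", use_bias=False)(y)
    return layers.Add()([y, shortcut])                             # x_{l+1} = x_l + F(x_l)
```

Note there is no ReLU after the addition, so signals can flow from any unit to any other unit through identity mappings alone.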

 

【1】He K, Zhang X, Ren S, et al. Deep residual learning for image recognition[C]//Proceedings of the IEEE conference on computer vision and pattern recognition. 2016: 770-778.

【2】Srivastava R K, Greff K, Schmidhuber J. Highway networks[J]. arXiv preprint arXiv:1505.00387, 2015.

【3】Orhan A E, Pitkow X. Skip connections eliminate singularities[J]. arXiv preprint arXiv:1701.09175, 2017.

【4】Shang W, Sohn K, Almeida D, et al. Understanding and improving convolutional neural networks via concatenated rectified linear units[C]//Proceedings of the 33rd International Conference on Machine Learning (ICML). 2016: 2217-2225.

【5】Greff K, Srivastava R K, Schmidhuber J. Highway and residual networks learn unrolled iterative estimation[C]//International Conference on Learning Representations (ICLR). 2017.

【6】Jastrzebski S, Arpit D, Ballas N, et al. Residual connections encourage iterative inference[J]. arXiv preprint arXiv:1710.04773, 2017.

 
