Summary:
[TOC] Paper: https://arxiv.org/abs/1801.04381 1. Innovations There are two innovations: 1. Inverted residuals. The usual residual block: 1. first passes the input through a 1x1 conv layer to "squeeze" down the number of channels in the feature map … Read more
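As the teaser above notes, an inverted residual block expands channels first instead of squeezing them. Below is a minimal Keras-style sketch of such a block, for illustration only: the use of tensorflow.keras, the function name `inverted_residual`, the expansion factor, and the channel counts are my assumptions, not details taken from the post.

```python
# Hypothetical sketch of an inverted residual block (MobileNetV2-style),
# contrasted with the "squeeze-first" bottleneck described in the teaser.
# Channel counts and the expansion factor are illustrative assumptions.
from tensorflow.keras import layers

def inverted_residual(x, in_ch, expansion=6, stride=1):
    # 1x1 conv *expands* channels first (opposite of the usual bottleneck)
    h = layers.Conv2D(in_ch * expansion, 1, use_bias=False)(x)
    h = layers.BatchNormalization()(h)
    h = layers.ReLU(max_value=6)(h)
    # 3x3 depthwise conv filters each expanded channel separately
    h = layers.DepthwiseConv2D(3, strides=stride, padding='same', use_bias=False)(h)
    h = layers.BatchNormalization()(h)
    h = layers.ReLU(max_value=6)(h)
    # 1x1 linear projection back down to in_ch (no activation here)
    h = layers.Conv2D(in_ch, 1, use_bias=False)(h)
    h = layers.BatchNormalization()(h)
    # Residual connection only when spatial size and channel count match
    if stride == 1:
        h = layers.Add()([x, h])
    return h

# Illustrative usage with an assumed input shape
inputs = layers.Input(shape=(56, 56, 32))
outputs = inverted_residual(inputs, in_ch=32)
```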
Summary:
[TOC] Reference blog: https://cuijiahua.com/blog/2018/02/dl_6.html 1. Depthwise Separable Convolution A standard convolution both filters and combines inputs into a … Read more
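The teaser contrasts a standard convolution, which filters and combines inputs in a single step, with a depthwise separable convolution that splits those two steps. A minimal Keras-style sketch of that split follows; tensorflow.keras, the function names, and the kernel/channel sizes are assumptions for illustration, not taken from the post.

```python
# Hypothetical sketch contrasting a standard convolution with a depthwise
# separable one (MobileNet-style); channel counts are illustrative assumptions.
from tensorflow.keras import layers

def standard_conv(x, out_ch):
    # Filters and combines all input channels in one 3x3 convolution.
    return layers.Conv2D(out_ch, 3, padding='same')(x)

def depthwise_separable_conv(x, out_ch):
    # Step 1: depthwise 3x3 conv filters each input channel independently.
    h = layers.DepthwiseConv2D(3, padding='same')(x)
    # Step 2: pointwise 1x1 conv combines the filtered channels into out_ch outputs.
    return layers.Conv2D(out_ch, 1)(h)
```

Splitting filtering (depthwise) from combining (pointwise 1x1) reduces the multiply-adds of a 3x3 standard convolution to roughly 1/8 to 1/9 for typical channel counts.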
Summary:
[TOC] In the structure diagram below, every Inception module contains a 1×1 convolution layer with no activation, used to expand the channel count and thus compensate for the dimensionality reduction caused by the Inception module. The results of Inception ResNet V1 are on par with Inception v3; Inception ResNet V1 and Inception … Read more
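The teaser describes a 1×1 convolution with no activation that expands the channel count back before the residual addition. Below is a simplified, hypothetical Keras sketch of that placement; the single branch shown and all widths are assumptions, not the actual Inception-ResNet block.

```python
# Hypothetical sketch of the channel-expanding 1x1 convolution (no activation)
# that closes an Inception-style branch before the residual addition.
# The branch itself is simplified; widths are illustrative assumptions.
from tensorflow.keras import layers

def inception_resnet_block(x, in_ch):
    # Simplified Inception-style branch that reduces the channel count.
    branch = layers.Conv2D(in_ch // 4, 1, activation='relu', padding='same')(x)
    branch = layers.Conv2D(in_ch // 4, 3, activation='relu', padding='same')(branch)
    # 1x1 convolution with *no* activation expands channels back to in_ch,
    # compensating for the reduction so the residual add is valid.
    branch = layers.Conv2D(in_ch, 1, activation=None, padding='same')(branch)
    return layers.Add()([x, branch])
```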
Summary:
[TOC] Paper: Xception: Deep Learning with Depthwise Separable Convolutions Paper link: https://arxiv.org/abs/1610.02357 Code: 1. Keras: https://github.com/yanc … Read more
Summary:
[TOC] Paper: Aggregated Residual Transformations for Deep Neural Networks Paper link: https://arxiv.org/abs/1611.05431 Code: 1. Keras: https://github.com/titu19 … Read more