【593】ResNet Residual Networks
Reference: Keras implementation of residual connections for the Inception model and convolutional layers
参考:Keras Implementation of ResNet-50 (Residual Networks) Architecture from Scratch
As shown in the figure below, F(x) denotes one or more convolutional layers, and the block output is F(x) + x: the shortcut x is added to the main path element-wise, not concatenated.
```python
from keras.layers import Conv2D, Input, Add, Activation

# input tensor for a 3-channel 256x256 image
x = Input(shape=(256, 256, 3))
# 3x3 conv with 3 output channels (same as input channels)
y = Conv2D(3, (3, 3), padding='same')(x)
# this returns x + y (skip connection)
z = Add()([x, y])
z = Activation('relu')(z)
```
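To make the element-wise/concatenation distinction concrete, here is a plain-Python sketch (no Keras required) on two toy feature vectors; the lists and values are illustrative only:

```python
# Two toy "feature maps" flattened to 1-D lists.
x = [1, 2, 3]
y = [10, 20, 30]

# Element-wise addition, as Add() performs: shapes must match and are preserved.
added = [a + b for a, b in zip(x, y)]
print(added)        # [11, 22, 33] -- length 3, same as each input

# Concatenation, as Concatenate() would perform: lengths stack up instead.
concatenated = x + y
print(concatenated) # [1, 2, 3, 10, 20, 30] -- length 6
```

This is why the skip connection requires the shapes of F(x) and x to agree exactly.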
1. Identity Block
The identity block is the standard block used in ResNets and corresponds to the case where the input activation has the same dimension as the output activation.
```python
from keras.layers import Conv2D, BatchNormalization, Activation, Add
from keras.initializers import glorot_uniform

def identity_block(X, f, filters, stage, block):
    # defining name basis for the layers in this block
    conv_name_base = 'res' + str(stage) + block + '_branch'
    bn_name_base = 'bn' + str(stage) + block + '_branch'

    # retrieve the number of filters for each conv layer
    F1, F2, F3 = filters

    # save the input value for the shortcut
    X_shortcut = X

    # first component: 1x1 conv
    X = Conv2D(filters=F1, kernel_size=(1, 1), strides=(1, 1), padding='valid',
               name=conv_name_base + '2a',
               kernel_initializer=glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis=3, name=bn_name_base + '2a')(X)
    X = Activation('relu')(X)

    # second component: fxf conv with 'same' padding
    X = Conv2D(filters=F2, kernel_size=(f, f), strides=(1, 1), padding='same',
               name=conv_name_base + '2b',
               kernel_initializer=glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis=3, name=bn_name_base + '2b')(X)
    X = Activation('relu')(X)

    # third component: 1x1 conv, no activation before the addition
    X = Conv2D(filters=F3, kernel_size=(1, 1), strides=(1, 1), padding='valid',
               name=conv_name_base + '2c',
               kernel_initializer=glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis=3, name=bn_name_base + '2c')(X)

    # skip connection: add the shortcut to the main path, then apply ReLU
    X = Add()([X, X_shortcut])
    X = Activation('relu')(X)

    return X
```
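A quick way to check that the identity block really preserves the spatial shape is the standard convolution output formula, out = floor((n + 2p - k) / s) + 1. The helper below is a small arithmetic sketch (not part of the block's code); the input size 56 is illustrative, matching a typical ResNet-50 stage-2 activation:

```python
import math

def conv_out(n, k, s, padding):
    """Spatial output size of a conv layer: n=input size, k=kernel, s=stride."""
    p = (k - 1) // 2 if padding == 'same' else 0  # 'same' keeps size at stride 1
    return math.floor((n + 2 * p - k) / s) + 1

n = 56                          # e.g. a 56x56 activation
f = 3
n = conv_out(n, 1, 1, 'valid')  # 1x1 conv, stride 1 -> 56
n = conv_out(n, f, 1, 'same')   # fxf conv, 'same' padding -> 56
n = conv_out(n, 1, 1, 'valid')  # 1x1 conv, stride 1 -> 56
print(n)                        # 56: spatial size unchanged, so Add() is valid
```

All three convs leave the spatial size (and, via F3, the channel count when F3 equals the input channels) unchanged, which is exactly the condition the identity block needs.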
2. Convolutional Block
We can use this type of block when the input and output dimensions don't match. The difference from the identity block is that there is a Conv2D layer in the shortcut path.
```python
from keras.layers import Conv2D, BatchNormalization, Activation, Add
from keras.initializers import glorot_uniform

def convolutional_block(X, f, filters, stage, block, s=2):
    # defining name basis for the layers in this block
    conv_name_base = 'res' + str(stage) + block + '_branch'
    bn_name_base = 'bn' + str(stage) + block + '_branch'

    # retrieve the number of filters for each conv layer
    F1, F2, F3 = filters

    # save the input value for the shortcut
    X_shortcut = X

    # first component: 1x1 conv with stride s (downsamples)
    X = Conv2D(filters=F1, kernel_size=(1, 1), strides=(s, s), padding='valid',
               name=conv_name_base + '2a',
               kernel_initializer=glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis=3, name=bn_name_base + '2a')(X)
    X = Activation('relu')(X)

    # second component: fxf conv with 'same' padding
    X = Conv2D(filters=F2, kernel_size=(f, f), strides=(1, 1), padding='same',
               name=conv_name_base + '2b',
               kernel_initializer=glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis=3, name=bn_name_base + '2b')(X)
    X = Activation('relu')(X)

    # third component: 1x1 conv, no activation before the addition
    X = Conv2D(filters=F3, kernel_size=(1, 1), strides=(1, 1), padding='valid',
               name=conv_name_base + '2c',
               kernel_initializer=glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis=3, name=bn_name_base + '2c')(X)

    # shortcut path: 1x1 conv with stride s so its shape matches the main path
    X_shortcut = Conv2D(filters=F3, kernel_size=(1, 1), strides=(s, s), padding='valid',
                        name=conv_name_base + '1',
                        kernel_initializer=glorot_uniform(seed=0))(X_shortcut)
    X_shortcut = BatchNormalization(axis=3, name=bn_name_base + '1')(X_shortcut)

    # skip connection: add the projected shortcut, then apply ReLU
    X = Add()([X, X_shortcut])
    X = Activation('relu')(X)

    return X
```
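Why the shortcut needs its own Conv2D here can be checked with the same output-size formula: the main path downsamples by stride s in its first conv, so the raw input no longer matches it. A 1x1, stride-s conv on the shortcut restores agreement. A small arithmetic sketch (the helper and input size 56 are illustrative):

```python
import math

def conv_out(n, k, s, padding):
    """Spatial output size of a conv layer: n=input size, k=kernel, s=stride."""
    p = (k - 1) // 2 if padding == 'same' else 0
    return math.floor((n + 2 * p - k) / s) + 1

n, f, s = 56, 3, 2

# Main path: 1x1 stride-s conv, then fxf 'same' stride-1, then 1x1 stride-1.
main = conv_out(n, 1, s, 'valid')     # 56 -> 28
main = conv_out(main, f, 1, 'same')   # 28 -> 28
main = conv_out(main, 1, 1, 'valid')  # 28 -> 28

# Shortcut path: a single 1x1 stride-s conv.
shortcut = conv_out(n, 1, s, 'valid') # 56 -> 28

print(main, shortcut)  # 28 28 -- shapes agree, so Add() works
```

Both paths also end with F3 channels, so the element-wise addition is well defined in every dimension.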
Categories: Python Study, AI Related