Deep Residual Network with Adaptively Parametric ReLU Activation (Tuning Log 7)

Continued from the previous post:

Deep Residual Network with Adaptively Parametric ReLU Activation (Tuning Log 6)

https://www.cnblogs.com/shisuzanian/p/12907482.html

At the risk of overfitting, this post increases the number of convolutional kernels to 32, 64, and 128, and continues testing the Adaptively Parametric ReLU (APReLU) activation function on the CIFAR-10 image dataset.

The basic principle of the APReLU activation function is shown in the figure below:
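In case the figure does not render, the transformation that the `aprelu` function in the code implements can be summarized as follows: for each channel $c$,

```latex
y_c = \max(x_c, 0) + \alpha_c \, \min(x_c, 0), \qquad \alpha_c \in (0, 1),
```

where the per-channel coefficient $\alpha_c$ is produced by a small subnetwork: global average pooling of the positive and negative parts of the input, two dense layers with batch normalization, and a sigmoid.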

The Keras code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.10.0 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis,
IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Scale to [0, 1] and subtract the training-set mean
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test - np.mean(x_train)
x_train = x_train - np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# Convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply by 0.1 every 300 epochs
def scheduler(epoch):
    if epoch % 300 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # Get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # Get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # Get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # Get a feature map with only negative features
    neg_input = Minimum()([inputs, zeros_input])
    # Define a small network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1, 1, channels))(scales)
    # Apply a parametric ReLU
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):

    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]

    for i in range(nb_blocks):

        identity = residual

        if not downsample:
            downsample_strides = 1

        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides),
                          padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)

        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal',
                          kernel_regularizer=l2(1e-4))(residual)

        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1, 1), strides=(2, 2))(identity)

        # Zero-padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels

        residual = keras.layers.add([residual, identity])

    return residual


# Define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(32, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = residual_block(net, 1, 128, downsample=True)
net = residual_block(net, 8, 128, downsample=False)
net = BatchNormalization()(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# Data augmentation
datagen = ImageDataGenerator(
    # Randomly rotate images in the range 0 to 30 degrees
    rotation_range=30,
    # Randomly flip images horizontally
    horizontal_flip=True,
    # Randomly shift images horizontally
    width_shift_range=0.125,
    # Randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# Fit the model on the batches generated by datagen.flow()
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=1000,
                    verbose=1, callbacks=[reduce_lr], workers=4)

# Get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])
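As a standalone illustration (not part of the training script above), the final combination step of `aprelu` can be sketched in NumPy. The scale value 0.5 here is a hypothetical stand-in for what the sigmoid subnetwork would produce at run time:

```python
import numpy as np

def aprelu_combine(x, scales):
    """Combination step of APReLU: positive features pass through
    unchanged, negative features are multiplied by per-channel
    scales in (0, 1). `scales` has shape (1, 1, channels) so it
    broadcasts over the spatial dimensions."""
    pos = np.maximum(x, 0.0)   # ReLU branch
    neg = np.minimum(x, 0.0)   # negative branch
    return pos + scales * neg

# Hypothetical example: a 2x2 feature map with one channel, scale 0.5
x = np.array([[[1.0], [-2.0]],
              [[3.0], [-4.0]]])
scales = np.array([[[0.5]]])
y = aprelu_combine(x, scales)
# Positive entries are unchanged; negative entries are halved
```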

First, the experimental results copied from the Spyder console:

  1 Epoch 270/1000
  2 91s 182ms/step - loss: 0.5576 - acc: 0.9245 - val_loss: 0.6619 - val_acc: 0.8960
  3 Epoch 271/1000
  4 91s 182ms/step - loss: 0.5605 - acc: 0.9250 - val_loss: 0.6675 - val_acc: 0.8908
  5 Epoch 272/1000
  6 91s 182ms/step - loss: 0.5578 - acc: 0.9244 - val_loss: 0.6578 - val_acc: 0.8951
  7 Epoch 273/1000
  8 91s 182ms/step - loss: 0.5625 - acc: 0.9232 - val_loss: 0.6663 - val_acc: 0.8907
  9 Epoch 274/1000
 10 91s 182ms/step - loss: 0.5598 - acc: 0.9246 - val_loss: 0.6435 - val_acc: 0.9059
 11 Epoch 275/1000
 12 91s 182ms/step - loss: 0.5567 - acc: 0.9265 - val_loss: 0.6589 - val_acc: 0.8949
 13 Epoch 276/1000
 14 91s 182ms/step - loss: 0.5616 - acc: 0.9235 - val_loss: 0.6439 - val_acc: 0.9002
 15 Epoch 277/1000
 16 91s 182ms/step - loss: 0.5568 - acc: 0.9258 - val_loss: 0.6731 - val_acc: 0.8913
 17 Epoch 278/1000
 18 91s 182ms/step - loss: 0.5582 - acc: 0.9254 - val_loss: 0.6437 - val_acc: 0.8995
 19 Epoch 279/1000
 20 91s 182ms/step - loss: 0.5530 - acc: 0.9270 - val_loss: 0.6416 - val_acc: 0.9002
 21 Epoch 280/1000
 22 91s 182ms/step - loss: 0.5603 - acc: 0.9245 - val_loss: 0.6566 - val_acc: 0.8960
 23 Epoch 281/1000
 24 91s 182ms/step - loss: 0.5613 - acc: 0.9241 - val_loss: 0.6432 - val_acc: 0.9003
 25 Epoch 282/1000
 26 91s 182ms/step - loss: 0.5568 - acc: 0.9250 - val_loss: 0.6573 - val_acc: 0.8950
 27 Epoch 283/1000
 28 91s 182ms/step - loss: 0.5580 - acc: 0.9253 - val_loss: 0.6518 - val_acc: 0.8961
 29 Epoch 284/1000
 30 91s 182ms/step - loss: 0.5495 - acc: 0.9276 - val_loss: 0.6736 - val_acc: 0.8918
 31 Epoch 285/1000
 32 91s 182ms/step - loss: 0.5611 - acc: 0.9238 - val_loss: 0.6538 - val_acc: 0.8962
 33 Epoch 286/1000
 34 91s 182ms/step - loss: 0.5590 - acc: 0.9250 - val_loss: 0.6563 - val_acc: 0.8965
 35 Epoch 287/1000
 36 91s 182ms/step - loss: 0.5581 - acc: 0.9245 - val_loss: 0.6482 - val_acc: 0.9035
 37 Epoch 288/1000
 38 91s 182ms/step - loss: 0.5607 - acc: 0.9233 - val_loss: 0.6516 - val_acc: 0.8984
 39 Epoch 289/1000
 40 91s 182ms/step - loss: 0.5608 - acc: 0.9252 - val_loss: 0.6562 - val_acc: 0.8984
 41 Epoch 290/1000
 42 91s 182ms/step - loss: 0.5599 - acc: 0.9240 - val_loss: 0.6941 - val_acc: 0.8847
 43 Epoch 291/1000
 44 91s 182ms/step - loss: 0.5600 - acc: 0.9244 - val_loss: 0.6695 - val_acc: 0.8902
 45 Epoch 292/1000
 46 91s 182ms/step - loss: 0.5628 - acc: 0.9232 - val_loss: 0.6580 - val_acc: 0.8979
 47 Epoch 293/1000
 48 91s 182ms/step - loss: 0.5602 - acc: 0.9242 - val_loss: 0.6726 - val_acc: 0.8913
 49 Epoch 294/1000
 50 91s 182ms/step - loss: 0.5582 - acc: 0.9249 - val_loss: 0.6917 - val_acc: 0.8901
 51 Epoch 295/1000
 52 91s 182ms/step - loss: 0.5559 - acc: 0.9265 - val_loss: 0.6805 - val_acc: 0.8896
 53 Epoch 296/1000
 54 91s 182ms/step - loss: 0.5570 - acc: 0.9265 - val_loss: 0.6315 - val_acc: 0.9039
 55 Epoch 297/1000
 56 91s 182ms/step - loss: 0.5572 - acc: 0.9244 - val_loss: 0.6647 - val_acc: 0.8918
 57 Epoch 298/1000
 58 91s 182ms/step - loss: 0.5555 - acc: 0.9259 - val_loss: 0.6540 - val_acc: 0.8960
 59 Epoch 299/1000
 60 91s 182ms/step - loss: 0.5575 - acc: 0.9266 - val_loss: 0.6648 - val_acc: 0.8941
 61 Epoch 300/1000
 62 91s 182ms/step - loss: 0.5517 - acc: 0.9277 - val_loss: 0.6555 - val_acc: 0.8975
 63 Epoch 301/1000
 64 lr changed to 0.010000000149011612
 65 91s 182ms/step - loss: 0.4683 - acc: 0.9572 - val_loss: 0.5677 - val_acc: 0.9248
 66 Epoch 302/1000
 67 91s 182ms/step - loss: 0.4174 - acc: 0.9735 - val_loss: 0.5622 - val_acc: 0.9256
 68 Epoch 303/1000
 69 91s 182ms/step - loss: 0.3968 - acc: 0.9785 - val_loss: 0.5500 - val_acc: 0.9291
 70 Epoch 304/1000
 71 91s 182ms/step - loss: 0.3806 - acc: 0.9814 - val_loss: 0.5520 - val_acc: 0.9283
 72 Epoch 305/1000
 73 91s 181ms/step - loss: 0.3687 - acc: 0.9832 - val_loss: 0.5442 - val_acc: 0.9306
 74 Epoch 306/1000
 75 91s 181ms/step - loss: 0.3555 - acc: 0.9864 - val_loss: 0.5454 - val_acc: 0.9284
 76 Epoch 307/1000
 77 91s 182ms/step - loss: 0.3485 - acc: 0.9863 - val_loss: 0.5409 - val_acc: 0.9286
 78 Epoch 308/1000
 79 91s 181ms/step - loss: 0.3379 - acc: 0.9885 - val_loss: 0.5383 - val_acc: 0.9305
 80 Epoch 309/1000
 81 91s 181ms/step - loss: 0.3272 - acc: 0.9904 - val_loss: 0.5344 - val_acc: 0.9309
 82 Epoch 310/1000
 83 90s 181ms/step - loss: 0.3213 - acc: 0.9900 - val_loss: 0.5333 - val_acc: 0.9298
 84 Epoch 311/1000
 85 90s 181ms/step - loss: 0.3143 - acc: 0.9909 - val_loss: 0.5365 - val_acc: 0.9283
 86 Epoch 312/1000
 87 90s 181ms/step - loss: 0.3092 - acc: 0.9910 - val_loss: 0.5287 - val_acc: 0.9311
 88 Epoch 313/1000
 89 90s 181ms/step - loss: 0.3006 - acc: 0.9919 - val_loss: 0.5324 - val_acc: 0.9283
 90 Epoch 314/1000
 91 90s 181ms/step - loss: 0.2945 - acc: 0.9916 - val_loss: 0.5286 - val_acc: 0.9300
 92 Epoch 315/1000
 93 90s 181ms/step - loss: 0.2886 - acc: 0.9923 - val_loss: 0.5181 - val_acc: 0.9323
 94 Epoch 316/1000
 95 91s 181ms/step - loss: 0.2823 - acc: 0.9932 - val_loss: 0.5212 - val_acc: 0.9286
 96 Epoch 317/1000
 97 90s 181ms/step - loss: 0.2778 - acc: 0.9930 - val_loss: 0.5182 - val_acc: 0.9296
 98 Epoch 318/1000
 99 91s 181ms/step - loss: 0.2720 - acc: 0.9936 - val_loss: 0.5122 - val_acc: 0.9287
100 Epoch 319/1000
101 91s 181ms/step - loss: 0.2662 - acc: 0.9940 - val_loss: 0.5083 - val_acc: 0.9277
102 Epoch 320/1000
103 91s 181ms/step - loss: 0.2597 - acc: 0.9944 - val_loss: 0.5018 - val_acc: 0.9315
104 Epoch 321/1000
105 91s 181ms/step - loss: 0.2560 - acc: 0.9944 - val_loss: 0.5086 - val_acc: 0.9296
106 Epoch 322/1000
107 90s 181ms/step - loss: 0.2526 - acc: 0.9939 - val_loss: 0.5059 - val_acc: 0.9274
108 Epoch 323/1000
109 90s 181ms/step - loss: 0.2466 - acc: 0.9945 - val_loss: 0.4991 - val_acc: 0.9302
110 Epoch 324/1000
111 90s 181ms/step - loss: 0.2431 - acc: 0.9945 - val_loss: 0.5006 - val_acc: 0.9273
112 Epoch 325/1000
113 90s 181ms/step - loss: 0.2384 - acc: 0.9947 - val_loss: 0.4914 - val_acc: 0.9296
114 Epoch 326/1000
115 91s 181ms/step - loss: 0.2334 - acc: 0.9948 - val_loss: 0.4938 - val_acc: 0.9291
116 Epoch 327/1000
117 91s 181ms/step - loss: 0.2301 - acc: 0.9949 - val_loss: 0.4869 - val_acc: 0.9303
118 Epoch 328/1000
119 90s 181ms/step - loss: 0.2253 - acc: 0.9952 - val_loss: 0.4850 - val_acc: 0.9293
120 Epoch 329/1000
121 91s 181ms/step - loss: 0.2219 - acc: 0.9953 - val_loss: 0.4858 - val_acc: 0.9272
122 Epoch 330/1000
123 90s 181ms/step - loss: 0.2170 - acc: 0.9959 - val_loss: 0.4834 - val_acc: 0.9277
124 Epoch 331/1000
125 90s 181ms/step - loss: 0.2140 - acc: 0.9953 - val_loss: 0.4814 - val_acc: 0.9276
126 Epoch 332/1000
127 90s 181ms/step - loss: 0.2118 - acc: 0.9951 - val_loss: 0.4767 - val_acc: 0.9273
128 Epoch 333/1000
129 90s 181ms/step - loss: 0.2077 - acc: 0.9953 - val_loss: 0.4709 - val_acc: 0.9303
130 Epoch 334/1000
131 91s 181ms/step - loss: 0.2042 - acc: 0.9952 - val_loss: 0.4808 - val_acc: 0.9257
132 Epoch 335/1000
133 90s 181ms/step - loss: 0.2015 - acc: 0.9951 - val_loss: 0.4691 - val_acc: 0.9287
134 Epoch 336/1000
135 90s 181ms/step - loss: 0.1988 - acc: 0.9952 - val_loss: 0.4659 - val_acc: 0.9273
136 Epoch 337/1000
137 90s 181ms/step - loss: 0.1930 - acc: 0.9961 - val_loss: 0.4667 - val_acc: 0.9293
138 Epoch 338/1000
139 90s 181ms/step - loss: 0.1901 - acc: 0.9961 - val_loss: 0.4559 - val_acc: 0.9299
140 Epoch 339/1000
141 90s 181ms/step - loss: 0.1872 - acc: 0.9962 - val_loss: 0.4676 - val_acc: 0.9269
142 Epoch 340/1000
143 90s 181ms/step - loss: 0.1890 - acc: 0.9940 - val_loss: 0.4556 - val_acc: 0.9291
144 Epoch 341/1000
145 90s 181ms/step - loss: 0.1832 - acc: 0.9954 - val_loss: 0.4552 - val_acc: 0.9268
146 Epoch 342/1000
147 90s 181ms/step - loss: 0.1798 - acc: 0.9954 - val_loss: 0.4556 - val_acc: 0.9294
148 Epoch 343/1000
149 90s 181ms/step - loss: 0.1782 - acc: 0.9950 - val_loss: 0.4498 - val_acc: 0.9255
150 Epoch 344/1000
151 91s 181ms/step - loss: 0.1775 - acc: 0.9943 - val_loss: 0.4522 - val_acc: 0.9263
152 Epoch 345/1000
153 90s 181ms/step - loss: 0.1747 - acc: 0.9950 - val_loss: 0.4376 - val_acc: 0.9258
154 Epoch 346/1000
155 90s 181ms/step - loss: 0.1702 - acc: 0.9955 - val_loss: 0.4464 - val_acc: 0.9263
156 Epoch 347/1000
157 90s 181ms/step - loss: 0.1693 - acc: 0.9949 - val_loss: 0.4515 - val_acc: 0.9269
158 Epoch 348/1000
159 90s 181ms/step - loss: 0.1654 - acc: 0.9951 - val_loss: 0.4452 - val_acc: 0.9249
160 Epoch 349/1000
161 90s 181ms/step - loss: 0.1649 - acc: 0.9948 - val_loss: 0.4461 - val_acc: 0.9249
162 Epoch 350/1000
163 90s 181ms/step - loss: 0.1632 - acc: 0.9944 - val_loss: 0.4301 - val_acc: 0.9291
164 Epoch 351/1000
165 91s 181ms/step - loss: 0.1616 - acc: 0.9941 - val_loss: 0.4411 - val_acc: 0.9237
166 Epoch 352/1000
167 90s 181ms/step - loss: 0.1594 - acc: 0.9948 - val_loss: 0.4301 - val_acc: 0.9308
168 Epoch 353/1000
169 90s 181ms/step - loss: 0.1593 - acc: 0.9937 - val_loss: 0.4230 - val_acc: 0.9265
170 Epoch 354/1000
171 90s 181ms/step - loss: 0.1565 - acc: 0.9942 - val_loss: 0.4243 - val_acc: 0.9272
172 Epoch 355/1000
173 90s 181ms/step - loss: 0.1532 - acc: 0.9946 - val_loss: 0.4290 - val_acc: 0.9258
174 Epoch 356/1000
175 90s 181ms/step - loss: 0.1525 - acc: 0.9945 - val_loss: 0.4171 - val_acc: 0.9294
176 Epoch 357/1000
177 90s 181ms/step - loss: 0.1505 - acc: 0.9943 - val_loss: 0.4205 - val_acc: 0.9273
178 Epoch 358/1000
179 90s 181ms/step - loss: 0.1481 - acc: 0.9945 - val_loss: 0.4295 - val_acc: 0.9227
180 Epoch 359/1000
181 90s 181ms/step - loss: 0.1487 - acc: 0.9938 - val_loss: 0.4185 - val_acc: 0.9248
182 Epoch 360/1000
183 91s 181ms/step - loss: 0.1452 - acc: 0.9946 - val_loss: 0.4244 - val_acc: 0.9256
184 Epoch 361/1000
185 90s 181ms/step - loss: 0.1481 - acc: 0.9925 - val_loss: 0.4267 - val_acc: 0.9220
186 Epoch 362/1000
187 91s 181ms/step - loss: 0.1468 - acc: 0.9929 - val_loss: 0.4009 - val_acc: 0.9265
188 Epoch 363/1000
189 90s 181ms/step - loss: 0.1433 - acc: 0.9941 - val_loss: 0.4098 - val_acc: 0.9259
190 Epoch 364/1000
191 91s 181ms/step - loss: 0.1441 - acc: 0.9928 - val_loss: 0.4189 - val_acc: 0.9234
192 Epoch 365/1000
193 91s 181ms/step - loss: 0.1426 - acc: 0.9934 - val_loss: 0.4099 - val_acc: 0.9251
194 Epoch 366/1000
195 90s 181ms/step - loss: 0.1383 - acc: 0.9941 - val_loss: 0.4007 - val_acc: 0.9256
196 Epoch 367/1000
197 91s 181ms/step - loss: 0.1395 - acc: 0.9933 - val_loss: 0.3938 - val_acc: 0.9269
198 Epoch 368/1000
199 90s 181ms/step - loss: 0.1379 - acc: 0.9934 - val_loss: 0.4024 - val_acc: 0.9253
200 Epoch 369/1000
201 90s 181ms/step - loss: 0.1359 - acc: 0.9935 - val_loss: 0.4021 - val_acc: 0.9265
202 Epoch 370/1000
203 90s 181ms/step - loss: 0.1370 - acc: 0.9928 - val_loss: 0.3925 - val_acc: 0.9270
204 Epoch 371/1000
205 90s 181ms/step - loss: 0.1373 - acc: 0.9924 - val_loss: 0.3932 - val_acc: 0.9259
206 Epoch 372/1000
207 90s 181ms/step - loss: 0.1349 - acc: 0.9926 - val_loss: 0.4055 - val_acc: 0.9254
208 Epoch 373/1000
209 90s 181ms/step - loss: 0.1342 - acc: 0.9927 - val_loss: 0.3934 - val_acc: 0.9289
210 Epoch 374/1000
211 90s 181ms/step - loss: 0.1352 - acc: 0.9919 - val_loss: 0.4131 - val_acc: 0.9225
212 Epoch 375/1000
213 91s 181ms/step - loss: 0.1351 - acc: 0.9917 - val_loss: 0.3916 - val_acc: 0.9249
214 Epoch 376/1000
215 90s 181ms/step - loss: 0.1317 - acc: 0.9929 - val_loss: 0.4016 - val_acc: 0.9237
216 Epoch 377/1000
217 90s 181ms/step - loss: 0.1316 - acc: 0.9930 - val_loss: 0.3906 - val_acc: 0.9259
218 Epoch 378/1000
219 90s 181ms/step - loss: 0.1307 - acc: 0.9925 - val_loss: 0.3954 - val_acc: 0.9248
220 Epoch 379/1000
221 90s 181ms/step - loss: 0.1328 - acc: 0.9914 - val_loss: 0.3997 - val_acc: 0.9221
222 Epoch 380/1000
223 90s 181ms/step - loss: 0.1345 - acc: 0.9902 - val_loss: 0.3934 - val_acc: 0.9260
224 Epoch 381/1000
225 90s 181ms/step - loss: 0.1319 - acc: 0.9915 - val_loss: 0.3973 - val_acc: 0.9232
226 Epoch 382/1000
227 90s 181ms/step - loss: 0.1307 - acc: 0.9920 - val_loss: 0.4105 - val_acc: 0.9220
228 Epoch 383/1000
229 90s 181ms/step - loss: 0.1281 - acc: 0.9924 - val_loss: 0.3980 - val_acc: 0.9242
230 Epoch 384/1000
231 90s 181ms/step - loss: 0.1305 - acc: 0.9911 - val_loss: 0.4200 - val_acc: 0.9194
232 Epoch 385/1000
233 90s 181ms/step - loss: 0.1311 - acc: 0.9910 - val_loss: 0.4101 - val_acc: 0.9184
234 Epoch 386/1000
235 91s 181ms/step - loss: 0.1291 - acc: 0.9913 - val_loss: 0.4074 - val_acc: 0.9225
236 Epoch 387/1000
237 90s 181ms/step - loss: 0.1316 - acc: 0.9902 - val_loss: 0.4087 - val_acc: 0.9180
238 Epoch 388/1000
239 90s 181ms/step - loss: 0.1306 - acc: 0.9906 - val_loss: 0.4021 - val_acc: 0.9192
240 Epoch 389/1000
241 90s 181ms/step - loss: 0.1295 - acc: 0.9910 - val_loss: 0.3877 - val_acc: 0.9250
242 Epoch 390/1000
243 90s 181ms/step - loss: 0.1285 - acc: 0.9913 - val_loss: 0.3914 - val_acc: 0.9208
244 Epoch 391/1000
245 90s 181ms/step - loss: 0.1284 - acc: 0.9911 - val_loss: 0.3887 - val_acc: 0.9221
246 Epoch 392/1000
247 90s 181ms/step - loss: 0.1289 - acc: 0.9911 - val_loss: 0.3992 - val_acc: 0.9262
248 Epoch 393/1000
249 90s 181ms/step - loss: 0.1265 - acc: 0.9919 - val_loss: 0.4006 - val_acc: 0.9213
250 Epoch 394/1000
251 90s 181ms/step - loss: 0.1261 - acc: 0.9911 - val_loss: 0.3943 - val_acc: 0.9238
252 Epoch 395/1000
253 90s 181ms/step - loss: 0.1277 - acc: 0.9908 - val_loss: 0.3963 - val_acc: 0.9236
254 Epoch 396/1000
255 90s 181ms/step - loss: 0.1286 - acc: 0.9902 - val_loss: 0.4147 - val_acc: 0.9194
256 Epoch 397/1000
257 90s 181ms/step - loss: 0.1309 - acc: 0.9894 - val_loss: 0.3996 - val_acc: 0.9192
258 Epoch 398/1000
259 91s 181ms/step - loss: 0.1268 - acc: 0.9912 - val_loss: 0.3952 - val_acc: 0.9225
260 Epoch 399/1000
261 90s 181ms/step - loss: 0.1255 - acc: 0.9911 - val_loss: 0.4084 - val_acc: 0.9204
262 Epoch 400/1000
263 91s 181ms/step - loss: 0.1268 - acc: 0.9902 - val_loss: 0.3954 - val_acc: 0.9209
264 Epoch 401/1000
265 90s 181ms/step - loss: 0.1263 - acc: 0.9902 - val_loss: 0.4022 - val_acc: 0.9224
266 Epoch 402/1000
267 90s 181ms/step - loss: 0.1270 - acc: 0.9904 - val_loss: 0.3891 - val_acc: 0.9246
268 Epoch 403/1000
269 90s 181ms/step - loss: 0.1272 - acc: 0.9899 - val_loss: 0.4038 - val_acc: 0.9202
270 Epoch 404/1000
271 91s 181ms/step - loss: 0.1307 - acc: 0.9885 - val_loss: 0.4022 - val_acc: 0.9205
272 Epoch 405/1000
273 91s 181ms/step - loss: 0.1298 - acc: 0.9891 - val_loss: 0.3900 - val_acc: 0.9213
274 Epoch 406/1000
275 90s 181ms/step - loss: 0.1277 - acc: 0.9897 - val_loss: 0.3946 - val_acc: 0.9209
276 Epoch 407/1000
277 90s 181ms/step - loss: 0.1257 - acc: 0.9905 - val_loss: 0.3962 - val_acc: 0.9216
278 Epoch 408/1000
279 91s 181ms/step - loss: 0.1262 - acc: 0.9906 - val_loss: 0.4070 - val_acc: 0.9205
280 Epoch 409/1000
281 90s 181ms/step - loss: 0.1273 - acc: 0.9899 - val_loss: 0.3869 - val_acc: 0.9249
282 Epoch 410/1000
283 90s 181ms/step - loss: 0.1268 - acc: 0.9902 - val_loss: 0.4044 - val_acc: 0.9201
284 Epoch 411/1000
285 90s 181ms/step - loss: 0.1264 - acc: 0.9900 - val_loss: 0.4039 - val_acc: 0.9214
286 Epoch 412/1000
287 91s 181ms/step - loss: 0.1278 - acc: 0.9896 - val_loss: 0.4072 - val_acc: 0.9187
288 Epoch 413/1000
289 90s 181ms/step - loss: 0.1267 - acc: 0.9900 - val_loss: 0.4132 - val_acc: 0.9174
290 Epoch 414/1000
291 90s 181ms/step - loss: 0.1294 - acc: 0.9890 - val_loss: 0.3933 - val_acc: 0.9214
292 Epoch 415/1000
293 90s 181ms/step - loss: 0.1236 - acc: 0.9911 - val_loss: 0.4097 - val_acc: 0.9205
294 Epoch 416/1000
295 90s 181ms/step - loss: 0.1279 - acc: 0.9896 - val_loss: 0.3939 - val_acc: 0.9206
296 Epoch 417/1000
297 90s 181ms/step - loss: 0.1243 - acc: 0.9907 - val_loss: 0.4011 - val_acc: 0.9213
298 Epoch 418/1000
299 90s 181ms/step - loss: 0.1255 - acc: 0.9904 - val_loss: 0.4279 - val_acc: 0.9141
300 Epoch 419/1000
301 91s 181ms/step - loss: 0.1267 - acc: 0.9905 - val_loss: 0.4297 - val_acc: 0.9130
302 Epoch 420/1000
303 90s 181ms/step - loss: 0.1245 - acc: 0.9907 - val_loss: 0.4141 - val_acc: 0.9166
304 Epoch 421/1000
305 90s 181ms/step - loss: 0.1270 - acc: 0.9897 - val_loss: 0.3903 - val_acc: 0.9203
306 Epoch 422/1000
307 90s 181ms/step - loss: 0.1213 - acc: 0.9916 - val_loss: 0.4057 - val_acc: 0.9199
308 Epoch 423/1000
309 91s 181ms/step - loss: 0.1213 - acc: 0.9915 - val_loss: 0.3929 - val_acc: 0.9192
310 Epoch 424/1000
311 90s 181ms/step - loss: 0.1215 - acc: 0.9916 - val_loss: 0.3834 - val_acc: 0.9251
312 Epoch 425/1000
313 90s 181ms/step - loss: 0.1224 - acc: 0.9905 - val_loss: 0.4071 - val_acc: 0.9215
314 Epoch 426/1000
315 91s 181ms/step - loss: 0.1280 - acc: 0.9891 - val_loss: 0.4023 - val_acc: 0.9208
316 Epoch 427/1000
317 91s 181ms/step - loss: 0.1274 - acc: 0.9893 - val_loss: 0.3839 - val_acc: 0.9223
318 Epoch 428/1000
319 90s 181ms/step - loss: 0.1244 - acc: 0.9904 - val_loss: 0.3948 - val_acc: 0.9215
320 Epoch 429/1000
321 90s 181ms/step - loss: 0.1247 - acc: 0.9899 - val_loss: 0.4135 - val_acc: 0.9181
322 Epoch 430/1000
323 91s 181ms/step - loss: 0.1218 - acc: 0.9915 - val_loss: 0.3810 - val_acc: 0.9256
324 Epoch 431/1000
325 90s 181ms/step - loss: 0.1230 - acc: 0.9905 - val_loss: 0.3961 - val_acc: 0.9203
326 Epoch 432/1000
327 91s 182ms/step - loss: 0.1262 - acc: 0.9894 - val_loss: 0.3939 - val_acc: 0.9213
328 Epoch 433/1000
329 91s 182ms/step - loss: 0.1273 - acc: 0.9889 - val_loss: 0.4070 - val_acc: 0.9139
330 Epoch 434/1000
331 91s 182ms/step - loss: 0.1228 - acc: 0.9911 - val_loss: 0.3896 - val_acc: 0.9214
332 Epoch 435/1000
333 91s 182ms/step - loss: 0.1252 - acc: 0.9900 - val_loss: 0.3858 - val_acc: 0.9217
334 Epoch 436/1000
335 91s 182ms/step - loss: 0.1246 - acc: 0.9905 - val_loss: 0.3926 - val_acc: 0.9214
336 Epoch 437/1000
337 91s 182ms/step - loss: 0.1254 - acc: 0.9897 - val_loss: 0.3927 - val_acc: 0.9247
338 Epoch 438/1000
339 91s 182ms/step - loss: 0.1238 - acc: 0.9903 - val_loss: 0.4091 - val_acc: 0.9155
340 Epoch 439/1000
341 91s 182ms/step - loss: 0.1259 - acc: 0.9895 - val_loss: 0.4237 - val_acc: 0.9116
342 Epoch 440/1000
343 91s 182ms/step - loss: 0.1263 - acc: 0.9896 - val_loss: 0.4008 - val_acc: 0.9178
344 Epoch 441/1000
345 91s 182ms/step - loss: 0.1268 - acc: 0.9892 - val_loss: 0.4129 - val_acc: 0.9141
346 Epoch 442/1000
347 91s 182ms/step - loss: 0.1261 - acc: 0.9902 - val_loss: 0.3831 - val_acc: 0.9238
348 Epoch 443/1000
349 91s 182ms/step - loss: 0.1234 - acc: 0.9905 - val_loss: 0.4066 - val_acc: 0.9175
350 Epoch 444/1000
351 91s 182ms/step - loss: 0.1258 - acc: 0.9903 - val_loss: 0.4081 - val_acc: 0.9177
352 Epoch 445/1000
353 91s 182ms/step - loss: 0.1279 - acc: 0.9889 - val_loss: 0.3980 - val_acc: 0.9208
354 Epoch 446/1000
355 91s 181ms/step - loss: 0.1257 - acc: 0.9896 - val_loss: 0.3887 - val_acc: 0.9220
356 Epoch 447/1000
357 91s 182ms/step - loss: 0.1240 - acc: 0.9905 - val_loss: 0.4044 - val_acc: 0.9180
358 Epoch 448/1000
359 91s 182ms/step - loss: 0.1270 - acc: 0.9895 - val_loss: 0.4061 - val_acc: 0.9189
360 Epoch 449/1000
361 91s 182ms/step - loss: 0.1229 - acc: 0.9911 - val_loss: 0.3971 - val_acc: 0.9220
362 Epoch 450/1000
363 91s 182ms/step - loss: 0.1217 - acc: 0.9918 - val_loss: 0.4036 - val_acc: 0.9227
364 Epoch 451/1000
365 91s 182ms/step - loss: 0.1240 - acc: 0.9906 - val_loss: 0.4011 - val_acc: 0.9216
366 Epoch 452/1000
367 91s 182ms/step - loss: 0.1239 - acc: 0.9901 - val_loss: 0.4079 - val_acc: 0.9173
368 Epoch 453/1000
369 91s 182ms/step - loss: 0.1224 - acc: 0.9906 - val_loss: 0.3917 - val_acc: 0.9240
370 Epoch 454/1000
371 91s 182ms/step - loss: 0.1265 - acc: 0.9891 - val_loss: 0.3877 - val_acc: 0.9235
372 Epoch 455/1000
373 91s 182ms/step - loss: 0.1233 - acc: 0.9910 - val_loss: 0.4031 - val_acc: 0.9177
374 Epoch 456/1000
375 91s 182ms/step - loss: 0.1239 - acc: 0.9904 - val_loss: 0.4203 - val_acc: 0.9185
376 Epoch 457/1000
377 91s 182ms/step - loss: 0.1240 - acc: 0.9905 - val_loss: 0.3918 - val_acc: 0.9247
378 Epoch 458/1000
379 91s 182ms/step - loss: 0.1247 - acc: 0.9898 - val_loss: 0.4155 - val_acc: 0.9176
380 Epoch 459/1000
381 91s 182ms/step - loss: 0.1239 - acc: 0.9900 - val_loss: 0.3980 - val_acc: 0.9207
382 Epoch 460/1000
383 91s 182ms/step - loss: 0.1296 - acc: 0.9881 - val_loss: 0.3954 - val_acc: 0.9190
384 Epoch 461/1000
385 91s 182ms/step - loss: 0.1232 - acc: 0.9908 - val_loss: 0.4039 - val_acc: 0.9223
386 Epoch 462/1000
387 91s 182ms/step - loss: 0.1283 - acc: 0.9888 - val_loss: 0.4285 - val_acc: 0.9136
388 Epoch 463/1000
389 91s 182ms/step - loss: 0.1264 - acc: 0.9899 - val_loss: 0.4025 - val_acc: 0.9191
390 Epoch 464/1000
391 91s 181ms/step - loss: 0.1236 - acc: 0.9909 - val_loss: 0.3952 - val_acc: 0.9205
392 Epoch 465/1000
393 91s 182ms/step - loss: 0.1204 - acc: 0.9921 - val_loss: 0.4008 - val_acc: 0.9207
394 Epoch 466/1000
395 90s 181ms/step - loss: 0.1233 - acc: 0.9905 - val_loss: 0.4098 - val_acc: 0.9158
396 Epoch 467/1000
397 91s 182ms/step - loss: 0.1207 - acc: 0.9916 - val_loss: 0.4012 - val_acc: 0.9160
398 Epoch 468/1000
399 91s 182ms/step - loss: 0.1231 - acc: 0.9910 - val_loss: 0.3880 - val_acc: 0.9248
400 Epoch 469/1000
401 91s 182ms/step - loss: 0.1241 - acc: 0.9900 - val_loss: 0.4136 - val_acc: 0.9175
402 Epoch 470/1000
403 91s 182ms/step - loss: 0.1255 - acc: 0.9894 - val_loss: 0.4084 - val_acc: 0.9202
404 Epoch 471/1000
405 91s 182ms/step - loss: 0.1253 - acc: 0.9902 - val_loss: 0.3892 - val_acc: 0.9225
406 Epoch 472/1000
407 91s 182ms/step - loss: 0.1269 - acc: 0.9891 - val_loss: 0.4101 - val_acc: 0.9201
408 Epoch 473/1000
409 91s 182ms/step - loss: 0.1226 - acc: 0.9913 - val_loss: 0.4143 - val_acc: 0.9167
410 Epoch 474/1000
411 91s 182ms/step - loss: 0.1230 - acc: 0.9911 - val_loss: 0.4019 - val_acc: 0.9184
412 Epoch 475/1000
413 91s 182ms/step - loss: 0.1242 - acc: 0.9902 - val_loss: 0.4229 - val_acc: 0.9181
414 Epoch 476/1000
415 91s 182ms/step - loss: 0.1251 - acc: 0.9905 - val_loss: 0.3879 - val_acc: 0.9241
416 Epoch 477/1000
417 91s 182ms/step - loss: 0.1243 - acc: 0.9899 - val_loss: 0.4191 - val_acc: 0.9172
418 Epoch 478/1000
419 91s 182ms/step - loss: 0.1240 - acc: 0.9907 - val_loss: 0.3942 - val_acc: 0.9230
420 Epoch 479/1000
421 91s 182ms/step - loss: 0.1230 - acc: 0.9909 - val_loss: 0.3843 - val_acc: 0.9274
422 Epoch 480/1000
423 91s 182ms/step - loss: 0.1207 - acc: 0.9918 - val_loss: 0.4098 - val_acc: 0.9196
424 Epoch 481/1000
425 91s 182ms/step - loss: 0.1244 - acc: 0.9905 - val_loss: 0.4048 - val_acc: 0.9172
426 Epoch 482/1000
427 91s 182ms/step - loss: 0.1250 - acc: 0.9902 - val_loss: 0.4160 - val_acc: 0.9203
428 Epoch 483/1000
429 91s 182ms/step - loss: 0.1226 - acc: 0.9908 - val_loss: 0.4054 - val_acc: 0.9196
430 Epoch 484/1000
431 91s 182ms/step - loss: 0.1206 - acc: 0.9917 - val_loss: 0.4020 - val_acc: 0.9218
432 Epoch 485/1000
433 91s 182ms/step - loss: 0.1277 - acc: 0.9889 - val_loss: 0.3926 - val_acc: 0.9208
434 Epoch 486/1000
435 91s 182ms/step - loss: 0.1244 - acc: 0.9901 - val_loss: 0.3976 - val_acc: 0.9179
436 Epoch 487/1000
437 91s 182ms/step - loss: 0.1216 - acc: 0.9915 - val_loss: 0.4025 - val_acc: 0.9215
438 Epoch 488/1000
439 91s 182ms/step - loss: 0.1231 - acc: 0.9908 - val_loss: 0.4037 - val_acc: 0.9223
440 Epoch 489/1000
441 91s 182ms/step - loss: 0.1259 - acc: 0.9901 - val_loss: 0.4109 - val_acc: 0.9187
442 Epoch 490/1000
443 91s 182ms/step - loss: 0.1247 - acc: 0.9908 - val_loss: 0.4085 - val_acc: 0.9206
444 Epoch 491/1000
445 91s 182ms/step - loss: 0.1214 - acc: 0.9916 - val_loss: 0.4054 - val_acc: 0.9242
446 Epoch 492/1000
447 91s 181ms/step - loss: 0.1253 - acc: 0.9898 - val_loss: 0.4153 - val_acc: 0.9198
448 Epoch 493/1000
449 91s 181ms/step - loss: 0.1203 - acc: 0.9919 - val_loss: 0.3971 - val_acc: 0.9212
450 Epoch 494/1000
451 91s 181ms/step - loss: 0.1243 - acc: 0.9903 - val_loss: 0.4066 - val_acc: 0.9195
452 Epoch 495/1000
453 91s 181ms/step - loss: 0.1256 - acc: 0.9904 - val_loss: 0.4061 - val_acc: 0.9199
454 Epoch 496/1000
455 91s 181ms/step - loss: 0.1266 - acc: 0.9890 - val_loss: 0.3927 - val_acc: 0.9221
456 Epoch 497/1000
457 90s 181ms/step - loss: 0.1256 - acc: 0.9897 - val_loss: 0.4104 - val_acc: 0.9210
458 Epoch 498/1000
459 90s 181ms/step - loss: 0.1218 - acc: 0.9909 - val_loss: 0.4074 - val_acc: 0.9176
460 Epoch 499/1000
461 90s 181ms/step - loss: 0.1225 - acc: 0.9907 - val_loss: 0.3931 - val_acc: 0.9219
462 Epoch 500/1000
463 90s 181ms/step - loss: 0.1238 - acc: 0.9908 - val_loss: 0.3940 - val_acc: 0.9262
464 Epoch 501/1000
465 90s 181ms/step - loss: 0.1217 - acc: 0.9912 - val_loss: 0.4017 - val_acc: 0.9238
466 Epoch 502/1000
467 91s 181ms/step - loss: 0.1239 - acc: 0.9906 - val_loss: 0.4000 - val_acc: 0.9217
468 Epoch 503/1000
469 91s 181ms/step - loss: 0.1219 - acc: 0.9915 - val_loss: 0.4070 - val_acc: 0.9199
470 Epoch 504/1000
471 91s 182ms/step - loss: 0.1237 - acc: 0.9907 - val_loss: 0.4045 - val_acc: 0.9205
472 Epoch 505/1000
473 91s 182ms/step - loss: 0.1291 - acc: 0.9884 - val_loss: 0.3828 - val_acc: 0.9203
474 Epoch 506/1000
475 91s 182ms/step - loss: 0.1250 - acc: 0.9899 - val_loss: 0.4053 - val_acc: 0.9232
476 Epoch 507/1000
477 91s 182ms/step - loss: 0.1248 - acc: 0.9907 - val_loss: 0.4098 - val_acc: 0.9204
478 Epoch 508/1000
479 91s 182ms/step - loss: 0.1212 - acc: 0.9920 - val_loss: 0.3999 - val_acc: 0.9222
480 Epoch 509/1000
481 91s 182ms/step - loss: 0.1223 - acc: 0.9918 - val_loss: 0.4083 - val_acc: 0.9183
482 Epoch 510/1000
483 91s 182ms/step - loss: 0.1250 - acc: 0.9900 - val_loss: 0.3959 - val_acc: 0.9209
484 Epoch 511/1000
485 91s 182ms/step - loss: 0.1190 - acc: 0.9919 - val_loss: 0.4029 - val_acc: 0.9237
486 Epoch 512/1000
487 91s 182ms/step - loss: 0.1191 - acc: 0.9924 - val_loss: 0.4040 - val_acc: 0.9221
488 Epoch 513/1000
489 91s 182ms/step - loss: 0.1229 - acc: 0.9906 - val_loss: 0.3949 - val_acc: 0.9251
490 Epoch 514/1000
491 91s 182ms/step - loss: 0.1263 - acc: 0.9895 - val_loss: 0.4191 - val_acc: 0.9186
492 Epoch 515/1000
493 91s 182ms/step - loss: 0.1240 - acc: 0.9904 - val_loss: 0.3939 - val_acc: 0.9208
494 Epoch 516/1000
495 91s 181ms/step - loss: 0.1240 - acc: 0.9906 - val_loss: 0.3991 - val_acc: 0.9181
496 Epoch 517/1000
497 91s 181ms/step - loss: 0.1209 - acc: 0.9915 - val_loss: 0.3953 - val_acc: 0.9216
498 Epoch 518/1000
499 91s 182ms/step - loss: 0.1215 - acc: 0.9910 - val_loss: 0.4056 - val_acc: 0.9219
500 Epoch 519/1000
501 91s 182ms/step - loss: 0.1232 - acc: 0.9905 - val_loss: 0.4092 - val_acc: 0.9187
502 Epoch 520/1000
503 91s 182ms/step - loss: 0.1252 - acc: 0.9899 - val_loss: 0.4108 - val_acc: 0.9190
504 Epoch 521/1000
505 91s 182ms/step - loss: 0.1215 - acc: 0.9912 - val_loss: 0.4031 - val_acc: 0.9191
506 Epoch 522/1000
507 91s 182ms/step - loss: 0.1236 - acc: 0.9903 - val_loss: 0.3995 - val_acc: 0.9201
508 Epoch 523/1000
509 91s 182ms/step - loss: 0.1226 - acc: 0.9916 - val_loss: 0.3823 - val_acc: 0.9264
510 Epoch 524/1000
511 91s 182ms/step - loss: 0.1229 - acc: 0.9913 - val_loss: 0.3882 - val_acc: 0.9237
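The jump in accuracy around epoch 301 in the log corresponds to the scheduler's step decay. As a minimal sketch (using the same base rate, drop factor, and period as the script), the learning rate at a given epoch index can be written in closed form:

```python
# Step decay matching the scheduler in the script: the learning
# rate is multiplied by 0.1 after every 300 completed epochs.
def stepped_lr(epoch, base_lr=0.1, drop=0.1, period=300):
    return base_lr * (drop ** (epoch // period))

# Keras passes 0-based epoch indices to the scheduler, so the
# "lr changed to 0.01..." message printed at Epoch 301/1000 in the
# log corresponds to index 300, the first drop.
lrs = [stepped_lr(e) for e in (0, 299, 300, 599, 600)]
```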

I had intended to come back in the morning and check how the run was going, only to find that Spyder had, for some unknown reason, quit on its own.

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458

https://ieeexplore.ieee.org/document/8998530

Posted on 2020-05-17 23:04 by 世俗杂念