
[HIT Team] Dynamic ReLU: Adaptively Parametric ReLU with Keras Code (Tuning Record 12)

Author: 用户7368967 · Last modified 2020-05-27 · Column: 深度学习知识 (Deep Learning Knowledge)

This article introduces a dynamic ReLU activation function proposed by a team from Harbin Institute of Technology: the adaptively parametric ReLU (APReLU). Originally developed for fault diagnosis based on one-dimensional vibration signals, it lets each sample learn its own ReLU parameters. The paper was submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020.

Building on Tuning Record 10, this experiment adds zoom_range=0.2 to the data augmentation, increases training to 5000 epochs, and changes the batch size to 625, in order to test the adaptively parametric ReLU on the CIFAR-10 image dataset.
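For reference (my own arithmetic, not from the original post): with the standard 50,000-image CIFAR-10 training split, a batch size of 625 divides the epoch into 80 steps, which matches the "80/80" step counter seen throughout the training log.

```python
# Sketch: derive the steps-per-epoch implied by the batch size chosen above.
# CIFAR-10 ships with 50,000 training and 10,000 test images (standard split).
train_samples = 50_000
batch_size = 625

steps_per_epoch = train_samples // batch_size
print(steps_per_epoch)  # 80, matching the "80/80" counter in the training log
```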

The principle of the adaptively parametric ReLU activation function is as follows:

[Figure: Adaptively parametric ReLU — a dynamic ReLU activation function]
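For readers without the figure, the activation can be summarized as follows (my own notation, reconstructed from the `aprelu` function in the code below): the positive part of the input passes through unchanged, while the negative part is scaled by a coefficient learned from the input itself via global average pooling and a small fully connected network.

```latex
% APReLU applied element-wise to a feature map x, with a learned
% per-channel scaling coefficient alpha in (0, 1):
y = \max(x, 0) + \alpha \cdot \min(x, 0),
\qquad
\alpha = \sigma\Bigl(\mathrm{BN}\bigl(\mathrm{FC}_2\bigl(\mathrm{ReLU}\bigl(\mathrm{BN}\bigl(\mathrm{FC}_1\bigl[\,\mathrm{GAP}(\min(x,0)) \,\Vert\, \mathrm{GAP}(\max(x,0))\,\bigr]\bigr)\bigr)\bigr)\bigr)\Bigr)
```

Here $\mathrm{GAP}$ is global average pooling, $\Vert$ denotes concatenation, $\mathrm{BN}$ is batch normalization, and $\sigma$ is the sigmoid, mirroring the layer order in the Keras implementation.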

Keras code:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply by 0.1 every 1500 epochs
def scheduler(epoch):
  if epoch % 1500 == 0 and epoch != 0:
    lr = K.get_value(model.optimizer.lr)
    K.set_value(model.optimizer.lr, lr * 0.1)
    print("lr changed to {}".format(lr * 0.1))
  return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
  # get the number of channels
  channels = inputs.get_shape().as_list()[-1]
  # get a zero feature map
  zeros_input = keras.layers.subtract([inputs, inputs])
  # get a feature map with only positive features
  pos_input = Activation('relu')(inputs)
  # get a feature map with only negative features
  neg_input = Minimum()([inputs,zeros_input])
  # define a network to obtain the scaling coefficients
  scales_p = GlobalAveragePooling2D()(pos_input)
  scales_n = GlobalAveragePooling2D()(neg_input)
  scales = Concatenate()([scales_n, scales_p])
  scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
  scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
  scales = Activation('relu')(scales)
  scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
  scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
  scales = Activation('sigmoid')(scales)
  scales = Reshape((1,1,channels))(scales)
  # apply the parametric ReLU: positive part plus scaled negative part
  neg_part = keras.layers.multiply([scales, neg_input])
  return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
          downsample_strides=2):
  
  residual = incoming
  in_channels = incoming.get_shape().as_list()[-1]
  
  for i in range(nb_blocks):
    
    identity = residual
    
    if not downsample:
      downsample_strides = 1
    
    residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
    residual = aprelu(residual)
    residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
             padding='same', kernel_initializer='he_normal', 
             kernel_regularizer=l2(1e-4))(residual)
    
    residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
    residual = aprelu(residual)
    residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
             kernel_regularizer=l2(1e-4))(residual)
    
    # Downsampling
    if downsample_strides > 1:
      identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
      
    # Zero-padding to match channels
    if in_channels != out_channels:
      zeros_identity = keras.layers.subtract([identity, identity])
      identity = keras.layers.concatenate([identity, zeros_identity])
      in_channels = out_channels
    
    residual = keras.layers.add([residual, identity])
  
  return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 16, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 8, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
  # randomly rotate images in the range (degrees, 0 to 30)
  rotation_range=30,
  # Range for random zoom
  zoom_range = 0.2,
  # shear angle in counter-clockwise direction in degrees
  shear_range = 30,
  # randomly flip images
  horizontal_flip=True,
  # randomly shift images horizontally
  width_shift_range=0.125,
  # randomly shift images vertically
  height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=625),
          validation_data=(x_test, y_test), epochs=5000, 
          verbose=1, callbacks=[reduce_lr], workers=10)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=625, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=625, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])
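As a minimal numerical sketch (mine, not part of the article's code), the element-wise form of APReLU, y = max(x, 0) + α·min(x, 0), interpolates between plain ReLU (α = 0) and the identity map (α = 1); in the network above, α is produced per channel by the sigmoid at the end of `aprelu`, so it always lies in (0, 1).

```python
import numpy as np

# Element-wise APReLU with a fixed alpha, for illustration only.
# In the real layer, alpha is computed from the input by a small subnetwork.
def aprelu_elementwise(x, alpha):
    return np.maximum(x, 0.0) + alpha * np.minimum(x, 0.0)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(aprelu_elementwise(x, 0.0))   # [0. 0. 0. 1. 3.]     -> plain ReLU
print(aprelu_elementwise(x, 1.0))   # [-2. -0.5 0. 1. 3.]  -> identity
print(aprelu_elementwise(x, 0.25))  # leaky-ReLU-like, negative slope 0.25
```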

Experimental results:

Epoch 2500/5000
80/80 - 32s 399ms/step - loss: 0.0996 - acc: 0.9930 - val_loss: 0.4074 - val_acc: 0.9203
Epoch 2501/5000
80/80 - 32s 399ms/step - loss: 0.0986 - acc: 0.9937 - val_loss: 0.3996 - val_acc: 0.9202
Epoch 2502/5000
80/80 - 32s 399ms/step - loss: 0.0980 - acc: 0.9933 - val_loss: 0.3950 - val_acc: 0.9236
Epoch 2503/5000
80/80 - 32s 400ms/step - loss: 0.0997 - acc: 0.9928 - val_loss: 0.4173 - val_acc: 0.9190
Epoch 2504/5000
80/80 - 32s 400ms/step - loss: 0.1003 - acc: 0.9927 - val_loss: 0.4020 - val_acc: 0.9210
Epoch 2505/5000
80/80 - 32s 400ms/step - loss: 0.0983 - acc: 0.9935 - val_loss: 0.4114 - val_acc: 0.9209
Epoch 2506/5000
80/80 - 32s 400ms/step - loss: 0.0992 - acc: 0.9929 - val_loss: 0.4048 - val_acc: 0.9212
Epoch 2507/5000
80/80 - 32s 400ms/step - loss: 0.0991 - acc: 0.9935 - val_loss: 0.4153 - val_acc: 0.9205
Epoch 2508/5000
80/80 - 32s 400ms/step - loss: 0.0982 - acc: 0.9939 - val_loss: 0.4106 - val_acc: 0.9202
Epoch 2509/5000
80/80 - 32s 400ms/step - loss: 0.0978 - acc: 0.9937 - val_loss: 0.4142 - val_acc: 0.9214
Epoch 2510/5000
80/80 - 32s 400ms/step - loss: 0.0986 - acc: 0.9932 - val_loss: 0.4161 - val_acc: 0.9208
Epoch 2511/5000
80/80 - 32s 400ms/step - loss: 0.0996 - acc: 0.9931 - val_loss: 0.4168 - val_acc: 0.9213
Epoch 2512/5000
80/80 - 32s 401ms/step - loss: 0.0993 - acc: 0.9933 - val_loss: 0.4125 - val_acc: 0.9202
Epoch 2513/5000
80/80 - 32s 400ms/step - loss: 0.0991 - acc: 0.9932 - val_loss: 0.4072 - val_acc: 0.9203
Epoch 2514/5000
80/80 - 32s 399ms/step - loss: 0.0989 - acc: 0.9930 - val_loss: 0.4077 - val_acc: 0.9190
Epoch 2515/5000
80/80 - 32s 400ms/step - loss: 0.0983 - acc: 0.9933 - val_loss: 0.4031 - val_acc: 0.9212
Epoch 2516/5000
80/80 - 32s 400ms/step - loss: 0.0995 - acc: 0.9927 - val_loss: 0.4060 - val_acc: 0.9219
Epoch 2517/5000
80/80 - 32s 400ms/step - loss: 0.0975 - acc: 0.9933 - val_loss: 0.3955 - val_acc: 0.9247
Epoch 2518/5000
80/80 - 32s 400ms/step - loss: 0.0998 - acc: 0.9931 - val_loss: 0.4099 - val_acc: 0.9195
Epoch 2519/5000
80/80 - 32s 400ms/step - loss: 0.1007 - acc: 0.9924 - val_loss: 0.4077 - val_acc: 0.9195
Epoch 2520/5000
80/80 - 32s 399ms/step - loss: 0.0981 - acc: 0.9930 - val_loss: 0.4108 - val_acc: 0.9205
Epoch 2521/5000
80/80 - 32s 400ms/step - loss: 0.0997 - acc: 0.9931 - val_loss: 0.4017 - val_acc: 0.9233
Epoch 2522/5000
80/80 - 32s 399ms/step - loss: 0.0988 - acc: 0.9933 - val_loss: 0.4079 - val_acc: 0.9202
Epoch 2523/5000
80/80 - 32s 400ms/step - loss: 0.0989 - acc: 0.9932 - val_loss: 0.4085 - val_acc: 0.9224
Epoch 2524/5000
80/80 - 32s 399ms/step - loss: 0.0999 - acc: 0.9931 - val_loss: 0.4265 - val_acc: 0.9156
Epoch 2525/5000
80/80 - 32s 399ms/step - loss: 0.0991 - acc: 0.9928 - val_loss: 0.4078 - val_acc: 0.9198
Epoch 2526/5000
80/80 - 32s 400ms/step - loss: 0.0993 - acc: 0.9929 - val_loss: 0.4057 - val_acc: 0.9213
Epoch 2527/5000
80/80 - 32s 400ms/step - loss: 0.1016 - acc: 0.9925 - val_loss: 0.3981 - val_acc: 0.9214
Epoch 2528/5000
80/80 - 32s 399ms/step - loss: 0.1004 - acc: 0.9925 - val_loss: 0.4142 - val_acc: 0.9188
Epoch 2529/5000
80/80 - 32s 400ms/step - loss: 0.0980 - acc: 0.9938 - val_loss: 0.3881 - val_acc: 0.9244
Epoch 2530/5000
80/80 - 32s 399ms/step - loss: 0.0970 - acc: 0.9938 - val_loss: 0.4089 - val_acc: 0.9218
Epoch 2531/5000
80/80 - 32s 399ms/step - loss: 0.0982 - acc: 0.9935 - val_loss: 0.4139 - val_acc: 0.9185
Epoch 2532/5000
80/80 - 32s 400ms/step - loss: 0.1003 - acc: 0.9927 - val_loss: 0.3969 - val_acc: 0.9206
Epoch 2533/5000
80/80 - 32s 400ms/step - loss: 0.0994 - acc: 0.9927 - val_loss: 0.4123 - val_acc: 0.9180
Epoch 2534/5000
80/80 - 32s 399ms/step - loss: 0.0993 - acc: 0.9929 - val_loss: 0.4002 - val_acc: 0.9226
Epoch 2535/5000
80/80 - 32s 399ms/step - loss: 0.0957 - acc: 0.9943 - val_loss: 0.4116 - val_acc: 0.9192
Epoch 2536/5000
80/80 - 32s 399ms/step - loss: 0.0985 - acc: 0.9933 - val_loss: 0.4214 - val_acc: 0.9164
Epoch 2537/5000
80/80 - 32s 399ms/step - loss: 0.0967 - acc: 0.9940 - val_loss: 0.4219 - val_acc: 0.9163
Epoch 2538/5000
80/80 - 32s 400ms/step - loss: 0.0963 - acc: 0.9940 - val_loss: 0.4312 - val_acc: 0.9154
Epoch 2539/5000
80/80 - 32s 399ms/step - loss: 0.0959 - acc: 0.9943 - val_loss: 0.4308 - val_acc: 0.9166
Epoch 2540/5000
80/80 - 32s 400ms/step - loss: 0.0983 - acc: 0.9933 - val_loss: 0.4163 - val_acc: 0.9174
Epoch 2541/5000
80/80 - 32s 400ms/step - loss: 0.0973 - acc: 0.9935 - val_loss: 0.4336 - val_acc: 0.9172
Epoch 2542/5000
80/80 - 32s 400ms/step - loss: 0.0987 - acc: 0.9936 - val_loss: 0.4197 - val_acc: 0.9201
Epoch 2543/5000
80/80 - 32s 400ms/step - loss: 0.1006 - acc: 0.9925 - val_loss: 0.4098 - val_acc: 0.9219
Epoch 2544/5000
80/80 - 32s 400ms/step - loss: 0.0985 - acc: 0.9931 - val_loss: 0.4044 - val_acc: 0.9214
Epoch 2545/5000
80/80 - 32s 399ms/step - loss: 0.0997 - acc: 0.9929 - val_loss: 0.3958 - val_acc: 0.9251
Epoch 2546/5000
80/80 - 32s 400ms/step - loss: 0.0980 - acc: 0.9936 - val_loss: 0.3987 - val_acc: 0.9220
Epoch 2547/5000
80/80 - 32s 399ms/step - loss: 0.0985 - acc: 0.9930 - val_loss: 0.4155 - val_acc: 0.9144
Epoch 2548/5000
80/80 - 32s 399ms/step - loss: 0.0987 - acc: 0.9929 - val_loss: 0.4052 - val_acc: 0.9216
Epoch 2549/5000
80/80 - 32s 399ms/step - loss: 0.0961 - acc: 0.9943 - val_loss: 0.4019 - val_acc: 0.9248
Epoch 2550/5000
80/80 - 32s 399ms/step - loss: 0.0990 - acc: 0.9932 - val_loss: 0.4112 - val_acc: 0.9186
Epoch 2551/5000
80/80 - 32s 399ms/step - loss: 0.0970 - acc: 0.9937 - val_loss: 0.4091 - val_acc: 0.9190
Epoch 2552/5000
80/80 - 32s 399ms/step - loss: 0.0972 - acc: 0.9942 - val_loss: 0.4164 - val_acc: 0.9191
Epoch 2553/5000
80/80 - 32s 400ms/step - loss: 0.0971 - acc: 0.9939 - val_loss: 0.4206 - val_acc: 0.9156
Epoch 2554/5000
80/80 - 32s 400ms/step - loss: 0.0971 - acc: 0.9937 - val_loss: 0.4245 - val_acc: 0.9163
Epoch 2555/5000
80/80 - 32s 400ms/step - loss: 0.0988 - acc: 0.9927 - val_loss: 0.4192 - val_acc: 0.9191
Epoch 2556/5000
80/80 - 32s 399ms/step - loss: 0.0990 - acc: 0.9931 - val_loss: 0.4244 - val_acc: 0.9177
Epoch 2557/5000
80/80 - 32s 399ms/step - loss: 0.1005 - acc: 0.9928 - val_loss: 0.4160 - val_acc: 0.9195
Epoch 2558/5000
80/80 - 32s 399ms/step - loss: 0.0993 - acc: 0.9930 - val_loss: 0.4175 - val_acc: 0.9154
Epoch 2559/5000
80/80 - 32s 399ms/step - loss: 0.0994 - acc: 0.9929 - val_loss: 0.4214 - val_acc: 0.9157
Epoch 2560/5000
80/80 - 32s 399ms/step - loss: 0.0993 - acc: 0.9934 - val_loss: 0.4160 - val_acc: 0.9203
Epoch 2561/5000
80/80 - 32s 400ms/step - loss: 0.0990 - acc: 0.9931 - val_loss: 0.4251 - val_acc: 0.9156
Epoch 2562/5000
80/80 - 32s 399ms/step - loss: 0.0998 - acc: 0.9929 - val_loss: 0.4163 - val_acc: 0.9160
Epoch 2563/5000
80/80 - 32s 400ms/step - loss: 0.0980 - acc: 0.9933 - val_loss: 0.4083 - val_acc: 0.9204
Epoch 2564/5000
80/80 - 32s 400ms/step - loss: 0.0986 - acc: 0.9933 - val_loss: 0.4211 - val_acc: 0.9177
Epoch 2565/5000
80/80 - 32s 399ms/step - loss: 0.0982 - acc: 0.9929 - val_loss: 0.3990 - val_acc: 0.9198
Epoch 2566/5000
80/80 - 32s 400ms/step - loss: 0.1006 - acc: 0.9928 - val_loss: 0.4059 - val_acc: 0.9200
Epoch 2567/5000
80/80 - 32s 400ms/step - loss: 0.0982 - acc: 0.9936 - val_loss: 0.4255 - val_acc: 0.9172
Epoch 2568/5000
80/80 - 32s 399ms/step - loss: 0.0988 - acc: 0.9928 - val_loss: 0.4167 - val_acc: 0.9186
Epoch 2569/5000
80/80 - 32s 400ms/step - loss: 0.0982 - acc: 0.9937 - val_loss: 0.4128 - val_acc: 0.9216
Epoch 2570/5000
80/80 - 32s 400ms/step - loss: 0.1007 - acc: 0.9925 - val_loss: 0.4126 - val_acc: 0.9193
Epoch 2571/5000
80/80 - 32s 399ms/step - loss: 0.1005 - acc: 0.9923 - val_loss: 0.4234 - val_acc: 0.9143
Epoch 2572/5000
80/80 - 32s 400ms/step - loss: 0.0999 - acc: 0.9928 - val_loss: 0.4101 - val_acc: 0.9206
...
Epoch 2862/5000
80/80 - 32s 399ms/step - loss: 0.0955 - acc: 0.9940 - val_loss: 0.4041 - val_acc: 0.9193
Epoch 2863/5000
80/80 - 32s 400ms/step - loss: 0.0962 - acc: 0.9938 - val_loss: 0.4191 - val_acc: 0.9191
Epoch 2864/5000
80/80 - 32s 400ms/step - loss: 0.0973 - acc: 0.9931 - val_loss: 0.4045 - val_acc: 0.9201
Epoch 2865/5000
80/80 - 32s 399ms/step - loss: 0.0954 - acc: 0.9943 - val_loss: 0.3977 - val_acc: 0.9211
Epoch 2866/5000
80/80 - 32s 400ms/step - loss: 0.0961 - acc: 0.9938 - val_loss: 0.3989 - val_acc: 0.9227
Epoch 2867/5000
80/80 - 32s 400ms/step - loss: 0.0953 - acc: 0.9938 - val_loss: 0.3939 - val_acc: 0.9250
Epoch 2868/5000
80/80 - 32s 400ms/step - loss: 0.0979 - acc: 0.9930 - val_loss: 0.4125 - val_acc: 0.9216
Epoch 2869/5000
80/80 - 32s 400ms/step - loss: 0.0980 - acc: 0.9930 - val_loss: 0.4048 - val_acc: 0.9215
Epoch 2870/5000
80/80 - 32s 400ms/step - loss: 0.0993 - acc: 0.9924 - val_loss: 0.4179 - val_acc: 0.9196
Epoch 2871/5000
80/80 - 32s 399ms/step - loss: 0.0972 - acc: 0.9932 - val_loss: 0.4137 - val_acc: 0.9187
Epoch 2872/5000
80/80 - 32s 399ms/step - loss: 0.0984 - acc: 0.9931 - val_loss: 0.4157 - val_acc: 0.9195
Epoch 2873/5000
80/80 - 32s 400ms/step - loss: 0.0962 - acc: 0.9936 - val_loss: 0.4047 - val_acc: 0.9201
Epoch 2874/5000
80/80 - 32s 400ms/step - loss: 0.0961 - acc: 0.9935 - val_loss: 0.4069 - val_acc: 0.9214
Epoch 2875/5000
80/80 - 32s 400ms/step - loss: 0.0955 - acc: 0.9936 - val_loss: 0.4101 - val_acc: 0.9211
Epoch 2876/5000
80/80 - 32s 400ms/step - loss: 0.0973 - acc: 0.9933 - val_loss: 0.4001 - val_acc: 0.9206
Epoch 2877/5000
80/80 - 32s 400ms/step - loss: 0.0964 - acc: 0.9935 - val_loss: 0.4101 - val_acc: 0.9204
Epoch 2878/5000
80/80 - 32s 400ms/step - loss: 0.0967 - acc: 0.9934 - val_loss: 0.4193 - val_acc: 0.9174
Epoch 2879/5000
80/80 - 32s 400ms/step - loss: 0.0972 - acc: 0.9933 - val_loss: 0.4158 - val_acc: 0.9175
Epoch 2880/5000
80/80 - 32s 399ms/step - loss: 0.0972 - acc: 0.9932 - val_loss: 0.4232 - val_acc: 0.9165
Epoch 2881/5000
80/80 - 32s 399ms/step - loss: 0.0967 - acc: 0.9937 - val_loss: 0.4191 - val_acc: 0.9158
Epoch 2882/5000
80/80 - 32s 400ms/step - loss: 0.0976 - acc: 0.9931 - val_loss: 0.4062 - val_acc: 0.9201
Epoch 2883/5000
80/80 - 32s 400ms/step - loss: 0.0976 - acc: 0.9929 - val_loss: 0.4061 - val_acc: 0.9226
Epoch 2884/5000
80/80 - 32s 400ms/step - loss: 0.0970 - acc: 0.9933 - val_loss: 0.4124 - val_acc: 0.9204
Epoch 2885/5000
80/80 - 32s 400ms/step - loss: 0.0980 - acc: 0.9929 - val_loss: 0.4139 - val_acc: 0.9208
Epoch 2886/5000
80/80 - 32s 400ms/step - loss: 0.0999 - acc: 0.9925 - val_loss: 0.3985 - val_acc: 0.9198
Epoch 2887/5000
80/80 - 32s 399ms/step - loss: 0.0984 - acc: 0.9931 - val_loss: 0.4021 - val_acc: 0.9228
Epoch 2888/5000
80/80 - 32s 399ms/step - loss: 0.0941 - acc: 0.9943 - val_loss: 0.3999 - val_acc: 0.9217
Epoch 2889/5000
80/80 - 32s 400ms/step - loss: 0.0963 - acc: 0.9938 - val_loss: 0.4054 - val_acc: 0.9195
Epoch 2890/5000
80/80 - 32s 400ms/step - loss: 0.0997 - acc: 0.9925 - val_loss: 0.4106 - val_acc: 0.9215
Epoch 2891/5000
80/80 - 32s 400ms/step - loss: 0.0988 - acc: 0.9930 - val_loss: 0.4006 - val_acc: 0.9206
Epoch 2892/5000
80/80 - 32s 400ms/step - loss: 0.0969 - acc: 0.9934 - val_loss: 0.4031 - val_acc: 0.9217
Epoch 2893/5000
80/80 - 32s 400ms/step - loss: 0.0981 - acc: 0.9931 - val_loss: 0.4196 - val_acc: 0.9153
Epoch 2894/5000
80/80 - 32s 399ms/step - loss: 0.0960 - acc: 0.9937 - val_loss: 0.4029 - val_acc: 0.9210
Epoch 2895/5000
80/80 - 32s 399ms/step - loss: 0.0943 - acc: 0.9942 - val_loss: 0.4019 - val_acc: 0.9227
Epoch 2896/5000
80/80 - 32s 400ms/step - loss: 0.0972 - acc: 0.9936 - val_loss: 0.4055 - val_acc: 0.9214
Epoch 2897/5000
80/80 - 32s 400ms/step - loss: 0.0954 - acc: 0.9936 - val_loss: 0.4086 - val_acc: 0.9233
Epoch 2898/5000
80/80 - 32s 400ms/step - loss: 0.0970 - acc: 0.9931 - val_loss: 0.4137 - val_acc: 0.9195
Epoch 2899/5000
80/80 - 32s 399ms/step - loss: 0.0993 - acc: 0.9927 - val_loss: 0.3915 - val_acc: 0.9229
Epoch 2900/5000
80/80 - 32s 400ms/step - loss: 0.0993 - acc: 0.9928 - val_loss: 0.3793 - val_acc: 0.9265
Epoch 2901/5000
80/80 - 32s 399ms/step - loss: 0.0956 - acc: 0.9936 - val_loss: 0.3990 - val_acc: 0.9229
Epoch 2902/5000
80/80 - 32s 399ms/step - loss: 0.0976 - acc: 0.9929 - val_loss: 0.4090 - val_acc: 0.9201
Epoch 2903/5000
80/80 - 32s 400ms/step - loss: 0.0959 - acc: 0.9937 - val_loss: 0.4035 - val_acc: 0.9232
Epoch 2904/5000
80/80 - 32s 399ms/step - loss: 0.0982 - acc: 0.9930 - val_loss: 0.4085 - val_acc: 0.9210
Epoch 2905/5000
80/80 - 32s 401ms/step - loss: 0.0955 - acc: 0.9936 - val_loss: 0.4115 - val_acc: 0.9212
Epoch 2906/5000
80/80 - 32s 400ms/step - loss: 0.0993 - acc: 0.9926 - val_loss: 0.4004 - val_acc: 0.9202
Epoch 2907/5000
80/80 - 32s 399ms/step - loss: 0.0966 - acc: 0.9938 - val_loss: 0.3916 - val_acc: 0.9217
Epoch 2908/5000
80/80 - 32s 399ms/step - loss: 0.0946 - acc: 0.9940 - val_loss: 0.4052 - val_acc: 0.9215
Epoch 2909/5000
80/80 - 32s 400ms/step - loss: 0.0953 - acc: 0.9941 - val_loss: 0.4041 - val_acc: 0.9218
Epoch 2910/5000
80/80 - 32s 400ms/step - loss: 0.0957 - acc: 0.9938 - val_loss: 0.4080 - val_acc: 0.9225
Epoch 2911/5000
80/80 - 32s 400ms/step - loss: 0.0984 - acc: 0.9928 - val_loss: 0.4084 - val_acc: 0.9204
Epoch 2912/5000
80/80 - 32s 400ms/step - loss: 0.0957 - acc: 0.9937 - val_loss: 0.4140 - val_acc: 0.9189
Epoch 2913/5000
80/80 - 32s 400ms/step - loss: 0.0974 - acc: 0.9932 - val_loss: 0.3915 - val_acc: 0.9228
Epoch 2914/5000
80/80 - 32s 400ms/step - loss: 0.0968 - acc: 0.9939 - val_loss: 0.3995 - val_acc: 0.9218
Epoch 2915/5000
80/80 - 32s 400ms/step - loss: 0.0965 - acc: 0.9936 - val_loss: 0.4201 - val_acc: 0.9200
Epoch 2916/5000
80/80 - 32s 400ms/step - loss: 0.0967 - acc: 0.9937 - val_loss: 0.3995 - val_acc: 0.9223
Epoch 2917/5000
80/80 - 32s 400ms/step - loss: 0.0959 - acc: 0.9937 - val_loss: 0.4102 - val_acc: 0.9200
Epoch 2918/5000
80/80 - 32s 400ms/step - loss: 0.0955 - acc: 0.9939 - val_loss: 0.4134 - val_acc: 0.9197
Epoch 2919/5000
80/80 - 32s 400ms/step - loss: 0.0966 - acc: 0.9936 - val_loss: 0.4088 - val_acc: 0.9232
Epoch 2920/5000
80/80 - 32s 399ms/step - loss: 0.0969 - acc: 0.9930 - val_loss: 0.4175 - val_acc: 0.9202
Epoch 2921/5000
80/80 - 32s 399ms/step - loss: 0.0966 - acc: 0.9935 - val_loss: 0.4059 - val_acc: 0.9207
Epoch 2922/5000
80/80 - 32s 400ms/step - loss: 0.0983 - acc: 0.9927 - val_loss: 0.4131 - val_acc: 0.9187
Epoch 2923/5000
80/80 - 32s 400ms/step - loss: 0.0975 - acc: 0.9933 - val_loss: 0.4185 - val_acc: 0.9204
Epoch 2924/5000
80/80 - 32s 404ms/step - loss: 0.0952 - acc: 0.9938 - val_loss: 0.4081 - val_acc: 0.9242
Epoch 2925/5000
80/80 - 32s 400ms/step - loss: 0.0971 - acc: 0.9933 - val_loss: 0.4269 - val_acc: 0.9153
Epoch 2926/5000
80/80 - 32s 400ms/step - loss: 0.0981 - acc: 0.9926 - val_loss: 0.4256 - val_acc: 0.9169
Epoch 2927/5000
80/80 - 32s 400ms/step - loss: 0.0960 - acc: 0.9934 - val_loss: 0.4056 - val_acc: 0.9221
Epoch 2928/5000
80/80 - 32s 400ms/step - loss: 0.0971 - acc: 0.9932 - val_loss: 0.4175 - val_acc: 0.9180
Epoch 2929/5000
80/80 - 32s 400ms/step - loss: 0.0962 - acc: 0.9937 - val_loss: 0.4009 - val_acc: 0.9214
Epoch 2930/5000
80/80 - 32s 400ms/step - loss: 0.0960 - acc: 0.9935 - val_loss: 0.3967 - val_acc: 0.9223
Epoch 2931/5000
80/80 - 32s 400ms/step - loss: 0.0962 - acc: 0.9940 - val_loss: 0.4220 - val_acc: 0.9176
Epoch 2932/5000
80/80 - 32s 401ms/step - loss: 0.0975 - acc: 0.9928 - val_loss: 0.4027 - val_acc: 0.9214
Epoch 2933/5000
80/80 - 32s 400ms/step - loss: 0.0983 - acc: 0.9927 - val_loss: 0.4145 - val_acc: 0.9176
Epoch 2934/5000
80/80 - 32s 400ms/step - loss: 0.0943 - acc: 0.9946 - val_loss: 0.4156 - val_acc: 0.9179
Epoch 2935/5000
80/80 - 32s 400ms/step - loss: 0.0959 - acc: 0.9933 - val_loss: 0.4066 - val_acc: 0.9221
Epoch 2936/5000
80/80 - 32s 400ms/step - loss: 0.0967 - acc: 0.9934 - val_loss: 0.4068 - val_acc: 0.9212
Epoch 2937/5000
80/80 - 32s 400ms/step - loss: 0.0978 - acc: 0.9927 - val_loss: 0.4241 - val_acc: 0.9167
Epoch 2938/5000
80/80 - 32s 400ms/step - loss: 0.0980 - acc: 0.9931 - val_loss: 0.4394 - val_acc: 0.9145
Epoch 2939/5000
80/80 - 32s 400ms/step - loss: 0.0990 - acc: 0.9931 - val_loss: 0.4005 - val_acc: 0.9216
Epoch 2940/5000
80/80 - 32s 399ms/step - loss: 0.0986 - acc: 0.9927 - val_loss: 0.4044 - val_acc: 0.9193
Epoch 2941/5000
80/80 - 32s 400ms/step - loss: 0.0954 - acc: 0.9942 - val_loss: 0.4181 - val_acc: 0.9186
Epoch 2942/5000
80/80 - 32s 400ms/step - loss: 0.0958 - acc: 0.9937 - val_loss: 0.4147 - val_acc: 0.9203
Epoch 2943/5000
80/80 - 32s 399ms/step - loss: 0.0951 - acc: 0.9942 - val_loss: 0.4275 - val_acc: 0.9175
Epoch 2944/5000
80/80 - 32s 400ms/step - loss: 0.0939 - acc: 0.9944 - val_loss: 0.4291 - val_acc: 0.9166
Epoch 2945/5000
80/80 - 32s 400ms/step - loss: 0.0982 - acc: 0.9928 - val_loss: 0.4092 - val_acc: 0.9196
Epoch 2946/5000
80/80 - 32s 399ms/step - loss: 0.0978 - acc: 0.9932 - val_loss: 0.4252 - val_acc: 0.9205
Epoch 2947/5000
80/80 - 32s 400ms/step - loss: 0.0966 - acc: 0.9937 - val_loss: 0.4080 - val_acc: 0.9209
Epoch 2948/5000
80/80 - 32s 400ms/step - loss: 0.0977 - acc: 0.9930 - val_loss: 0.4041 - val_acc: 0.9204
Epoch 2949/5000
80/80 - 32s 399ms/step - loss: 0.0955 - acc: 0.9936 - val_loss: 0.4146 - val_acc: 0.9192
Epoch 2950/5000
80/80 - 32s 399ms/step - loss: 0.1007 - acc: 0.9920 - val_loss: 0.4123 - val_acc: 0.9200
Epoch 2951/5000
80/80 - 32s 399ms/step - loss: 0.0988 - acc: 0.9930 - val_loss: 0.4133 - val_acc: 0.9194
Epoch 2952/5000
80/80 - 32s 399ms/step - loss: 0.0972 - acc: 0.9929 - val_loss: 0.4094 - val_acc: 0.9190
Epoch 2953/5000
80/80 - 32s 400ms/step - loss: 0.0969 - acc: 0.9933 - val_loss: 0.3973 - val_acc: 0.9224
Epoch 2954/5000
80/80 - 32s 399ms/step - loss: 0.0951 - acc: 0.9938 - val_loss: 0.4130 - val_acc: 0.9218
Epoch 2955/5000
80/80 - 32s 400ms/step - loss: 0.0978 - acc: 0.9930 - val_loss: 0.4078 - val_acc: 0.9207
Epoch 2956/5000
80/80 - 32s 399ms/step - loss: 0.0967 - acc: 0.9937 - val_loss: 0.4102 - val_acc: 0.9188
Epoch 2957/5000
80/80 - 32s 400ms/step - loss: 0.1000 - acc: 0.9921 - val_loss: 0.4272 - val_acc: 0.9164
Epoch 2958/5000
80/80 - 32s 400ms/step - loss: 0.0979 - acc: 0.9932 - val_loss: 0.4090 - val_acc: 0.9187
Epoch 2959/5000
80/80 - 32s 400ms/step - loss: 0.0963 - acc: 0.9932 - val_loss: 0.4071 - val_acc: 0.9222
Epoch 2960/5000
80/80 - 32s 400ms/step - loss: 0.0958 - acc: 0.9940 - val_loss: 0.4108 - val_acc: 0.9208
Epoch 2961/5000
80/80 - 32s 399ms/step - loss: 0.0949 - acc: 0.9942 - val_loss: 0.4117 - val_acc: 0.9184
Epoch 2962/5000
80/80 - 32s 399ms/step - loss: 0.0939 - acc: 0.9948 - val_loss: 0.4099 - val_acc: 0.9209
Epoch 2963/5000
80/80 - 32s 400ms/step - loss: 0.0957 - acc: 0.9938 - val_loss: 0.4137 - val_acc: 0.9195
Epoch 2964/5000
80/80 - 32s 400ms/step - loss: 0.0970 - acc: 0.9934 - val_loss: 0.3981 - val_acc: 0.9202
Epoch 2965/5000
80/80 - 32s 399ms/step - loss: 0.0961 - acc: 0.9941 - val_loss: 0.4239 - val_acc: 0.9195
Epoch 2966/5000
80/80 - 32s 399ms/step - loss: 0.0958 - acc: 0.9939 - val_loss: 0.4232 - val_acc: 0.9203
Epoch 2967/5000
80/80 - 32s 400ms/step - loss: 0.0972 - acc: 0.9934 - val_loss: 0.4074 - val_acc: 0.9216
Epoch 2968/5000
80/80 - 32s 400ms/step - loss: 0.0953 - acc: 0.9936 - val_loss: 0.4118 - val_acc: 0.9193
Epoch 2969/5000
80/80 - 32s 399ms/step - loss: 0.0973 - acc: 0.9930 - val_loss: 0.4131 - val_acc: 0.9198
Epoch 2970/5000
80/80 - 32s 400ms/step - loss: 0.0958 - acc: 0.9940 - val_loss: 0.4189 - val_acc: 0.9160
Epoch 2971/5000
80/80 - 32s 400ms/step - loss: 0.0980 - acc: 0.9930 - val_loss: 0.4057 - val_acc: 0.9209
Epoch 2972/5000
80/80 - 32s 400ms/step - loss: 0.0961 - acc: 0.9931 - val_loss: 0.4205 - val_acc: 0.9179
Epoch 2973/5000
80/80 - 32s 399ms/step - loss: 0.0967 - acc: 0.9937 - val_loss: 0.4125 - val_acc: 0.9224
Epoch 2974/5000
80/80 - 32s 400ms/step - loss: 0.0972 - acc: 0.9934 - val_loss: 0.3937 - val_acc: 0.9228
Epoch 2975/5000
80/80 - 32s 399ms/step - loss: 0.0983 - acc: 0.9927 - val_loss: 0.4062 - val_acc: 0.9212
Epoch 2976/5000
80/80 - 32s 399ms/step - loss: 0.0957 - acc: 0.9937 - val_loss: 0.4155 - val_acc: 0.9193
Epoch 2977/5000
80/80 - 32s 400ms/step - loss: 0.0947 - acc: 0.9940 - val_loss: 0.4231 - val_acc: 0.9165
Epoch 2978/5000
80/80 - 32s 400ms/step - loss: 0.0957 - acc: 0.9934 - val_loss: 0.4168 - val_acc: 0.9203
Epoch 2979/5000
80/80 - 32s 400ms/step - loss: 0.0962 - acc: 0.9938 - val_loss: 0.3969 - val_acc: 0.9249
Epoch 2980/5000
80/80 - 32s 400ms/step - loss: 0.0966 - acc: 0.9933 - val_loss: 0.4029 - val_acc: 0.9222
Epoch 2981/5000
80/80 - 32s 400ms/step - loss: 0.0958 - acc: 0.9936 - val_loss: 0.4147 - val_acc: 0.9203
Epoch 2982/5000
80/80 - 32s 400ms/step - loss: 0.0953 - acc: 0.9937 - val_loss: 0.4105 - val_acc: 0.9237
Epoch 2983/5000
80/80 - 32s 400ms/step - loss: 0.0978 - acc: 0.9933 - val_loss: 0.4073 - val_acc: 0.9219
Epoch 2984/5000
80/80 - 32s 399ms/step - loss: 0.0992 - acc: 0.9927 - val_loss: 0.4007 - val_acc: 0.9206
Epoch 2985/5000
80/80 - 32s 400ms/step - loss: 0.0979 - acc: 0.9926 - val_loss: 0.3869 - val_acc: 0.9237
Epoch 2986/5000
80/80 - 32s 399ms/step - loss: 0.0965 - acc: 0.9934 - val_loss: 0.4219 - val_acc: 0.9190
Epoch 2987/5000
80/80 - 32s 399ms/step - loss: 0.0957 - acc: 0.9937 - val_loss: 0.4208 - val_acc: 0.9186
Epoch 2988/5000
80/80 - 32s 399ms/step - loss: 0.0973 - acc: 0.9932 - val_loss: 0.4147 - val_acc: 0.9178
Epoch 2989/5000
80/80 - 32s 399ms/step - loss: 0.0952 - acc: 0.9937 - val_loss: 0.4166 - val_acc: 0.9193
Epoch 2990/5000
80/80 - 32s 400ms/step - loss: 0.0975 - acc: 0.9930 - val_loss: 0.4258 - val_acc: 0.9171
Epoch 2991/5000
80/80 - 32s 399ms/step - loss: 0.0943 - acc: 0.9941 - val_loss: 0.4181 - val_acc: 0.9171
Epoch 2992/5000
80/80 - 32s 400ms/step - loss: 0.0953 - acc: 0.9939 - val_loss: 0.4198 - val_acc: 0.9200
Epoch 2993/5000
80/80 - 32s 399ms/step - loss: 0.0972 - acc: 0.9930 - val_loss: 0.4107 - val_acc: 0.9197
Epoch 2994/5000
80/80 - 32s 399ms/step - loss: 0.0968 - acc: 0.9933 - val_loss: 0.4157 - val_acc: 0.9169
Epoch 2995/5000
80/80 - 32s 399ms/step - loss: 0.0942 - acc: 0.9946 - val_loss: 0.4168 - val_acc: 0.9182
Epoch 2996/5000
80/80 - 32s 400ms/step - loss: 0.0985 - acc: 0.9928 - val_loss: 0.3975 - val_acc: 0.9207
Epoch 2997/5000
80/80 - 32s 399ms/step - loss: 0.0959 - acc: 0.9935 - val_loss: 0.3998 - val_acc: 0.9207
Epoch 2998/5000
80/80 - 32s 400ms/step - loss: 0.0964 - acc: 0.9934 - val_loss: 0.4053 - val_acc: 0.9188
Epoch 2999/5000
80/80 - 32s 400ms/step - loss: 0.0964 - acc: 0.9937 - val_loss: 0.4064 - val_acc: 0.9224
Epoch 3000/5000
80/80 - 32s 400ms/step - loss: 0.0980 - acc: 0.9927 - val_loss: 0.3968 - val_acc: 0.9236
Epoch 3001/5000
lr changed to 0.0009999999776482583
80/80 - 32s 400ms/step - loss: 0.0968 - acc: 0.9934 - val_loss: 0.3838 - val_acc: 0.9253
Epoch 3002/5000
80/80 - 32s 400ms/step - loss: 0.0884 - acc: 0.9963 - val_loss: 0.3789 - val_acc: 0.9271
Epoch 3003/5000
80/80 - 32s 400ms/step - loss: 0.0875 - acc: 0.9969 - val_loss: 0.3780 - val_acc: 0.9271
Epoch 3004/5000
80/80 - 32s 400ms/step - loss: 0.0847 - acc: 0.9976 - val_loss: 0.3759 - val_acc: 0.9278
Epoch 3005/5000
80/80 - 32s 400ms/step - loss: 0.0851 - acc: 0.9976 - val_loss: 0.3754 - val_acc: 0.9282
Epoch 3006/5000
80/80 - 32s 400ms/step - loss: 0.0834 - acc: 0.9981 - val_loss: 0.3740 - val_acc: 0.9288
Epoch 3007/5000
80/80 - 32s 400ms/step - loss: 0.0830 - acc: 0.9982 - val_loss: 0.3733 - val_acc: 0.9300
Epoch 3008/5000
80/80 - 32s 399ms/step - loss: 0.0831 - acc: 0.9982 - val_loss: 0.3756 - val_acc: 0.9301
Epoch 3009/5000
80/80 - 32s 400ms/step - loss: 0.0821 - acc: 0.9984 - val_loss: 0.3762 - val_acc: 0.9308
Epoch 3010/5000
80/80 - 32s 399ms/step - loss: 0.0824 - acc: 0.9982 - val_loss: 0.3773 - val_acc: 0.9300
Epoch 3011/5000
80/80 - 32s 400ms/step - loss: 0.0818 - acc: 0.9984 - val_loss: 0.3797 - val_acc: 0.9299
Epoch 3012/5000
80/80 - 32s 400ms/step - loss: 0.0821 - acc: 0.9985 - val_loss: 0.3804 - val_acc: 0.9294
Epoch 3013/5000
80/80 - 32s 400ms/step - loss: 0.0812 - acc: 0.9988 - val_loss: 0.3811 - val_acc: 0.9288
Epoch 3014/5000
80/80 - 32s 400ms/step - loss: 0.0819 - acc: 0.9984 - val_loss: 0.3810 - val_acc: 0.9300
Epoch 3015/5000
80/80 - 32s 400ms/step - loss: 0.0812 - acc: 0.9987 - val_loss: 0.3800 - val_acc: 0.9302
Epoch 3016/5000
80/80 - 32s 400ms/step - loss: 0.0813 - acc: 0.9989 - val_loss: 0.3814 - val_acc: 0.9299
Epoch 3017/5000
80/80 - 32s 400ms/step - loss: 0.0811 - acc: 0.9988 - val_loss: 0.3826 - val_acc: 0.9297
Epoch 3018/5000
80/80 - 32s 400ms/step - loss: 0.0807 - acc: 0.9988 - val_loss: 0.3823 - val_acc: 0.9307
Epoch 3019/5000
80/80 - 32s 400ms/step - loss: 0.0808 - acc: 0.9987 - val_loss: 0.3813 - val_acc: 0.9305
Epoch 3020/5000
80/80 - 32s 400ms/step - loss: 0.0810 - acc: 0.9988 - val_loss: 0.3836 - val_acc: 0.9296
Epoch 3021/5000
80/80 - 32s 400ms/step - loss: 0.0809 - acc: 0.9987 - val_loss: 0.3835 - val_acc: 0.9307
Epoch 3022/5000
80/80 - 32s 400ms/step - loss: 0.0803 - acc: 0.9988 - val_loss: 0.3824 - val_acc: 0.9301
Epoch 3023/5000
80/80 - 32s 399ms/step - loss: 0.0807 - acc: 0.9989 - val_loss: 0.3825 - val_acc: 0.9303
Epoch 3024/5000
80/80 - 32s 404ms/step - loss: 0.0802 - acc: 0.9990 - val_loss: 0.3835 - val_acc: 0.9301
Epoch 3025/5000
80/80 - 32s 400ms/step - loss: 0.0801 - acc: 0.9989 - val_loss: 0.3835 - val_acc: 0.9294
Epoch 3026/5000
80/80 - 32s 400ms/step - loss: 0.0806 - acc: 0.9986 - val_loss: 0.3824 - val_acc: 0.9306
Epoch 3027/5000
80/80 - 32s 400ms/step - loss: 0.0805 - acc: 0.9989 - val_loss: 0.3806 - val_acc: 0.9304
Epoch 3028/5000
80/80 - 32s 400ms/step - loss: 0.0799 - acc: 0.9991 - val_loss: 0.3837 - val_acc: 0.9303
Epoch 3029/5000
80/80 - 32s 400ms/step - loss: 0.0799 - acc: 0.9990 - val_loss: 0.3835 - val_acc: 0.9302
Epoch 3030/5000
80/80 - 32s 400ms/step - loss: 0.0795 - acc: 0.9991 - val_loss: 0.3831 - val_acc: 0.9312
Epoch 3031/5000
80/80 - 32s 400ms/step - loss: 0.0803 - acc: 0.9989 - val_loss: 0.3833 - val_acc: 0.9312
Epoch 3032/5000
80/80 - 32s 401ms/step - loss: 0.0803 - acc: 0.9989 - val_loss: 0.3868 - val_acc: 0.9298
Epoch 3033/5000
80/80 - 32s 400ms/step - loss: 0.0792 - acc: 0.9991 - val_loss: 0.3878 - val_acc: 0.9300
Epoch 3034/5000
80/80 - 32s 400ms/step - loss: 0.0795 - acc: 0.9990 - val_loss: 0.3866 - val_acc: 0.9302
Epoch 3035/5000
80/80 - 32s 400ms/step - loss: 0.0801 - acc: 0.9988 - val_loss: 0.3845 - val_acc: 0.9308
Epoch 3036/5000
80/80 - 32s 400ms/step - loss: 0.0791 - acc: 0.9993 - val_loss: 0.3815 - val_acc: 0.9313
Epoch 3037/5000
80/80 - 32s 400ms/step - loss: 0.0790 - acc: 0.9992 - val_loss: 0.3807 - val_acc: 0.9312
Epoch 3038/5000
80/80 - 32s 400ms/step - loss: 0.0797 - acc: 0.9990 - val_loss: 0.3808 - val_acc: 0.9317
Epoch 3039/5000
80/80 - 32s 400ms/step - loss: 0.0795 - acc: 0.9991 - val_loss: 0.3818 - val_acc: 0.9311
Epoch 3040/5000
80/80 - 32s 400ms/step - loss: 0.0794 - acc: 0.9990 - val_loss: 0.3828 - val_acc: 0.9319
Epoch 3041/5000
80/80 - 32s 400ms/step - loss: 0.0797 - acc: 0.9988 - val_loss: 0.3847 - val_acc: 0.9315
Epoch 3042/5000
80/80 - 32s 399ms/step - loss: 0.0792 - acc: 0.9991 - val_loss: 0.3845 - val_acc: 0.9312
Epoch 3043/5000
80/80 - 32s 400ms/step - loss: 0.0790 - acc: 0.9992 - val_loss: 0.3861 - val_acc: 0.9310
Epoch 3044/5000
80/80 - 32s 400ms/step - loss: 0.0791 - acc: 0.9990 - val_loss: 0.3872 - val_acc: 0.9309
Epoch 3045/5000
80/80 - 32s 399ms/step - loss: 0.0799 - acc: 0.9988 - val_loss: 0.3869 - val_acc: 0.9317
Epoch 3046/5000
80/80 - 32s 398ms/step - loss: 0.0791 - acc: 0.9991 - val_loss: 0.3870 - val_acc: 0.9324
Epoch 3047/5000
80/80 - 32s 399ms/step - loss: 0.0789 - acc: 0.9992 - val_loss: 0.3876 - val_acc: 0.9326
Epoch 3048/5000
80/80 - 32s 400ms/step - loss: 0.0787 - acc: 0.9991 - val_loss: 0.3884 - val_acc: 0.9321
Epoch 3049/5000
80/80 - 32s 400ms/step - loss: 0.0786 - acc: 0.9993 - val_loss: 0.3874 - val_acc: 0.9322
Epoch 3050/5000
80/80 - 32s 400ms/step - loss: 0.0789 - acc: 0.9991 - val_loss: 0.3894 - val_acc: 0.9316
Epoch 3051/5000
80/80 - 32s 400ms/step - loss: 0.0791 - acc: 0.9992 - val_loss: 0.3875 - val_acc: 0.9311
Epoch 3052/5000
80/80 - 32s 399ms/step - loss: 0.0787 - acc: 0.9991 - val_loss: 0.3849 - val_acc: 0.9318
Epoch 3053/5000
80/80 - 32s 399ms/step - loss: 0.0787 - acc: 0.9991 - val_loss: 0.3865 - val_acc: 0.9325
Epoch 3054/5000
80/80 - 32s 400ms/step - loss: 0.0785 - acc: 0.9992 - val_loss: 0.3880 - val_acc: 0.9316
Epoch 3055/5000
80/80 - 32s 399ms/step - loss: 0.0782 - acc: 0.9994 - val_loss: 0.3899 - val_acc: 0.9322
Epoch 3056/5000
80/80 - 32s 400ms/step - loss: 0.0787 - acc: 0.9991 - val_loss: 0.3896 - val_acc: 0.9315
Epoch 3057/5000
80/80 - 32s 399ms/step - loss: 0.0783 - acc: 0.9992 - val_loss: 0.3895 - val_acc: 0.9322
Epoch 3058/5000
80/80 - 32s 400ms/step - loss: 0.0781 - acc: 0.9994 - val_loss: 0.3903 - val_acc: 0.9310
Epoch 3059/5000
80/80 - 32s 399ms/step - loss: 0.0785 - acc: 0.9991 - val_loss: 0.3909 - val_acc: 0.9315
Epoch 3060/5000
80/80 - 32s 399ms/step - loss: 0.0785 - acc: 0.9991 - val_loss: 0.3904 - val_acc: 0.9320
Epoch 3061/5000
80/80 - 32s 400ms/step - loss: 0.0783 - acc: 0.9992 - val_loss: 0.3901 - val_acc: 0.9324
Epoch 3062/5000
80/80 - 32s 399ms/step - loss: 0.0779 - acc: 0.9995 - val_loss: 0.3888 - val_acc: 0.9322
Epoch 3063/5000
80/80 - 32s 400ms/step - loss: 0.0783 - acc: 0.9993 - val_loss: 0.3861 - val_acc: 0.9325
Epoch 3064/5000
80/80 - 32s 400ms/step - loss: 0.0779 - acc: 0.9993 - val_loss: 0.3867 - val_acc: 0.9321
Epoch 3065/5000
80/80 - 32s 399ms/step - loss: 0.0781 - acc: 0.9993 - val_loss: 0.3863 - val_acc: 0.9322
Epoch 3066/5000
80/80 - 32s 400ms/step - loss: 0.0785 - acc: 0.9991 - val_loss: 0.3847 - val_acc: 0.9322
Epoch 3067/5000
80/80 - 32s 399ms/step - loss: 0.0785 - acc: 0.9991 - val_loss: 0.3853 - val_acc: 0.9319
Epoch 3068/5000
80/80 - 32s 400ms/step - loss: 0.0778 - acc: 0.9993 - val_loss: 0.3843 - val_acc: 0.9317
Epoch 3069/5000
80/80 - 32s 399ms/step - loss: 0.0780 - acc: 0.9993 - val_loss: 0.3830 - val_acc: 0.9318
Epoch 3070/5000
80/80 - 32s 400ms/step - loss: 0.0780 - acc: 0.9993 - val_loss: 0.3841 - val_acc: 0.9317
Epoch 3071/5000
80/80 - 32s 400ms/step - loss: 0.0777 - acc: 0.9994 - val_loss: 0.3858 - val_acc: 0.9317
Epoch 3072/5000
80/80 - 32s 399ms/step - loss: 0.0780 - acc: 0.9990 - val_loss: 0.3869 - val_acc: 0.9318
Epoch 3073/5000
80/80 - 32s 399ms/step - loss: 0.0783 - acc: 0.9991 - val_loss: 0.3884 - val_acc: 0.9320
Epoch 3074/5000
80/80 - 32s 400ms/step - loss: 0.0776 - acc: 0.9994 - val_loss: 0.3898 - val_acc: 0.9320
Epoch 3075/5000
80/80 - 32s 399ms/step - loss: 0.0777 - acc: 0.9994 - val_loss: 0.3903 - val_acc: 0.9321
Epoch 3076/5000
80/80 - 32s 399ms/step - loss: 0.0784 - acc: 0.9992 - val_loss: 0.3896 - val_acc: 0.9320
Epoch 3077/5000
80/80 - 32s 400ms/step - loss: 0.0774 - acc: 0.9993 - val_loss: 0.3913 - val_acc: 0.9311
Epoch 3078/5000
80/80 - 32s 403ms/step - loss: 0.0776 - acc: 0.9994 - val_loss: 0.3905 - val_acc: 0.9317
Epoch 3079/5000
80/80 - 32s 400ms/step - loss: 0.0779 - acc: 0.9991 - val_loss: 0.3911 - val_acc: 0.9309
Epoch 3080/5000
80/80 - 32s 399ms/step - loss: 0.0780 - acc: 0.9991 - val_loss: 0.3926 - val_acc: 0.9318
Epoch 3081/5000
80/80 - 32s 400ms/step - loss: 0.0779 - acc: 0.9991 - val_loss: 0.3899 - val_acc: 0.9317
Epoch 3082/5000
80/80 - 32s 400ms/step - loss: 0.0775 - acc: 0.9994 - val_loss: 0.3904 - val_acc: 0.9313
Epoch 3083/5000
80/80 - 32s 400ms/step - loss: 0.0774 - acc: 0.9993 - val_loss: 0.3914 - val_acc: 0.9317
Epoch 3084/5000
80/80 - 32s 400ms/step - loss: 0.0780 - acc: 0.9991 - val_loss: 0.3894 - val_acc: 0.9310
Epoch 3085/5000
80/80 - 32s 400ms/step - loss: 0.0771 - acc: 0.9995 - val_loss: 0.3894 - val_acc: 0.9316
Epoch 3086/5000
80/80 - 32s 400ms/step - loss: 0.0774 - acc: 0.9993 - val_loss: 0.3907 - val_acc: 0.9314
Epoch 3087/5000
80/80 - 32s 400ms/step - loss: 0.0773 - acc: 0.9993 - val_loss: 0.3931 - val_acc: 0.9309
Epoch 3088/5000
80/80 - 32s 400ms/step - loss: 0.0768 - acc: 0.9996 - val_loss: 0.3946 - val_acc: 0.9305
Epoch 3089/5000
80/80 - 32s 399ms/step - loss: 0.0772 - acc: 0.9994 - val_loss: 0.3949 - val_acc: 0.9299
Epoch 3090/5000
80/80 - 32s 400ms/step - loss: 0.0776 - acc: 0.9993 - val_loss: 0.3971 - val_acc: 0.9312
Epoch 3091/5000
80/80 - 32s 400ms/step - loss: 0.0771 - acc: 0.9995 - val_loss: 0.3960 - val_acc: 0.9310
Epoch 3092/5000
80/80 - 32s 400ms/step - loss: 0.0773 - acc: 0.9994 - val_loss: 0.3976 - val_acc: 0.9300
Epoch 3093/5000
80/80 - 32s 400ms/step - loss: 0.0773 - acc: 0.9993 - val_loss: 0.3980 - val_acc: 0.9296
Epoch 3094/5000
80/80 - 32s 400ms/step - loss: 0.0772 - acc: 0.9994 - val_loss: 0.3988 - val_acc: 0.9306
Epoch 3095/5000
80/80 - 32s 400ms/step - loss: 0.0772 - acc: 0.9993 - val_loss: 0.4012 - val_acc: 0.9295
Epoch 3096/5000
80/80 - 32s 400ms/step - loss: 0.0768 - acc: 0.9995 - val_loss: 0.3996 - val_acc: 0.9296
Epoch 3097/5000
80/80 - 32s 400ms/step - loss: 0.0773 - acc: 0.9993 - val_loss: 0.3985 - val_acc: 0.9294
Epoch 3098/5000
80/80 - 32s 401ms/step - loss: 0.0773 - acc: 0.9993 - val_loss: 0.3982 - val_acc: 0.9299
Epoch 3099/5000
80/80 - 32s 400ms/step - loss: 0.0774 - acc: 0.9992 - val_loss: 0.3972 - val_acc: 0.9299
Epoch 3100/5000
80/80 - 32s 400ms/step - loss: 0.0771 - acc: 0.9994 - val_loss: 0.3970 - val_acc: 0.9306
Epoch 3101/5000
80/80 - 32s 400ms/step - loss: 0.0771 - acc: 0.9993 - val_loss: 0.3939 - val_acc: 0.9304
Epoch 3102/5000
80/80 - 32s 400ms/step - loss: 0.0772 - acc: 0.9994 - val_loss: 0.3938 - val_acc: 0.9311
Epoch 3103/5000
80/80 - 32s 400ms/step - loss: 0.0770 - acc: 0.9993 - val_loss: 0.3923 - val_acc: 0.9308
Epoch 3104/5000
80/80 - 32s 400ms/step - loss: 0.0768 - acc: 0.9993 - val_loss: 0.3923 - val_acc: 0.9313
Epoch 3105/5000
80/80 - 32s 400ms/step - loss: 0.0771 - acc: 0.9993 - val_loss: 0.3946 - val_acc: 0.9304
Epoch 3106/5000
80/80 - 32s 399ms/step - loss: 0.0766 - acc: 0.9995 - val_loss: 0.3959 - val_acc: 0.9296
Epoch 3107/5000
80/80 - 32s 399ms/step - loss: 0.0774 - acc: 0.9990 - val_loss: 0.3940 - val_acc: 0.9304
Epoch 3108/5000
80/80 - 32s 399ms/step - loss: 0.0767 - acc: 0.9994 - val_loss: 0.3944 - val_acc: 0.9307
Epoch 3109/5000
80/80 - 32s 399ms/step - loss: 0.0771 - acc: 0.9992 - val_loss: 0.3942 - val_acc: 0.9296
Epoch 3110/5000
80/80 - 32s 400ms/step - loss: 0.0760 - acc: 0.9996 - val_loss: 0.3949 - val_acc: 0.9303
Epoch 3111/5000
80/80 - 32s 399ms/step - loss: 0.0767 - acc: 0.9994 - val_loss: 0.3933 - val_acc: 0.9306
Epoch 3112/5000
80/80 - 32s 399ms/step - loss: 0.0768 - acc: 0.9993 - val_loss: 0.3945 - val_acc: 0.9310
Epoch 3113/5000
80/80 - 32s 399ms/step - loss: 0.0768 - acc: 0.9993 - val_loss: 0.3940 - val_acc: 0.9314
Epoch 3114/5000
80/80 - 32s 399ms/step - loss: 0.0773 - acc: 0.9993 - val_loss: 0.3902 - val_acc: 0.9315
Epoch 3115/5000
80/80 - 32s 399ms/step - loss: 0.0763 - acc: 0.9995 - val_loss: 0.3896 - val_acc: 0.9325
Epoch 3116/5000
80/80 - 32s 400ms/step - loss: 0.0763 - acc: 0.9995 - val_loss: 0.3923 - val_acc: 0.9328
Epoch 3117/5000
80/80 - 32s 400ms/step - loss: 0.0764 - acc: 0.9994 - val_loss: 0.3917 - val_acc: 0.9330
Epoch 3118/5000
80/80 - 32s 400ms/step - loss: 0.0768 - acc: 0.9994 - val_loss: 0.3927 - val_acc: 0.9325
Epoch 3119/5000
80/80 - 32s 400ms/step - loss: 0.0763 - acc: 0.9994 - val_loss: 0.3945 - val_acc: 0.9310
Epoch 3120/5000
80/80 - 32s 399ms/step - loss: 0.0771 - acc: 0.9993 - val_loss: 0.3959 - val_acc: 0.9309
Epoch 3121/5000
80/80 - 32s 399ms/step - loss: 0.0767 - acc: 0.9994 - val_loss: 0.3965 - val_acc: 0.9311
Epoch 3122/5000
80/80 - 32s 399ms/step - loss: 0.0766 - acc: 0.9994 - val_loss: 0.3950 - val_acc: 0.9312
Epoch 3123/5000
80/80 - 32s 400ms/step - loss: 0.0764 - acc: 0.9994 - val_loss: 0.3932 - val_acc: 0.9316
Epoch 3124/5000
80/80 - 32s 400ms/step - loss: 0.0764 - acc: 0.9994 - val_loss: 0.3934 - val_acc: 0.9309
Epoch 3125/5000
80/80 - 32s 400ms/step - loss: 0.0767 - acc: 0.9994 - val_loss: 0.3906 - val_acc: 0.9316
Epoch 3126/5000
80/80 - 32s 400ms/step - loss: 0.0762 - acc: 0.9995 - val_loss: 0.3904 - val_acc: 0.9318
Epoch 3127/5000
80/80 - 32s 399ms/step - loss: 0.0762 - acc: 0.9994 - val_loss: 0.3919 - val_acc: 0.9312
Epoch 3128/5000
80/80 - 32s 399ms/step - loss: 0.0764 - acc: 0.9993 - val_loss: 0.3937 - val_acc: 0.9309
Epoch 3129/5000
80/80 - 32s 400ms/step - loss: 0.0762 - acc: 0.9995 - val_loss: 0.3957 - val_acc: 0.9301
Epoch 3130/5000
80/80 - 32s 400ms/step - loss: 0.0765 - acc: 0.9992 - val_loss: 0.3955 - val_acc: 0.9303
Epoch 3131/5000
80/80 - 32s 399ms/step - loss: 0.0764 - acc: 0.9993 - val_loss: 0.3974 - val_acc: 0.9298
Epoch 3132/5000
80/80 - 32s 400ms/step - loss: 0.0764 - acc: 0.9993 - val_loss: 0.3986 - val_acc: 0.9291
Epoch 3133/5000
80/80 - 32s 400ms/step - loss: 0.0759 - acc: 0.9995 - val_loss: 0.3973 - val_acc: 0.9287
Epoch 3134/5000
80/80 - 32s 399ms/step - loss: 0.0761 - acc: 0.9994 - val_loss: 0.3968 - val_acc: 0.9302
Epoch 3135/5000
80/80 - 32s 399ms/step - loss: 0.0761 - acc: 0.9994 - val_loss: 0.3973 - val_acc: 0.9292
Epoch 3136/5000
80/80 - 32s 399ms/step - loss: 0.0761 - acc: 0.9994 - val_loss: 0.3970 - val_acc: 0.9290
Epoch 3137/5000
80/80 - 32s 399ms/step - loss: 0.0763 - acc: 0.9994 - val_loss: 0.3973 - val_acc: 0.9297
Epoch 3138/5000
80/80 - 32s 399ms/step - loss: 0.0763 - acc: 0.9994 - val_loss: 0.3962 - val_acc: 0.9303
...
Epoch 3532/5000
80/80 - 32s 398ms/step - loss: 0.0682 - acc: 0.9995 - val_loss: 0.4004 - val_acc: 0.9325
Epoch 3533/5000
80/80 - 32s 398ms/step - loss: 0.0683 - acc: 0.9996 - val_loss: 0.4012 - val_acc: 0.9320
Epoch 3534/5000
80/80 - 32s 398ms/step - loss: 0.0681 - acc: 0.9996 - val_loss: 0.4006 - val_acc: 0.9320
Epoch 3535/5000
80/80 - 32s 398ms/step - loss: 0.0678 - acc: 0.9996 - val_loss: 0.4026 - val_acc: 0.9308
Epoch 3536/5000
80/80 - 32s 398ms/step - loss: 0.0675 - acc: 0.9997 - val_loss: 0.4037 - val_acc: 0.9306
Epoch 3537/5000
80/80 - 32s 398ms/step - loss: 0.0683 - acc: 0.9995 - val_loss: 0.4056 - val_acc: 0.9296
Epoch 3538/5000
80/80 - 32s 398ms/step - loss: 0.0684 - acc: 0.9995 - val_loss: 0.4053 - val_acc: 0.9313
Epoch 3539/5000
80/80 - 32s 400ms/step - loss: 0.0682 - acc: 0.9995 - val_loss: 0.4037 - val_acc: 0.9305
Epoch 3540/5000
15/80 [====>.........................] - ETA: 24s - loss: 0.0678 - acc: 0.9997Traceback (most recent call last):

The run was interrupted before finishing. Training accuracy is already close to 100%, while test accuracy keeps oscillating around 93%; the model appears to have entered the overfitting stage, so further training is unlikely to help.

How can this overfitting be suppressed? The data-augmentation stage already applies quite a few transforms, so the next step might be to reduce the network size.
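Besides shrinking the network, patience-based early stopping would have avoided wasting the remaining ~1500 epochs once val_loss stopped improving, without requiring a manual interruption. Keras provides this out of the box as `keras.callbacks.EarlyStopping`; the sketch below reimplements the core logic in plain Python purely to illustrate the mechanism (the class name, `patience=10`, and the toy loss curve are illustrative assumptions, not part of the original script):

```python
class EarlyStopping:
    """Stop training when a minimized metric (e.g. val_loss) stops improving.

    Mirrors the core behaviour of keras.callbacks.EarlyStopping:
    wait `patience` epochs for an improvement of at least `min_delta`
    before signalling a stop.
    """
    def __init__(self, patience=10, min_delta=1e-4):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.wait = 0

    def should_stop(self, val_loss):
        if val_loss < self.best - self.min_delta:
            self.best = val_loss   # improvement: remember it, reset counter
            self.wait = 0
        else:
            self.wait += 1         # no meaningful improvement this epoch
        return self.wait >= self.patience


# Toy history: val_loss falls for 60 epochs, then plateaus (as in this run,
# where val_loss flattened around 0.39-0.40 long before epoch 3540).
stopper = EarlyStopping(patience=10)
stopped_at = None
for epoch in range(200):
    loss = 1.0 - 0.01 * min(epoch, 60)
    if stopper.should_stop(loss):
        stopped_at = epoch
        break
print(stopped_at)  # -> 70: ten epochs after the last improvement at epoch 60
```

In the actual script, the equivalent would be passing `callbacks=[keras.callbacks.EarlyStopping(monitor='val_loss', patience=...)]` to `fit_generator`, which also supports `restore_best_weights=True` to roll back to the best epoch.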

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458, Date of Publication: 13 February 2020

https://ieeexplore.ieee.org/document/8998530

This article is a repost. In case of infringement, please contact cloudcommunity@tencent.com for removal.

