
[Harbin Institute of Technology] Dynamic ReLU: Adaptively Parametric ReLU with Keras Code (Tuning Log 24), ~95.80% on CIFAR-10


This article introduces a dynamic ReLU activation function proposed by a team at Harbin Institute of Technology: the adaptively parametric ReLU (APReLU). Originally developed for fault diagnosis based on one-dimensional vibration signals, it lets every sample learn its own set of ReLU parameters. The paper was submitted to IEEE Transactions on Industrial Electronics on 3 May 2019, accepted on 24 January 2020, and published on the IEEE website on 13 February 2020.

Building on Tuning Log 23, this experiment increases the number of convolution kernels (at least 64, at most 256) and continues to test the deep residual network with the adaptively parametric ReLU activation function on the CIFAR-10 dataset.

The adaptively parametric ReLU activation function is placed after the second convolutional layer of each residual block. Its basic principle is illustrated below:

Figure: Adaptively parametric ReLU, a dynamic ReLU activation function
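In brief (a sketch consistent with the aprelu() function in the code below, writing $x^{+}=\max(x,0)$ and $x^{-}=\min(x,0)$), APReLU keeps the positive part of the input unchanged and multiplies the negative part by a per-sample, per-channel coefficient $\alpha \in (0,1)$ that is computed from the input itself:

$$y = x^{+} + \alpha \odot x^{-}, \qquad \alpha = \sigma\big(\mathrm{BN}(W_2\,\mathrm{ReLU}(\mathrm{BN}(W_1\,[\mathrm{GAP}(x^{-});\,\mathrm{GAP}(x^{+})])))\big)$$

Here GAP is global average pooling over the spatial dimensions, BN is batch normalization, $W_1$ and $W_2$ stand for the two Dense layers, and $\sigma$ is the sigmoid. Because $\alpha$ is derived from each individual input, every sample ends up with its own ReLU parameters, which is what makes the activation "dynamic".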

Keras code:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply by 0.1 every 150 epochs
def scheduler(epoch):
    if epoch % 150 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs,zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels//16, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,channels))(scales)
    # apply the parametric ReLU: positive part plus the scaled negative part
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = Activation('relu')(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = Activation('relu')(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = aprelu(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero-padding to match the number of channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(64, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 20, 64, downsample=False)
net = residual_block(net, 1,  128, downsample=True)
net = residual_block(net, 19, 128, downsample=False)
net = residual_block(net, 1,  256, downsample=True)
net = residual_block(net, 19, 256, downsample=False)
net = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range (degrees, 0 to 30)
    rotation_range=30,
    # Range for random zoom
    zoom_range = 0.2,
    # shear angle in counter-clockwise direction in degrees
    shear_range = 30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=500, 
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])

Experimental results:

Using TensorFlow backend.
x_train shape: (50000, 32, 32, 3)
50000 train samples
10000 test samples
Epoch 1/500
281s 562ms/step - loss: 9.6683 - acc: 0.2858 - val_loss: 8.5491 - val_acc: 0.4224
Epoch 2/500
236s 471ms/step - loss: 7.8652 - acc: 0.4406 - val_loss: 7.0180 - val_acc: 0.5270
Epoch 3/500
235s 471ms/step - loss: 6.5241 - acc: 0.5264 - val_loss: 5.7927 - val_acc: 0.6159
Epoch 4/500
235s 471ms/step - loss: 5.4217 - acc: 0.6013 - val_loss: 4.7898 - val_acc: 0.6878
Epoch 5/500
235s 471ms/step - loss: 4.5434 - acc: 0.6542 - val_loss: 4.0362 - val_acc: 0.7256
Epoch 6/500
235s 470ms/step - loss: 3.8297 - acc: 0.6947 - val_loss: 3.3928 - val_acc: 0.7654
Epoch 7/500
235s 471ms/step - loss: 3.2680 - acc: 0.7257 - val_loss: 2.8972 - val_acc: 0.7805
Epoch 8/500
236s 471ms/step - loss: 2.8023 - acc: 0.7493 - val_loss: 2.4718 - val_acc: 0.8117
Epoch 9/500
235s 471ms/step - loss: 2.4351 - acc: 0.7652 - val_loss: 2.1518 - val_acc: 0.8216
Epoch 10/500
235s 470ms/step - loss: 2.1298 - acc: 0.7822 - val_loss: 1.8664 - val_acc: 0.8355
Epoch 11/500
235s 470ms/step - loss: 1.8768 - acc: 0.7961 - val_loss: 1.6576 - val_acc: 0.8407
Epoch 12/500
235s 470ms/step - loss: 1.6745 - acc: 0.8071 - val_loss: 1.4888 - val_acc: 0.8456
Epoch 13/500
235s 471ms/step - loss: 1.5155 - acc: 0.8139 - val_loss: 1.3255 - val_acc: 0.8598
Epoch 14/500
235s 471ms/step - loss: 1.3782 - acc: 0.8230 - val_loss: 1.2249 - val_acc: 0.8616
Epoch 15/500
235s 471ms/step - loss: 1.2630 - acc: 0.8293 - val_loss: 1.1236 - val_acc: 0.8655
Epoch 16/500
235s 471ms/step - loss: 1.1829 - acc: 0.8342 - val_loss: 1.0384 - val_acc: 0.8768
Epoch 17/500
235s 470ms/step - loss: 1.1094 - acc: 0.8389 - val_loss: 0.9748 - val_acc: 0.8752
Epoch 18/500
236s 471ms/step - loss: 1.0510 - acc: 0.8448 - val_loss: 0.9660 - val_acc: 0.8697
Epoch 19/500
235s 471ms/step - loss: 1.0037 - acc: 0.8472 - val_loss: 0.9055 - val_acc: 0.8760
Epoch 20/500
235s 471ms/step - loss: 0.9615 - acc: 0.8520 - val_loss: 0.8935 - val_acc: 0.8711
Epoch 21/500
235s 471ms/step - loss: 0.9345 - acc: 0.8545 - val_loss: 0.8621 - val_acc: 0.8743
Epoch 22/500
235s 470ms/step - loss: 0.9044 - acc: 0.8589 - val_loss: 0.8440 - val_acc: 0.8776
Epoch 23/500
235s 470ms/step - loss: 0.8816 - acc: 0.8625 - val_loss: 0.8310 - val_acc: 0.8792
Epoch 24/500
235s 470ms/step - loss: 0.8640 - acc: 0.8659 - val_loss: 0.8157 - val_acc: 0.8820
Epoch 25/500
235s 470ms/step - loss: 0.8446 - acc: 0.8696 - val_loss: 0.7921 - val_acc: 0.8873
Epoch 26/500
235s 470ms/step - loss: 0.8283 - acc: 0.8716 - val_loss: 0.7739 - val_acc: 0.8934
Epoch 27/500
235s 470ms/step - loss: 0.8212 - acc: 0.8720 - val_loss: 0.7726 - val_acc: 0.8885
Epoch 28/500
235s 471ms/step - loss: 0.8089 - acc: 0.8743 - val_loss: 0.7783 - val_acc: 0.8855
Epoch 29/500
235s 470ms/step - loss: 0.7970 - acc: 0.8775 - val_loss: 0.7350 - val_acc: 0.8988
Epoch 30/500
235s 470ms/step - loss: 0.7911 - acc: 0.8792 - val_loss: 0.7695 - val_acc: 0.8860
Epoch 31/500
235s 470ms/step - loss: 0.7846 - acc: 0.8802 - val_loss: 0.7392 - val_acc: 0.8989
Epoch 32/500
235s 471ms/step - loss: 0.7784 - acc: 0.8814 - val_loss: 0.7618 - val_acc: 0.8888
Epoch 33/500
235s 470ms/step - loss: 0.7724 - acc: 0.8842 - val_loss: 0.7547 - val_acc: 0.8937
Epoch 34/500
235s 470ms/step - loss: 0.7680 - acc: 0.8856 - val_loss: 0.7400 - val_acc: 0.8941
Epoch 35/500
235s 470ms/step - loss: 0.7646 - acc: 0.8865 - val_loss: 0.7079 - val_acc: 0.9096
Epoch 36/500
235s 470ms/step - loss: 0.7567 - acc: 0.8889 - val_loss: 0.7297 - val_acc: 0.8991
Epoch 37/500
235s 471ms/step - loss: 0.7518 - acc: 0.8920 - val_loss: 0.7265 - val_acc: 0.9011
Epoch 38/500
235s 470ms/step - loss: 0.7499 - acc: 0.8911 - val_loss: 0.7068 - val_acc: 0.9108
Epoch 39/500
235s 470ms/step - loss: 0.7455 - acc: 0.8927 - val_loss: 0.7524 - val_acc: 0.8939
Epoch 40/500
235s 470ms/step - loss: 0.7451 - acc: 0.8926 - val_loss: 0.7293 - val_acc: 0.9007
Epoch 41/500
235s 471ms/step - loss: 0.7434 - acc: 0.8951 - val_loss: 0.6985 - val_acc: 0.9097
Epoch 42/500
235s 470ms/step - loss: 0.7439 - acc: 0.8933 - val_loss: 0.7252 - val_acc: 0.9018
Epoch 43/500
235s 470ms/step - loss: 0.7433 - acc: 0.8952 - val_loss: 0.7304 - val_acc: 0.9006
Epoch 44/500
235s 470ms/step - loss: 0.7393 - acc: 0.8958 - val_loss: 0.6997 - val_acc: 0.9134
Epoch 45/500
235s 470ms/step - loss: 0.7348 - acc: 0.8992 - val_loss: 0.7287 - val_acc: 0.9035
Epoch 46/500
235s 470ms/step - loss: 0.7373 - acc: 0.8976 - val_loss: 0.7235 - val_acc: 0.9036
Epoch 47/500
235s 470ms/step - loss: 0.7382 - acc: 0.8974 - val_loss: 0.7178 - val_acc: 0.9081
Epoch 48/500
235s 470ms/step - loss: 0.7363 - acc: 0.8975 - val_loss: 0.7247 - val_acc: 0.9044
Epoch 49/500
235s 470ms/step - loss: 0.7306 - acc: 0.9009 - val_loss: 0.7328 - val_acc: 0.9006
Epoch 50/500
235s 470ms/step - loss: 0.7356 - acc: 0.9003 - val_loss: 0.7096 - val_acc: 0.9114
Epoch 51/500
235s 470ms/step - loss: 0.7282 - acc: 0.9029 - val_loss: 0.7156 - val_acc: 0.9076
Epoch 52/500
235s 470ms/step - loss: 0.7286 - acc: 0.9014 - val_loss: 0.7233 - val_acc: 0.9046
Epoch 53/500
235s 470ms/step - loss: 0.7304 - acc: 0.9016 - val_loss: 0.7087 - val_acc: 0.9088
Epoch 54/500
235s 470ms/step - loss: 0.7261 - acc: 0.9030 - val_loss: 0.7202 - val_acc: 0.9085
Epoch 55/500
235s 470ms/step - loss: 0.7257 - acc: 0.9034 - val_loss: 0.7138 - val_acc: 0.9095
Epoch 56/500
235s 470ms/step - loss: 0.7230 - acc: 0.9043 - val_loss: 0.7196 - val_acc: 0.9084
Epoch 57/500
235s 470ms/step - loss: 0.7212 - acc: 0.9048 - val_loss: 0.7094 - val_acc: 0.9098
Epoch 58/500
236s 473ms/step - loss: 0.7247 - acc: 0.9037 - val_loss: 0.7177 - val_acc: 0.9101
Epoch 59/500
236s 473ms/step - loss: 0.7183 - acc: 0.9067 - val_loss: 0.7385 - val_acc: 0.9026
Epoch 60/500
236s 472ms/step - loss: 0.7224 - acc: 0.9044 - val_loss: 0.7005 - val_acc: 0.9120
Epoch 61/500
236s 472ms/step - loss: 0.7185 - acc: 0.9050 - val_loss: 0.7287 - val_acc: 0.9067
Epoch 62/500
236s 472ms/step - loss: 0.7237 - acc: 0.9049 - val_loss: 0.6969 - val_acc: 0.9160
Epoch 63/500
236s 472ms/step - loss: 0.7177 - acc: 0.9074 - val_loss: 0.7044 - val_acc: 0.9117
Epoch 64/500
236s 472ms/step - loss: 0.7140 - acc: 0.9096 - val_loss: 0.7135 - val_acc: 0.9089
Epoch 65/500
236s 472ms/step - loss: 0.7120 - acc: 0.9074 - val_loss: 0.7107 - val_acc: 0.9093
Epoch 66/500
236s 472ms/step - loss: 0.7148 - acc: 0.9074 - val_loss: 0.7084 - val_acc: 0.9090
Epoch 67/500
236s 472ms/step - loss: 0.7156 - acc: 0.9081 - val_loss: 0.7086 - val_acc: 0.9132
Epoch 68/500
236s 472ms/step - loss: 0.7180 - acc: 0.9074 - val_loss: 0.7177 - val_acc: 0.9090
Epoch 69/500
237s 473ms/step - loss: 0.7111 - acc: 0.9094 - val_loss: 0.7278 - val_acc: 0.9047
Epoch 70/500
236s 473ms/step - loss: 0.7138 - acc: 0.9093 - val_loss: 0.7179 - val_acc: 0.9090
Epoch 71/500
237s 473ms/step - loss: 0.7165 - acc: 0.9084 - val_loss: 0.7251 - val_acc: 0.9065
Epoch 72/500
236s 473ms/step - loss: 0.7133 - acc: 0.9109 - val_loss: 0.6957 - val_acc: 0.9160
Epoch 73/500
236s 473ms/step - loss: 0.7129 - acc: 0.9106 - val_loss: 0.7008 - val_acc: 0.9154
Epoch 74/500
237s 473ms/step - loss: 0.7109 - acc: 0.9110 - val_loss: 0.7126 - val_acc: 0.9121
Epoch 75/500
236s 473ms/step - loss: 0.7143 - acc: 0.9105 - val_loss: 0.7286 - val_acc: 0.9061
Epoch 76/500
236s 472ms/step - loss: 0.7091 - acc: 0.9125 - val_loss: 0.7024 - val_acc: 0.9149
Epoch 77/500
236s 472ms/step - loss: 0.7129 - acc: 0.9104 - val_loss: 0.7176 - val_acc: 0.9106
Epoch 78/500
236s 472ms/step - loss: 0.7108 - acc: 0.9118 - val_loss: 0.6977 - val_acc: 0.9191
Epoch 79/500
237s 473ms/step - loss: 0.7088 - acc: 0.9123 - val_loss: 0.7253 - val_acc: 0.9069
Epoch 80/500
237s 473ms/step - loss: 0.7138 - acc: 0.9106 - val_loss: 0.7052 - val_acc: 0.9192
Epoch 81/500
236s 472ms/step - loss: 0.7120 - acc: 0.9130 - val_loss: 0.7205 - val_acc: 0.9113
Epoch 82/500
236s 472ms/step - loss: 0.7099 - acc: 0.9129 - val_loss: 0.7249 - val_acc: 0.9120
Epoch 83/500
236s 472ms/step - loss: 0.7076 - acc: 0.9117 - val_loss: 0.7329 - val_acc: 0.9060
Epoch 84/500
236s 472ms/step - loss: 0.7150 - acc: 0.9105 - val_loss: 0.6931 - val_acc: 0.9205
Epoch 85/500
236s 472ms/step - loss: 0.7062 - acc: 0.9159 - val_loss: 0.7189 - val_acc: 0.9107
Epoch 86/500
236s 472ms/step - loss: 0.7111 - acc: 0.9117 - val_loss: 0.6910 - val_acc: 0.9204
Epoch 87/500
236s 472ms/step - loss: 0.7070 - acc: 0.9138 - val_loss: 0.6921 - val_acc: 0.9201
Epoch 88/500
236s 472ms/step - loss: 0.7050 - acc: 0.9142 - val_loss: 0.6977 - val_acc: 0.9186
Epoch 89/500
236s 472ms/step - loss: 0.7056 - acc: 0.9136 - val_loss: 0.7174 - val_acc: 0.9109
Epoch 90/500
236s 472ms/step - loss: 0.7032 - acc: 0.9154 - val_loss: 0.6996 - val_acc: 0.9184
Epoch 91/500
236s 472ms/step - loss: 0.7060 - acc: 0.9139 - val_loss: 0.7090 - val_acc: 0.9143
Epoch 92/500
236s 472ms/step - loss: 0.7066 - acc: 0.9130 - val_loss: 0.7228 - val_acc: 0.9114
Epoch 93/500
236s 473ms/step - loss: 0.7063 - acc: 0.9155 - val_loss: 0.7039 - val_acc: 0.9216
Epoch 94/500
236s 473ms/step - loss: 0.7072 - acc: 0.9140 - val_loss: 0.7116 - val_acc: 0.9150
Epoch 95/500
236s 472ms/step - loss: 0.7096 - acc: 0.9143 - val_loss: 0.7216 - val_acc: 0.9109
Epoch 96/500
236s 473ms/step - loss: 0.7006 - acc: 0.9161 - val_loss: 0.7143 - val_acc: 0.9141
Epoch 97/500
236s 472ms/step - loss: 0.7069 - acc: 0.9142 - val_loss: 0.6954 - val_acc: 0.9224
Epoch 98/500
236s 473ms/step - loss: 0.7064 - acc: 0.9151 - val_loss: 0.7273 - val_acc: 0.9081
Epoch 99/500
236s 472ms/step - loss: 0.7038 - acc: 0.9161 - val_loss: 0.7274 - val_acc: 0.9132
Epoch 100/500
236s 472ms/step - loss: 0.7064 - acc: 0.9162 - val_loss: 0.7092 - val_acc: 0.9159
Epoch 101/500
236s 472ms/step - loss: 0.7011 - acc: 0.9169 - val_loss: 0.7331 - val_acc: 0.9080
Epoch 102/500
236s 472ms/step - loss: 0.7047 - acc: 0.9159 - val_loss: 0.7092 - val_acc: 0.9185
Epoch 103/500
237s 473ms/step - loss: 0.6994 - acc: 0.9166 - val_loss: 0.7029 - val_acc: 0.9180
Epoch 104/500
237s 473ms/step - loss: 0.6976 - acc: 0.9186 - val_loss: 0.7215 - val_acc: 0.9083
Epoch 105/500
236s 473ms/step - loss: 0.7002 - acc: 0.9159 - val_loss: 0.7191 - val_acc: 0.9111
Epoch 106/500
236s 472ms/step - loss: 0.7019 - acc: 0.9152 - val_loss: 0.7217 - val_acc: 0.9138
Epoch 107/500
236s 472ms/step - loss: 0.7078 - acc: 0.9154 - val_loss: 0.6926 - val_acc: 0.9249
Epoch 108/500
236s 472ms/step - loss: 0.7069 - acc: 0.9154 - val_loss: 0.7048 - val_acc: 0.9214
Epoch 109/500
236s 472ms/step - loss: 0.6975 - acc: 0.9182 - val_loss: 0.7130 - val_acc: 0.9130
Epoch 110/500
236s 472ms/step - loss: 0.7010 - acc: 0.9168 - val_loss: 0.7074 - val_acc: 0.9140
Epoch 111/500
236s 472ms/step - loss: 0.7020 - acc: 0.9175 - val_loss: 0.7142 - val_acc: 0.9161
Epoch 112/500
236s 473ms/step - loss: 0.6991 - acc: 0.9179 - val_loss: 0.7238 - val_acc: 0.9075
Epoch 113/500
236s 473ms/step - loss: 0.7022 - acc: 0.9165 - val_loss: 0.7162 - val_acc: 0.9165
Epoch 114/500
236s 473ms/step - loss: 0.7006 - acc: 0.9177 - val_loss: 0.7261 - val_acc: 0.9125
Epoch 115/500
237s 473ms/step - loss: 0.6987 - acc: 0.9184 - val_loss: 0.7110 - val_acc: 0.9121
Epoch 116/500
236s 472ms/step - loss: 0.6984 - acc: 0.9169 - val_loss: 0.7012 - val_acc: 0.9221
Epoch 117/500
236s 472ms/step - loss: 0.7002 - acc: 0.9162 - val_loss: 0.7278 - val_acc: 0.9113
Epoch 118/500
236s 472ms/step - loss: 0.6998 - acc: 0.9181 - val_loss: 0.7352 - val_acc: 0.9079
Epoch 119/500
236s 473ms/step - loss: 0.6989 - acc: 0.9187 - val_loss: 0.7147 - val_acc: 0.9162
Epoch 120/500
237s 473ms/step - loss: 0.7025 - acc: 0.9175 - val_loss: 0.7014 - val_acc: 0.9195
Epoch 121/500
236s 473ms/step - loss: 0.7003 - acc: 0.9183 - val_loss: 0.6987 - val_acc: 0.9177
Epoch 122/500
236s 472ms/step - loss: 0.6996 - acc: 0.9172 - val_loss: 0.7206 - val_acc: 0.9146
Epoch 123/500
236s 472ms/step - loss: 0.7033 - acc: 0.9174 - val_loss: 0.7128 - val_acc: 0.9187
Epoch 124/500
236s 472ms/step - loss: 0.6957 - acc: 0.9194 - val_loss: 0.7079 - val_acc: 0.9177
Epoch 125/500
236s 472ms/step - loss: 0.7028 - acc: 0.9171 - val_loss: 0.7080 - val_acc: 0.9200
Epoch 126/500
236s 472ms/step - loss: 0.7005 - acc: 0.9167 - val_loss: 0.7362 - val_acc: 0.9096
Epoch 127/500
236s 472ms/step - loss: 0.7044 - acc: 0.9182 - val_loss: 0.7139 - val_acc: 0.9164
Epoch 128/500
236s 472ms/step - loss: 0.7031 - acc: 0.9184 - val_loss: 0.7105 - val_acc: 0.9162
Epoch 129/500
236s 472ms/step - loss: 0.6979 - acc: 0.9194 - val_loss: 0.7255 - val_acc: 0.9160
Epoch 130/500
236s 472ms/step - loss: 0.7016 - acc: 0.9192 - val_loss: 0.7252 - val_acc: 0.9150
Epoch 131/500
236s 472ms/step - loss: 0.6991 - acc: 0.9208 - val_loss: 0.7086 - val_acc: 0.9204
Epoch 132/500
236s 472ms/step - loss: 0.7005 - acc: 0.9195 - val_loss: 0.7169 - val_acc: 0.9163
Epoch 133/500
236s 472ms/step - loss: 0.7006 - acc: 0.9191 - val_loss: 0.7040 - val_acc: 0.9202
Epoch 134/500
236s 472ms/step - loss: 0.7009 - acc: 0.9196 - val_loss: 0.7241 - val_acc: 0.9139
Epoch 135/500
237s 473ms/step - loss: 0.6915 - acc: 0.9211 - val_loss: 0.7318 - val_acc: 0.9088
Epoch 136/500
236s 472ms/step - loss: 0.7054 - acc: 0.9172 - val_loss: 0.7155 - val_acc: 0.9157
Epoch 137/500
236s 473ms/step - loss: 0.7006 - acc: 0.9191 - val_loss: 0.7221 - val_acc: 0.9152
Epoch 138/500
236s 472ms/step - loss: 0.6938 - acc: 0.9209 - val_loss: 0.7026 - val_acc: 0.9228
Epoch 139/500
236s 472ms/step - loss: 0.6994 - acc: 0.9189 - val_loss: 0.7087 - val_acc: 0.9200
Epoch 140/500
236s 472ms/step - loss: 0.6990 - acc: 0.9194 - val_loss: 0.7284 - val_acc: 0.9125
Epoch 141/500
236s 472ms/step - loss: 0.7056 - acc: 0.9174 - val_loss: 0.7021 - val_acc: 0.9224
Epoch 142/500
236s 472ms/step - loss: 0.6948 - acc: 0.9199 - val_loss: 0.7238 - val_acc: 0.9201
Epoch 143/500
236s 472ms/step - loss: 0.6957 - acc: 0.9199 - val_loss: 0.7073 - val_acc: 0.9164
Epoch 144/500
236s 472ms/step - loss: 0.6981 - acc: 0.9206 - val_loss: 0.7212 - val_acc: 0.9148
Epoch 145/500
236s 472ms/step - loss: 0.6976 - acc: 0.9212 - val_loss: 0.7079 - val_acc: 0.9204
Epoch 146/500
236s 473ms/step - loss: 0.6993 - acc: 0.9197 - val_loss: 0.7286 - val_acc: 0.9154
Epoch 147/500
236s 472ms/step - loss: 0.6891 - acc: 0.9234 - val_loss: 0.7160 - val_acc: 0.9175
Epoch 148/500
236s 472ms/step - loss: 0.6978 - acc: 0.9203 - val_loss: 0.7208 - val_acc: 0.9138
Epoch 149/500
236s 472ms/step - loss: 0.6975 - acc: 0.9196 - val_loss: 0.7227 - val_acc: 0.9138
Epoch 150/500
236s 472ms/step - loss: 0.6930 - acc: 0.9217 - val_loss: 0.7080 - val_acc: 0.9171
Epoch 151/500
lr changed to 0.010000000149011612
236s 472ms/step - loss: 0.5908 - acc: 0.9565 - val_loss: 0.6227 - val_acc: 0.9453
Epoch 152/500
236s 472ms/step - loss: 0.5375 - acc: 0.9734 - val_loss: 0.6086 - val_acc: 0.9479
Epoch 153/500
236s 472ms/step - loss: 0.5165 - acc: 0.9768 - val_loss: 0.5996 - val_acc: 0.9481
Epoch 154/500
236s 472ms/step - loss: 0.5005 - acc: 0.9798 - val_loss: 0.5907 - val_acc: 0.9494
Epoch 155/500
236s 472ms/step - loss: 0.4879 - acc: 0.9815 - val_loss: 0.5831 - val_acc: 0.9512
Epoch 156/500
236s 472ms/step - loss: 0.4739 - acc: 0.9842 - val_loss: 0.5746 - val_acc: 0.9508
Epoch 157/500
236s 472ms/step - loss: 0.4628 - acc: 0.9846 - val_loss: 0.5674 - val_acc: 0.9516
Epoch 158/500
236s 472ms/step - loss: 0.4513 - acc: 0.9866 - val_loss: 0.5614 - val_acc: 0.9513
Epoch 159/500
236s 472ms/step - loss: 0.4419 - acc: 0.9882 - val_loss: 0.5624 - val_acc: 0.9510
Epoch 160/500
236s 472ms/step - loss: 0.4331 - acc: 0.9874 - val_loss: 0.5522 - val_acc: 0.9516
Epoch 161/500
236s 472ms/step - loss: 0.4238 - acc: 0.9895 - val_loss: 0.5444 - val_acc: 0.9517
Epoch 162/500
236s 473ms/step - loss: 0.4177 - acc: 0.9885 - val_loss: 0.5420 - val_acc: 0.9517
Epoch 163/500
236s 472ms/step - loss: 0.4088 - acc: 0.9897 - val_loss: 0.5349 - val_acc: 0.9511
Epoch 164/500
236s 472ms/step - loss: 0.4013 - acc: 0.9897 - val_loss: 0.5262 - val_acc: 0.9537
Epoch 165/500
236s 472ms/step - loss: 0.3953 - acc: 0.9895 - val_loss: 0.5197 - val_acc: 0.9520
Epoch 166/500
236s 472ms/step - loss: 0.3872 - acc: 0.9908 - val_loss: 0.5165 - val_acc: 0.9517
Epoch 167/500
236s 472ms/step - loss: 0.3805 - acc: 0.9911 - val_loss: 0.5151 - val_acc: 0.9519
Epoch 168/500
236s 472ms/step - loss: 0.3732 - acc: 0.9910 - val_loss: 0.5118 - val_acc: 0.9506
Epoch 169/500
236s 472ms/step - loss: 0.3677 - acc: 0.9913 - val_loss: 0.5079 - val_acc: 0.9497
Epoch 170/500
236s 472ms/step - loss: 0.3603 - acc: 0.9913 - val_loss: 0.4986 - val_acc: 0.9512
Epoch 171/500
236s 472ms/step - loss: 0.3551 - acc: 0.9920 - val_loss: 0.4933 - val_acc: 0.9517
Epoch 172/500
236s 472ms/step - loss: 0.3506 - acc: 0.9916 - val_loss: 0.4847 - val_acc: 0.9527
Epoch 173/500
236s 473ms/step - loss: 0.3436 - acc: 0.9921 - val_loss: 0.4808 - val_acc: 0.9528
Epoch 174/500
236s 472ms/step - loss: 0.3374 - acc: 0.9922 - val_loss: 0.4800 - val_acc: 0.9528
Epoch 175/500
236s 472ms/step - loss: 0.3324 - acc: 0.9924 - val_loss: 0.4743 - val_acc: 0.9512
Epoch 176/500
236s 472ms/step - loss: 0.3277 - acc: 0.9925 - val_loss: 0.4741 - val_acc: 0.9505
Epoch 177/500
236s 472ms/step - loss: 0.3222 - acc: 0.9929 - val_loss: 0.4663 - val_acc: 0.9503
Epoch 178/500
236s 472ms/step - loss: 0.3179 - acc: 0.9926 - val_loss: 0.4694 - val_acc: 0.9497
Epoch 179/500
236s 472ms/step - loss: 0.3148 - acc: 0.9922 - val_loss: 0.4584 - val_acc: 0.9502
Epoch 180/500
236s 472ms/step - loss: 0.3082 - acc: 0.9938 - val_loss: 0.4647 - val_acc: 0.9484
Epoch 181/500
236s 471ms/step - loss: 0.3024 - acc: 0.9936 - val_loss: 0.4562 - val_acc: 0.9503
Epoch 182/500
236s 472ms/step - loss: 0.2996 - acc: 0.9931 - val_loss: 0.4528 - val_acc: 0.9518
Epoch 183/500
236s 472ms/step - loss: 0.2949 - acc: 0.9935 - val_loss: 0.4555 - val_acc: 0.9503
Epoch 184/500
236s 472ms/step - loss: 0.2916 - acc: 0.9932 - val_loss: 0.4519 - val_acc: 0.9481
Epoch 185/500
236s 472ms/step - loss: 0.2886 - acc: 0.9929 - val_loss: 0.4428 - val_acc: 0.9491
Epoch 186/500
236s 472ms/step - loss: 0.2857 - acc: 0.9929 - val_loss: 0.4383 - val_acc: 0.9495
Epoch 187/500
236s 472ms/step - loss: 0.2816 - acc: 0.9928 - val_loss: 0.4380 - val_acc: 0.9479
Epoch 188/500
236s 472ms/step - loss: 0.2764 - acc: 0.9935 - val_loss: 0.4373 - val_acc: 0.9479
Epoch 189/500
236s 472ms/step - loss: 0.2742 - acc: 0.9930 - val_loss: 0.4204 - val_acc: 0.9510
Epoch 190/500
236s 472ms/step - loss: 0.2697 - acc: 0.9935 - val_loss: 0.4241 - val_acc: 0.9500
Epoch 191/500
236s 472ms/step - loss: 0.2642 - acc: 0.9940 - val_loss: 0.4287 - val_acc: 0.9486
Epoch 192/500
236s 472ms/step - loss: 0.2631 - acc: 0.9932 - val_loss: 0.4230 - val_acc: 0.9478
Epoch 193/500
236s 472ms/step - loss: 0.2614 - acc: 0.9929 - val_loss: 0.4182 - val_acc: 0.9470
Epoch 194/500
236s 472ms/step - loss: 0.2577 - acc: 0.9929 - val_loss: 0.4143 - val_acc: 0.9465
Epoch 195/500
236s 472ms/step - loss: 0.2551 - acc: 0.9933 - val_loss: 0.4124 - val_acc: 0.9488
Epoch 196/500
236s 472ms/step - loss: 0.2522 - acc: 0.9933 - val_loss: 0.4111 - val_acc: 0.9469
Epoch 197/500
236s 472ms/step - loss: 0.2487 - acc: 0.9930 - val_loss: 0.4062 - val_acc: 0.9515
Epoch 198/500
236s 472ms/step - loss: 0.2474 - acc: 0.9926 - val_loss: 0.4036 - val_acc: 0.9505
Epoch 199/500
236s 472ms/step - loss: 0.2419 - acc: 0.9943 - val_loss: 0.3958 - val_acc: 0.9522
Epoch 200/500
236s 472ms/step - loss: 0.2393 - acc: 0.9939 - val_loss: 0.4007 - val_acc: 0.9509
Epoch 201/500
236s 472ms/step - loss: 0.2399 - acc: 0.9927 - val_loss: 0.3961 - val_acc: 0.9475
Epoch 202/500
236s 472ms/step - loss: 0.2385 - acc: 0.9922 - val_loss: 0.3966 - val_acc: 0.9486
Epoch 203/500
236s 471ms/step - loss: 0.2354 - acc: 0.9926 - val_loss: 0.3975 - val_acc: 0.9502
Epoch 204/500
236s 472ms/step - loss: 0.2314 - acc: 0.9935 - val_loss: 0.3895 - val_acc: 0.9510
Epoch 205/500
236s 472ms/step - loss: 0.2286 - acc: 0.9939 - val_loss: 0.3812 - val_acc: 0.9511
Epoch 206/500
236s 471ms/step - loss: 0.2286 - acc: 0.9926 - val_loss: 0.3841 - val_acc: 0.9500
Epoch 207/500
236s 472ms/step - loss: 0.2286 - acc: 0.9919 - val_loss: 0.3856 - val_acc: 0.9480
Epoch 208/500
236s 472ms/step - loss: 0.2249 - acc: 0.9928 - val_loss: 0.3721 - val_acc: 0.9509
Epoch 209/500
236s 471ms/step - loss: 0.2228 - acc: 0.9927 - val_loss: 0.3787 - val_acc: 0.9504
Epoch 210/500
236s 472ms/step - loss: 0.2222 - acc: 0.9925 - val_loss: 0.3797 - val_acc: 0.9483
Epoch 211/500
236s 471ms/step - loss: 0.2223 - acc: 0.9917 - val_loss: 0.3699 - val_acc: 0.9516
Epoch 212/500
236s 472ms/step - loss: 0.2185 - acc: 0.9925 - val_loss: 0.3764 - val_acc: 0.9500
Epoch 213/500
236s 472ms/step - loss: 0.2165 - acc: 0.9924 - val_loss: 0.3842 - val_acc: 0.9466
Epoch 214/500
236s 472ms/step - loss: 0.2151 - acc: 0.9925 - val_loss: 0.3631 - val_acc: 0.9517
Epoch 215/500
236s 471ms/step - loss: 0.2148 - acc: 0.9921 - val_loss: 0.3738 - val_acc: 0.9485
Epoch 216/500
236s 472ms/step - loss: 0.2137 - acc: 0.9920 - val_loss: 0.3658 - val_acc: 0.9489
Epoch 217/500
236s 472ms/step - loss: 0.2141 - acc: 0.9916 - val_loss: 0.3644 - val_acc: 0.9492
Epoch 218/500
236s 472ms/step - loss: 0.2071 - acc: 0.9929 - val_loss: 0.3660 - val_acc: 0.9489
Epoch 219/500
236s 472ms/step - loss: 0.2108 - acc: 0.9912 - val_loss: 0.3556 - val_acc: 0.9510
Epoch 220/500
236s 471ms/step - loss: 0.2068 - acc: 0.9921 - val_loss: 0.3661 - val_acc: 0.9492
Epoch 221/500
236s 472ms/step - loss: 0.2061 - acc: 0.9921 - val_loss: 0.3592 - val_acc: 0.9478
Epoch 222/500
236s 472ms/step - loss: 0.2073 - acc: 0.9908 - val_loss: 0.3573 - val_acc: 0.9485
Epoch 223/500
236s 472ms/step - loss: 0.2066 - acc: 0.9913 - val_loss: 0.3703 - val_acc: 0.9447
Epoch 224/500
236s 472ms/step - loss: 0.2075 - acc: 0.9903 - val_loss: 0.3552 - val_acc: 0.9499
Epoch 225/500
236s 471ms/step - loss: 0.2015 - acc: 0.9922 - val_loss: 0.3700 - val_acc: 0.9470
Epoch 226/500
236s 472ms/step - loss: 0.2033 - acc: 0.9914 - val_loss: 0.3646 - val_acc: 0.9483
Epoch 227/500
236s 472ms/step - loss: 0.2039 - acc: 0.9908 - val_loss: 0.3619 - val_acc: 0.9483
Epoch 228/500
236s 472ms/step - loss: 0.1979 - acc: 0.9927 - val_loss: 0.3593 - val_acc: 0.9490
Epoch 229/500
236s 472ms/step - loss: 0.1987 - acc: 0.9915 - val_loss: 0.3628 - val_acc: 0.9469
Epoch 230/500
236s 472ms/step - loss: 0.1991 - acc: 0.9916 - val_loss: 0.3524 - val_acc: 0.9471
Epoch 231/500
236s 472ms/step - loss: 0.1983 - acc: 0.9913 - val_loss: 0.3551 - val_acc: 0.9484
Epoch 232/500
236s 472ms/step - loss: 0.1953 - acc: 0.9923 - val_loss: 0.3511 - val_acc: 0.9465
Epoch 233/500
236s 472ms/step - loss: 0.1950 - acc: 0.9917 - val_loss: 0.3513 - val_acc: 0.9472
Epoch 234/500
236s 472ms/step - loss: 0.1930 - acc: 0.9920 - val_loss: 0.3515 - val_acc: 0.9501
Epoch 235/500
236s 472ms/step - loss: 0.1956 - acc: 0.9909 - val_loss: 0.3401 - val_acc: 0.9506
Epoch 236/500
236s 472ms/step - loss: 0.1941 - acc: 0.9909 - val_loss: 0.3536 - val_acc: 0.9471
Epoch 237/500
236s 472ms/step - loss: 0.1961 - acc: 0.9904 - val_loss: 0.3740 - val_acc: 0.9419
Epoch 238/500
236s 471ms/step - loss: 0.1975 - acc: 0.9894 - val_loss: 0.3527 - val_acc: 0.9472
Epoch 239/500
236s 472ms/step - loss: 0.1947 - acc: 0.9908 - val_loss: 0.3384 - val_acc: 0.9506
Epoch 240/500
236s 472ms/step - loss: 0.1910 - acc: 0.9915 - val_loss: 0.3403 - val_acc: 0.9489
Epoch 241/500
236s 472ms/step - loss: 0.1920 - acc: 0.9907 - val_loss: 0.3510 - val_acc: 0.9443
Epoch 242/500
236s 472ms/step - loss: 0.1940 - acc: 0.9899 - val_loss: 0.3439 - val_acc: 0.9487
Epoch 243/500
236s 472ms/step - loss: 0.1885 - acc: 0.9918 - val_loss: 0.3400 - val_acc: 0.9496
Epoch 244/500
236s 471ms/step - loss: 0.1901 - acc: 0.9912 - val_loss: 0.3487 - val_acc: 0.9483
Epoch 245/500
236s 471ms/step - loss: 0.1885 - acc: 0.9916 - val_loss: 0.3321 - val_acc: 0.9526
Epoch 246/500
236s 472ms/step - loss: 0.1887 - acc: 0.9912 - val_loss: 0.3563 - val_acc: 0.9456
Epoch 247/500
236s 471ms/step - loss: 0.1912 - acc: 0.9901 - val_loss: 0.3408 - val_acc: 0.9457
Epoch 248/500
236s 472ms/step - loss: 0.1900 - acc: 0.9907 - val_loss: 0.3511 - val_acc: 0.9462
Epoch 249/500
236s 472ms/step - loss: 0.1902 - acc: 0.9898 - val_loss: 0.3484 - val_acc: 0.9470
Epoch 250/500
236s 472ms/step - loss: 0.1854 - acc: 0.9916 - val_loss: 0.3378 - val_acc: 0.9509
Epoch 251/500
236s 471ms/step - loss: 0.1896 - acc: 0.9899 - val_loss: 0.3470 - val_acc: 0.9466
Epoch 252/500
236s 472ms/step - loss: 0.1885 - acc: 0.9906 - val_loss: 0.3437 - val_acc: 0.9500
Epoch 253/500
236s 472ms/step - loss: 0.1855 - acc: 0.9912 - val_loss: 0.3424 - val_acc: 0.9472
Epoch 254/500
236s 471ms/step - loss: 0.1835 - acc: 0.9914 - val_loss: 0.3507 - val_acc: 0.9477
Epoch 255/500
236s 472ms/step - loss: 0.1887 - acc: 0.9900 - val_loss: 0.3447 - val_acc: 0.9457
Epoch 256/500
236s 472ms/step - loss: 0.1895 - acc: 0.9896 - val_loss: 0.3444 - val_acc: 0.9474
Epoch 257/500
236s 472ms/step - loss: 0.1876 - acc: 0.9904 - val_loss: 0.3446 - val_acc: 0.9448
Epoch 258/500
236s 472ms/step - loss: 0.1864 - acc: 0.9901 - val_loss: 0.3412 - val_acc: 0.9471
Epoch 259/500
236s 471ms/step - loss: 0.1845 - acc: 0.9910 - val_loss: 0.3501 - val_acc: 0.9435
Epoch 260/500
236s 471ms/step - loss: 0.1886 - acc: 0.9891 - val_loss: 0.3388 - val_acc: 0.9468
Epoch 261/500
236s 471ms/step - loss: 0.1876 - acc: 0.9893 - val_loss: 0.3583 - val_acc: 0.9458
Epoch 262/500
236s 471ms/step - loss: 0.1873 - acc: 0.9899 - val_loss: 0.3586 - val_acc: 0.9446
Epoch 263/500
236s 471ms/step - loss: 0.1814 - acc: 0.9917 - val_loss: 0.3447 - val_acc: 0.9480
Epoch 264/500
236s 472ms/step - loss: 0.1851 - acc: 0.9908 - val_loss: 0.3413 - val_acc: 0.9464
Epoch 265/500
236s 471ms/step - loss: 0.1869 - acc: 0.9897 - val_loss: 0.3462 - val_acc: 0.9421
Epoch 266/500
236s 472ms/step - loss: 0.1830 - acc: 0.9908 - val_loss: 0.3249 - val_acc: 0.9518
Epoch 267/500
236s 472ms/step - loss: 0.1854 - acc: 0.9905 - val_loss: 0.3291 - val_acc: 0.9479
Epoch 268/500
236s 471ms/step - loss: 0.1838 - acc: 0.9901 - val_loss: 0.3499 - val_acc: 0.9453
Epoch 269/500
236s 471ms/step - loss: 0.1842 - acc: 0.9903 - val_loss: 0.3493 - val_acc: 0.9468
Epoch 270/500
236s 472ms/step - loss: 0.1858 - acc: 0.9897 - val_loss: 0.3450 - val_acc: 0.9461
Epoch 271/500
236s 471ms/step - loss: 0.1822 - acc: 0.9903 - val_loss: 0.3357 - val_acc: 0.9477
Epoch 272/500
236s 471ms/step - loss: 0.1840 - acc: 0.9904 - val_loss: 0.3432 - val_acc: 0.9452
Epoch 273/500
236s 472ms/step - loss: 0.1873 - acc: 0.9894 - val_loss: 0.3369 - val_acc: 0.9472
Epoch 274/500
236s 472ms/step - loss: 0.1826 - acc: 0.9906 - val_loss: 0.3556 - val_acc: 0.9443
Epoch 275/500
236s 472ms/step - loss: 0.1812 - acc: 0.9905 - val_loss: 0.3580 - val_acc: 0.9474
Epoch 276/500
236s 472ms/step - loss: 0.1835 - acc: 0.9900 - val_loss: 0.3549 - val_acc: 0.9447
Epoch 277/500
236s 471ms/step - loss: 0.1808 - acc: 0.9908 - val_loss: 0.3434 - val_acc: 0.9454
Epoch 278/500
236s 472ms/step - loss: 0.1822 - acc: 0.9904 - val_loss: 0.3398 - val_acc: 0.9476
Epoch 279/500
236s 472ms/step - loss: 0.1825 - acc: 0.9906 - val_loss: 0.3292 - val_acc: 0.9497
Epoch 280/500
236s 471ms/step - loss: 0.1839 - acc: 0.9896 - val_loss: 0.3487 - val_acc: 0.9433
Epoch 281/500
236s 472ms/step - loss: 0.1826 - acc: 0.9905 - val_loss: 0.3433 - val_acc: 0.9464
Epoch 282/500
236s 472ms/step - loss: 0.1808 - acc: 0.9910 - val_loss: 0.3261 - val_acc: 0.9469
Epoch 283/500
236s 472ms/step - loss: 0.1816 - acc: 0.9907 - val_loss: 0.3523 - val_acc: 0.9418
Epoch 284/500
236s 472ms/step - loss: 0.1847 - acc: 0.9895 - val_loss: 0.3297 - val_acc: 0.9477
Epoch 285/500
236s 472ms/step - loss: 0.1813 - acc: 0.9909 - val_loss: 0.3337 - val_acc: 0.9472
Epoch 286/500
236s 472ms/step - loss: 0.1797 - acc: 0.9911 - val_loss: 0.3495 - val_acc: 0.9457
Epoch 287/500
236s 471ms/step - loss: 0.1824 - acc: 0.9905 - val_loss: 0.3378 - val_acc: 0.9471
Epoch 288/500
236s 472ms/step - loss: 0.1815 - acc: 0.9904 - val_loss: 0.3371 - val_acc: 0.9479
Epoch 289/500
236s 472ms/step - loss: 0.1795 - acc: 0.9909 - val_loss: 0.3345 - val_acc: 0.9481
Epoch 290/500
236s 472ms/step - loss: 0.1833 - acc: 0.9896 - val_loss: 0.3221 - val_acc: 0.9505
Epoch 291/500
236s 471ms/step - loss: 0.1798 - acc: 0.9911 - val_loss: 0.3425 - val_acc: 0.9464
Epoch 292/500
236s 471ms/step - loss: 0.1826 - acc: 0.9899 - val_loss: 0.3563 - val_acc: 0.9407
Epoch 293/500
236s 471ms/step - loss: 0.1809 - acc: 0.9906 - val_loss: 0.3398 - val_acc: 0.9463
Epoch 294/500
236s 471ms/step - loss: 0.1835 - acc: 0.9897 - val_loss: 0.3427 - val_acc: 0.9478
Epoch 295/500
236s 471ms/step - loss: 0.1792 - acc: 0.9913 - val_loss: 0.3443 - val_acc: 0.9463
Epoch 296/500
236s 471ms/step - loss: 0.1794 - acc: 0.9907 - val_loss: 0.3456 - val_acc: 0.9457
Epoch 297/500
236s 472ms/step - loss: 0.1751 - acc: 0.9922 - val_loss: 0.3362 - val_acc: 0.9457
Epoch 298/500
236s 472ms/step - loss: 0.1821 - acc: 0.9893 - val_loss: 0.3465 - val_acc: 0.9440
Epoch 299/500
236s 472ms/step - loss: 0.1827 - acc: 0.9893 - val_loss: 0.3462 - val_acc: 0.9443
Epoch 300/500
236s 471ms/step - loss: 0.1793 - acc: 0.9910 - val_loss: 0.3537 - val_acc: 0.9422
Epoch 301/500
lr changed to 0.0009999999776482583
236s 471ms/step - loss: 0.1692 - acc: 0.9943 - val_loss: 0.3201 - val_acc: 0.9535
Epoch 302/500
236s 472ms/step - loss: 0.1627 - acc: 0.9966 - val_loss: 0.3157 - val_acc: 0.9546
Epoch 303/500
236s 471ms/step - loss: 0.1600 - acc: 0.9972 - val_loss: 0.3139 - val_acc: 0.9556
Epoch 304/500
236s 472ms/step - loss: 0.1580 - acc: 0.9980 - val_loss: 0.3134 - val_acc: 0.9543
Epoch 305/500
236s 471ms/step - loss: 0.1571 - acc: 0.9981 - val_loss: 0.3141 - val_acc: 0.9555
Epoch 306/500
236s 472ms/step - loss: 0.1567 - acc: 0.9982 - val_loss: 0.3135 - val_acc: 0.9556
Epoch 307/500
236s 471ms/step - loss: 0.1557 - acc: 0.9983 - val_loss: 0.3120 - val_acc: 0.9546
Epoch 308/500
236s 471ms/step - loss: 0.1544 - acc: 0.9987 - val_loss: 0.3123 - val_acc: 0.9548
Epoch 309/500
236s 472ms/step - loss: 0.1543 - acc: 0.9986 - val_loss: 0.3112 - val_acc: 0.9569
Epoch 310/500
236s 471ms/step - loss: 0.1539 - acc: 0.9988 - val_loss: 0.3101 - val_acc: 0.9567
Epoch 311/500
236s 472ms/step - loss: 0.1534 - acc: 0.9988 - val_loss: 0.3122 - val_acc: 0.9568
Epoch 312/500
236s 471ms/step - loss: 0.1531 - acc: 0.9987 - val_loss: 0.3123 - val_acc: 0.9562
Epoch 313/500
236s 471ms/step - loss: 0.1526 - acc: 0.9985 - val_loss: 0.3129 - val_acc: 0.9559
Epoch 314/500
236s 472ms/step - loss: 0.1526 - acc: 0.9986 - val_loss: 0.3135 - val_acc: 0.9561
Epoch 315/500
236s 471ms/step - loss: 0.1518 - acc: 0.9988 - val_loss: 0.3111 - val_acc: 0.9569
Epoch 316/500
236s 471ms/step - loss: 0.1517 - acc: 0.9986 - val_loss: 0.3121 - val_acc: 0.9563
Epoch 317/500
236s 471ms/step - loss: 0.1511 - acc: 0.9989 - val_loss: 0.3115 - val_acc: 0.9563
Epoch 318/500
236s 471ms/step - loss: 0.1508 - acc: 0.9989 - val_loss: 0.3115 - val_acc: 0.9571
Epoch 319/500
236s 471ms/step - loss: 0.1500 - acc: 0.9990 - val_loss: 0.3102 - val_acc: 0.9576
Epoch 320/500
236s 471ms/step - loss: 0.1502 - acc: 0.9989 - val_loss: 0.3100 - val_acc: 0.9562
Epoch 321/500
236s 472ms/step - loss: 0.1494 - acc: 0.9990 - val_loss: 0.3092 - val_acc: 0.9569
Epoch 322/500
236s 472ms/step - loss: 0.1489 - acc: 0.9990 - val_loss: 0.3092 - val_acc: 0.9562
Epoch 323/500
236s 472ms/step - loss: 0.1492 - acc: 0.9988 - val_loss: 0.3078 - val_acc: 0.9566
Epoch 324/500
236s 472ms/step - loss: 0.1485 - acc: 0.9991 - val_loss: 0.3068 - val_acc: 0.9575
Epoch 325/500
236s 472ms/step - loss: 0.1485 - acc: 0.9991 - val_loss: 0.3066 - val_acc: 0.9567
Epoch 326/500
236s 471ms/step - loss: 0.1479 - acc: 0.9992 - val_loss: 0.3056 - val_acc: 0.9575
Epoch 327/500
236s 471ms/step - loss: 0.1475 - acc: 0.9992 - val_loss: 0.3074 - val_acc: 0.9576
Epoch 328/500
236s 472ms/step - loss: 0.1469 - acc: 0.9994 - val_loss: 0.3073 - val_acc: 0.9571
Epoch 329/500
236s 471ms/step - loss: 0.1472 - acc: 0.9991 - val_loss: 0.3074 - val_acc: 0.9563
Epoch 330/500
236s 471ms/step - loss: 0.1466 - acc: 0.9992 - val_loss: 0.3068 - val_acc: 0.9575
...
Epoch 421/500
234s 469ms/step - loss: 0.1255 - acc: 0.9993 - val_loss: 0.2911 - val_acc: 0.9584
Epoch 422/500
235s 470ms/step - loss: 0.1253 - acc: 0.9995 - val_loss: 0.2907 - val_acc: 0.9574
Epoch 423/500
235s 469ms/step - loss: 0.1248 - acc: 0.9996 - val_loss: 0.2899 - val_acc: 0.9573
Epoch 424/500
235s 469ms/step - loss: 0.1251 - acc: 0.9993 - val_loss: 0.2932 - val_acc: 0.9582
Epoch 425/500
235s 469ms/step - loss: 0.1246 - acc: 0.9994 - val_loss: 0.2925 - val_acc: 0.9579
Epoch 426/500
234s 469ms/step - loss: 0.1245 - acc: 0.9993 - val_loss: 0.2902 - val_acc: 0.9574
Epoch 427/500
234s 469ms/step - loss: 0.1242 - acc: 0.9995 - val_loss: 0.2919 - val_acc: 0.9570
Epoch 428/500
234s 469ms/step - loss: 0.1237 - acc: 0.9995 - val_loss: 0.2930 - val_acc: 0.9577
Epoch 429/500
235s 469ms/step - loss: 0.1239 - acc: 0.9994 - val_loss: 0.2921 - val_acc: 0.9584
Epoch 430/500
234s 469ms/step - loss: 0.1239 - acc: 0.9994 - val_loss: 0.2924 - val_acc: 0.9575
Epoch 431/500
234s 469ms/step - loss: 0.1237 - acc: 0.9993 - val_loss: 0.2931 - val_acc: 0.9587
Epoch 432/500
235s 470ms/step - loss: 0.1232 - acc: 0.9996 - val_loss: 0.2933 - val_acc: 0.9585
Epoch 433/500
235s 470ms/step - loss: 0.1233 - acc: 0.9995 - val_loss: 0.2922 - val_acc: 0.9584
Epoch 434/500
235s 469ms/step - loss: 0.1226 - acc: 0.9996 - val_loss: 0.2936 - val_acc: 0.9582
Epoch 435/500
234s 469ms/step - loss: 0.1230 - acc: 0.9993 - val_loss: 0.2946 - val_acc: 0.9584
Epoch 436/500
234s 469ms/step - loss: 0.1224 - acc: 0.9996 - val_loss: 0.2939 - val_acc: 0.9573
Epoch 437/500
235s 470ms/step - loss: 0.1227 - acc: 0.9992 - val_loss: 0.2949 - val_acc: 0.9577
Epoch 438/500
235s 469ms/step - loss: 0.1222 - acc: 0.9995 - val_loss: 0.2933 - val_acc: 0.9581
Epoch 439/500
235s 469ms/step - loss: 0.1217 - acc: 0.9995 - val_loss: 0.2919 - val_acc: 0.9574
Epoch 440/500
234s 469ms/step - loss: 0.1215 - acc: 0.9995 - val_loss: 0.2917 - val_acc: 0.9583
Epoch 441/500
235s 469ms/step - loss: 0.1213 - acc: 0.9996 - val_loss: 0.2930 - val_acc: 0.9577
Epoch 442/500
234s 469ms/step - loss: 0.1217 - acc: 0.9993 - val_loss: 0.2951 - val_acc: 0.9571
Epoch 443/500
234s 469ms/step - loss: 0.1212 - acc: 0.9995 - val_loss: 0.2957 - val_acc: 0.9566
Epoch 444/500
234s 469ms/step - loss: 0.1207 - acc: 0.9996 - val_loss: 0.2949 - val_acc: 0.9569
Epoch 445/500
234s 469ms/step - loss: 0.1210 - acc: 0.9994 - val_loss: 0.2923 - val_acc: 0.9569
Epoch 446/500
234s 469ms/step - loss: 0.1206 - acc: 0.9994 - val_loss: 0.2919 - val_acc: 0.9573
Epoch 447/500
234s 469ms/step - loss: 0.1209 - acc: 0.9993 - val_loss: 0.2947 - val_acc: 0.9577
Epoch 448/500
234s 469ms/step - loss: 0.1203 - acc: 0.9995 - val_loss: 0.2923 - val_acc: 0.9581
Epoch 449/500
234s 469ms/step - loss: 0.1203 - acc: 0.9994 - val_loss: 0.2926 - val_acc: 0.9579
Epoch 450/500
235s 469ms/step - loss: 0.1199 - acc: 0.9995 - val_loss: 0.2888 - val_acc: 0.9591
Epoch 451/500
lr changed to 9.999999310821295e-05
234s 469ms/step - loss: 0.1197 - acc: 0.9994 - val_loss: 0.2890 - val_acc: 0.9588
Epoch 452/500
235s 470ms/step - loss: 0.1196 - acc: 0.9996 - val_loss: 0.2891 - val_acc: 0.9587
Epoch 453/500
235s 469ms/step - loss: 0.1197 - acc: 0.9997 - val_loss: 0.2887 - val_acc: 0.9589
Epoch 454/500
235s 470ms/step - loss: 0.1197 - acc: 0.9994 - val_loss: 0.2888 - val_acc: 0.9588
Epoch 455/500
234s 469ms/step - loss: 0.1196 - acc: 0.9995 - val_loss: 0.2885 - val_acc: 0.9592
Epoch 456/500
234s 469ms/step - loss: 0.1196 - acc: 0.9994 - val_loss: 0.2887 - val_acc: 0.9586
Epoch 457/500
235s 469ms/step - loss: 0.1195 - acc: 0.9996 - val_loss: 0.2885 - val_acc: 0.9583
Epoch 458/500
234s 469ms/step - loss: 0.1194 - acc: 0.9997 - val_loss: 0.2886 - val_acc: 0.9585
Epoch 459/500
235s 471ms/step - loss: 0.1193 - acc: 0.9996 - val_loss: 0.2887 - val_acc: 0.9585
Epoch 460/500
235s 470ms/step - loss: 0.1191 - acc: 0.9997 - val_loss: 0.2885 - val_acc: 0.9581
Epoch 461/500
235s 470ms/step - loss: 0.1194 - acc: 0.9995 - val_loss: 0.2887 - val_acc: 0.9585
Epoch 462/500
235s 470ms/step - loss: 0.1192 - acc: 0.9996 - val_loss: 0.2886 - val_acc: 0.9584
Epoch 463/500
235s 469ms/step - loss: 0.1192 - acc: 0.9996 - val_loss: 0.2887 - val_acc: 0.9579
Epoch 464/500
234s 469ms/step - loss: 0.1194 - acc: 0.9996 - val_loss: 0.2891 - val_acc: 0.9583
Epoch 465/500
234s 469ms/step - loss: 0.1194 - acc: 0.9995 - val_loss: 0.2888 - val_acc: 0.9585
Epoch 466/500
234s 469ms/step - loss: 0.1189 - acc: 0.9998 - val_loss: 0.2889 - val_acc: 0.9586
Epoch 467/500
234s 469ms/step - loss: 0.1190 - acc: 0.9997 - val_loss: 0.2885 - val_acc: 0.9584
Epoch 468/500
234s 469ms/step - loss: 0.1192 - acc: 0.9995 - val_loss: 0.2882 - val_acc: 0.9582
Epoch 469/500
235s 469ms/step - loss: 0.1192 - acc: 0.9996 - val_loss: 0.2882 - val_acc: 0.9582
Epoch 470/500
235s 469ms/step - loss: 0.1191 - acc: 0.9996 - val_loss: 0.2885 - val_acc: 0.9579
Epoch 471/500
234s 469ms/step - loss: 0.1193 - acc: 0.9995 - val_loss: 0.2885 - val_acc: 0.9580
Epoch 472/500
235s 471ms/step - loss: 0.1195 - acc: 0.9994 - val_loss: 0.2885 - val_acc: 0.9579
Epoch 473/500
235s 470ms/step - loss: 0.1191 - acc: 0.9996 - val_loss: 0.2883 - val_acc: 0.9578
Epoch 474/500
235s 470ms/step - loss: 0.1195 - acc: 0.9994 - val_loss: 0.2887 - val_acc: 0.9579
Epoch 475/500
235s 469ms/step - loss: 0.1192 - acc: 0.9995 - val_loss: 0.2884 - val_acc: 0.9582
Epoch 476/500
234s 469ms/step - loss: 0.1190 - acc: 0.9996 - val_loss: 0.2885 - val_acc: 0.9581
Epoch 477/500
234s 469ms/step - loss: 0.1189 - acc: 0.9997 - val_loss: 0.2888 - val_acc: 0.9581
Epoch 478/500
234s 469ms/step - loss: 0.1190 - acc: 0.9996 - val_loss: 0.2889 - val_acc: 0.9582
Epoch 479/500
234s 469ms/step - loss: 0.1187 - acc: 0.9997 - val_loss: 0.2887 - val_acc: 0.9584
Epoch 480/500
234s 469ms/step - loss: 0.1188 - acc: 0.9996 - val_loss: 0.2882 - val_acc: 0.9581
Epoch 481/500
235s 469ms/step - loss: 0.1191 - acc: 0.9996 - val_loss: 0.2883 - val_acc: 0.9578
Epoch 482/500
235s 471ms/step - loss: 0.1189 - acc: 0.9996 - val_loss: 0.2880 - val_acc: 0.9581
Epoch 483/500
234s 469ms/step - loss: 0.1186 - acc: 0.9996 - val_loss: 0.2881 - val_acc: 0.9582
Epoch 484/500
234s 469ms/step - loss: 0.1188 - acc: 0.9996 - val_loss: 0.2881 - val_acc: 0.9577
Epoch 485/500
234s 469ms/step - loss: 0.1190 - acc: 0.9996 - val_loss: 0.2884 - val_acc: 0.9578
Epoch 486/500
234s 469ms/step - loss: 0.1186 - acc: 0.9997 - val_loss: 0.2884 - val_acc: 0.9579
Epoch 487/500
234s 469ms/step - loss: 0.1189 - acc: 0.9995 - val_loss: 0.2883 - val_acc: 0.9580
Epoch 488/500
237s 473ms/step - loss: 0.1183 - acc: 0.9997 - val_loss: 0.2883 - val_acc: 0.9580
Epoch 489/500
235s 469ms/step - loss: 0.1186 - acc: 0.9997 - val_loss: 0.2885 - val_acc: 0.9578
Epoch 490/500
234s 468ms/step - loss: 0.1186 - acc: 0.9997 - val_loss: 0.2887 - val_acc: 0.9579
Epoch 491/500
234s 468ms/step - loss: 0.1187 - acc: 0.9996 - val_loss: 0.2884 - val_acc: 0.9581
Epoch 492/500
234s 469ms/step - loss: 0.1187 - acc: 0.9994 - val_loss: 0.2881 - val_acc: 0.9579
Epoch 493/500
234s 469ms/step - loss: 0.1185 - acc: 0.9996 - val_loss: 0.2883 - val_acc: 0.9578
Epoch 494/500
234s 469ms/step - loss: 0.1187 - acc: 0.9997 - val_loss: 0.2881 - val_acc: 0.9579
Epoch 495/500
234s 468ms/step - loss: 0.1187 - acc: 0.9997 - val_loss: 0.2880 - val_acc: 0.9579
Epoch 496/500
234s 469ms/step - loss: 0.1185 - acc: 0.9996 - val_loss: 0.2881 - val_acc: 0.9579
Epoch 497/500
234s 469ms/step - loss: 0.1186 - acc: 0.9996 - val_loss: 0.2881 - val_acc: 0.9582
Epoch 498/500
235s 469ms/step - loss: 0.1187 - acc: 0.9995 - val_loss: 0.2883 - val_acc: 0.9581
Epoch 499/500
234s 469ms/step - loss: 0.1184 - acc: 0.9996 - val_loss: 0.2884 - val_acc: 0.9580
Epoch 500/500
234s 469ms/step - loss: 0.1187 - acc: 0.9996 - val_loss: 0.2881 - val_acc: 0.9580
Train loss: 0.11700774446129798
Train accuracy: 0.999960000038147
Test loss: 0.2880836722254753
Test accuracy: 0.9580000066757202

The test accuracy is 95.80%, still a little short of 96%.

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458, Date of Publication: 13 February 2020

https://ieeexplore.ieee.org/document/8998530

