[Harbin Institute of Technology] Dynamic ReLU: Adaptively Parametric ReLU and Keras Code (Tuning Log 22), Cifar10 ~95.25%

This article introduces a dynamic ReLU activation function proposed by a team at Harbin Institute of Technology, namely the adaptively parametric ReLU (APReLU). It was originally applied to fault diagnosis based on one-dimensional vibration signals and allows each sample to have its own unique ReLU parameters. The paper was submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020.

Building on Tuning Log 21, this experiment increases the number of residual blocks from 60 to 120 (in the network definition below, 40 + 1 + 39 + 1 + 39 = 120) and tests the deep residual network with the adaptively parametric ReLU activation function on the Cifar10 image dataset.

The basic principle of the adaptively parametric ReLU activation function is as follows:

[Figure] Adaptively Parametric ReLU: a Dynamic ReLU activation function
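
As a quick illustration (not part of the original program), here is a minimal NumPy sketch of the element-wise operation that APReLU performs. It assumes the per-sample, per-channel scaling coefficients alpha in (0, 1) have already been produced by the small sub-network (global average pooling of the positive and negative parts, two fully-connected layers with batch normalization, and a sigmoid), which is what the aprelu function in the Keras code below computes:

import numpy as np

def aprelu_numpy(x, alpha):
    """x: one sample's feature map, shape (H, W, C); alpha: hypothetical coefficients, shape (C,)."""
    pos = np.maximum(x, 0.0)                    # positive part, identical to ReLU
    neg = np.minimum(x, 0.0)                    # negative part
    return pos + alpha.reshape(1, 1, -1) * neg  # y = max(x, 0) + alpha * min(x, 0)

# Toy usage with hypothetical coefficients; in the real network they are learned per sample
x = np.random.randn(4, 4, 2).astype('float32')
alpha = np.array([0.1, 0.7], dtype='float32')
print(aprelu_numpy(x, alpha).shape)  # (4, 4, 2)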

Keras program:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate, multiply by 0.1 every 150 epochs
def scheduler(epoch):
    if epoch % 150 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs,zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels//16, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,channels))(scales)
    # apply the parametric relu
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = Activation('relu')(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = Activation('relu')(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = aprelu(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero-padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 40, 32, downsample=False)
net = residual_block(net,  1, 32, downsample=True)
net = residual_block(net, 39, 32, downsample=False)
net = residual_block(net,  1, 64, downsample=True)
net = residual_block(net, 39, 64, downsample=False)
net = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range (degrees, 0 to 30)
    rotation_range=30,
    # Range for random zoom
    zoom_range = 0.2,
    # shear angle in counter-clockwise direction in degrees
    shear_range = 30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=500, 
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])

Experimental results:

Using TensorFlow backend.
x_train shape: (50000, 32, 32, 3)
50000 train samples
10000 test samples
Epoch 1/500
318s 637ms/step - loss: 6.0165 - acc: 0.3791 - val_loss: 5.2157 - val_acc: 0.5307
Epoch 2/500
221s 443ms/step - loss: 4.8584 - acc: 0.5361 - val_loss: 4.2761 - val_acc: 0.6383
Epoch 3/500
221s 442ms/step - loss: 4.0487 - acc: 0.6159 - val_loss: 3.5837 - val_acc: 0.6913
Epoch 4/500
221s 442ms/step - loss: 3.4323 - acc: 0.6610 - val_loss: 3.0189 - val_acc: 0.7396
Epoch 5/500
221s 442ms/step - loss: 2.9384 - acc: 0.6943 - val_loss: 2.5795 - val_acc: 0.7697
Epoch 6/500
221s 442ms/step - loss: 2.5470 - acc: 0.7181 - val_loss: 2.2296 - val_acc: 0.7848
Epoch 7/500
221s 442ms/step - loss: 2.2227 - acc: 0.7400 - val_loss: 1.9631 - val_acc: 0.7931
Epoch 8/500
222s 444ms/step - loss: 1.9632 - acc: 0.7546 - val_loss: 1.7318 - val_acc: 0.8098
Epoch 9/500
221s 443ms/step - loss: 1.7535 - acc: 0.7685 - val_loss: 1.5313 - val_acc: 0.8197
Epoch 10/500
221s 442ms/step - loss: 1.5759 - acc: 0.7798 - val_loss: 1.4001 - val_acc: 0.8214
Epoch 11/500
221s 442ms/step - loss: 1.4432 - acc: 0.7859 - val_loss: 1.2776 - val_acc: 0.8309
Epoch 12/500
221s 442ms/step - loss: 1.3201 - acc: 0.7977 - val_loss: 1.1707 - val_acc: 0.8349
Epoch 13/500
222s 443ms/step - loss: 1.2295 - acc: 0.8028 - val_loss: 1.0760 - val_acc: 0.8454
Epoch 14/500
222s 443ms/step - loss: 1.1552 - acc: 0.8069 - val_loss: 1.0225 - val_acc: 0.8432
Epoch 15/500
221s 441ms/step - loss: 1.0964 - acc: 0.8119 - val_loss: 0.9549 - val_acc: 0.8514
Epoch 16/500
221s 442ms/step - loss: 1.0386 - acc: 0.8174 - val_loss: 0.9072 - val_acc: 0.8614
Epoch 17/500
221s 442ms/step - loss: 0.9979 - acc: 0.8204 - val_loss: 0.8765 - val_acc: 0.8566
Epoch 18/500
221s 441ms/step - loss: 0.9611 - acc: 0.8260 - val_loss: 0.8820 - val_acc: 0.8502
Epoch 19/500
220s 441ms/step - loss: 0.9351 - acc: 0.8290 - val_loss: 0.8319 - val_acc: 0.8601
Epoch 20/500
221s 441ms/step - loss: 0.9130 - acc: 0.8295 - val_loss: 0.8077 - val_acc: 0.8643
Epoch 21/500
221s 441ms/step - loss: 0.8837 - acc: 0.8347 - val_loss: 0.7924 - val_acc: 0.8683
Epoch 22/500
221s 441ms/step - loss: 0.8741 - acc: 0.8349 - val_loss: 0.7675 - val_acc: 0.8747
Epoch 23/500
221s 442ms/step - loss: 0.8536 - acc: 0.8403 - val_loss: 0.7988 - val_acc: 0.8599
Epoch 24/500
221s 441ms/step - loss: 0.8457 - acc: 0.8395 - val_loss: 0.7619 - val_acc: 0.8698
Epoch 25/500
221s 441ms/step - loss: 0.8354 - acc: 0.8422 - val_loss: 0.7466 - val_acc: 0.8708
Epoch 26/500
221s 441ms/step - loss: 0.8210 - acc: 0.8449 - val_loss: 0.7481 - val_acc: 0.8714
Epoch 27/500
220s 441ms/step - loss: 0.8155 - acc: 0.8473 - val_loss: 0.7636 - val_acc: 0.8669
Epoch 28/500
220s 441ms/step - loss: 0.8154 - acc: 0.8470 - val_loss: 0.7301 - val_acc: 0.8785
Epoch 29/500
220s 441ms/step - loss: 0.7967 - acc: 0.8537 - val_loss: 0.7206 - val_acc: 0.8811
Epoch 30/500
220s 441ms/step - loss: 0.7961 - acc: 0.8510 - val_loss: 0.7203 - val_acc: 0.8814
Epoch 31/500
221s 441ms/step - loss: 0.7932 - acc: 0.8534 - val_loss: 0.7010 - val_acc: 0.8835
Epoch 32/500
220s 441ms/step - loss: 0.7783 - acc: 0.8585 - val_loss: 0.7239 - val_acc: 0.8797
Epoch 33/500
221s 441ms/step - loss: 0.7744 - acc: 0.8577 - val_loss: 0.7140 - val_acc: 0.8795
Epoch 34/500
221s 442ms/step - loss: 0.7753 - acc: 0.8591 - val_loss: 0.7185 - val_acc: 0.8811
Epoch 35/500
221s 441ms/step - loss: 0.7737 - acc: 0.8575 - val_loss: 0.7251 - val_acc: 0.8752
Epoch 36/500
220s 441ms/step - loss: 0.7666 - acc: 0.8632 - val_loss: 0.7151 - val_acc: 0.8814
Epoch 37/500
221s 441ms/step - loss: 0.7746 - acc: 0.8593 - val_loss: 0.7119 - val_acc: 0.8792
Epoch 38/500
220s 441ms/step - loss: 0.7644 - acc: 0.8631 - val_loss: 0.7091 - val_acc: 0.8819
Epoch 39/500
221s 442ms/step - loss: 0.7620 - acc: 0.8639 - val_loss: 0.7190 - val_acc: 0.8809
Epoch 40/500
221s 441ms/step - loss: 0.7507 - acc: 0.8660 - val_loss: 0.7065 - val_acc: 0.8840
Epoch 41/500
221s 441ms/step - loss: 0.7550 - acc: 0.8658 - val_loss: 0.6998 - val_acc: 0.8858
Epoch 42/500
220s 441ms/step - loss: 0.7546 - acc: 0.8666 - val_loss: 0.7195 - val_acc: 0.8803
Epoch 43/500
221s 441ms/step - loss: 0.7514 - acc: 0.8680 - val_loss: 0.6949 - val_acc: 0.8895
Epoch 44/500
220s 441ms/step - loss: 0.7511 - acc: 0.8661 - val_loss: 0.7011 - val_acc: 0.8872
Epoch 45/500
221s 441ms/step - loss: 0.7431 - acc: 0.8688 - val_loss: 0.7057 - val_acc: 0.8848
Epoch 46/500
221s 441ms/step - loss: 0.7464 - acc: 0.8683 - val_loss: 0.7014 - val_acc: 0.8827
Epoch 47/500
220s 441ms/step - loss: 0.7487 - acc: 0.8678 - val_loss: 0.7002 - val_acc: 0.8859
Epoch 48/500
220s 441ms/step - loss: 0.7453 - acc: 0.8701 - val_loss: 0.6912 - val_acc: 0.8891
Epoch 49/500
220s 441ms/step - loss: 0.7431 - acc: 0.8694 - val_loss: 0.6798 - val_acc: 0.8932
Epoch 50/500
221s 441ms/step - loss: 0.7409 - acc: 0.8726 - val_loss: 0.6813 - val_acc: 0.8949
Epoch 51/500
220s 440ms/step - loss: 0.7370 - acc: 0.8732 - val_loss: 0.7049 - val_acc: 0.8886
Epoch 52/500
220s 441ms/step - loss: 0.7315 - acc: 0.8748 - val_loss: 0.6921 - val_acc: 0.8881
Epoch 53/500
220s 441ms/step - loss: 0.7374 - acc: 0.8728 - val_loss: 0.6728 - val_acc: 0.8990
Epoch 54/500
220s 441ms/step - loss: 0.7326 - acc: 0.8749 - val_loss: 0.6982 - val_acc: 0.8861
Epoch 55/500
221s 441ms/step - loss: 0.7353 - acc: 0.8723 - val_loss: 0.6776 - val_acc: 0.8918
Epoch 56/500
220s 441ms/step - loss: 0.7300 - acc: 0.8752 - val_loss: 0.6822 - val_acc: 0.8923
Epoch 57/500
221s 441ms/step - loss: 0.7321 - acc: 0.8756 - val_loss: 0.6854 - val_acc: 0.8963
Epoch 58/500
221s 441ms/step - loss: 0.7341 - acc: 0.8742 - val_loss: 0.7065 - val_acc: 0.8861
Epoch 59/500
220s 441ms/step - loss: 0.7334 - acc: 0.8749 - val_loss: 0.6815 - val_acc: 0.8960
Epoch 60/500
221s 443ms/step - loss: 0.7250 - acc: 0.8774 - val_loss: 0.6798 - val_acc: 0.8977
Epoch 61/500
222s 444ms/step - loss: 0.7309 - acc: 0.8759 - val_loss: 0.6892 - val_acc: 0.8964
Epoch 62/500
225s 451ms/step - loss: 0.7249 - acc: 0.8784 - val_loss: 0.6967 - val_acc: 0.8923
Epoch 63/500
226s 451ms/step - loss: 0.7291 - acc: 0.8770 - val_loss: 0.7028 - val_acc: 0.8907
Epoch 64/500
225s 451ms/step - loss: 0.7234 - acc: 0.8817 - val_loss: 0.6920 - val_acc: 0.8903
Epoch 65/500
225s 451ms/step - loss: 0.7279 - acc: 0.8787 - val_loss: 0.6723 - val_acc: 0.9003
Epoch 66/500
225s 451ms/step - loss: 0.7229 - acc: 0.8801 - val_loss: 0.6937 - val_acc: 0.8939
Epoch 67/500
225s 450ms/step - loss: 0.7207 - acc: 0.8795 - val_loss: 0.7028 - val_acc: 0.8928
Epoch 68/500
226s 451ms/step - loss: 0.7226 - acc: 0.8804 - val_loss: 0.6830 - val_acc: 0.8942
Epoch 69/500
225s 451ms/step - loss: 0.7210 - acc: 0.8801 - val_loss: 0.6928 - val_acc: 0.8941
Epoch 70/500
225s 451ms/step - loss: 0.7197 - acc: 0.8817 - val_loss: 0.6946 - val_acc: 0.8912
Epoch 71/500
225s 450ms/step - loss: 0.7228 - acc: 0.8799 - val_loss: 0.6721 - val_acc: 0.9010
Epoch 72/500
226s 451ms/step - loss: 0.7195 - acc: 0.8836 - val_loss: 0.6764 - val_acc: 0.9032
Epoch 73/500
225s 450ms/step - loss: 0.7210 - acc: 0.8810 - val_loss: 0.6776 - val_acc: 0.8979
Epoch 74/500
225s 451ms/step - loss: 0.7174 - acc: 0.8819 - val_loss: 0.6784 - val_acc: 0.8965
Epoch 75/500
225s 451ms/step - loss: 0.7144 - acc: 0.8838 - val_loss: 0.6799 - val_acc: 0.8988
Epoch 76/500
225s 451ms/step - loss: 0.7188 - acc: 0.8814 - val_loss: 0.6884 - val_acc: 0.8945
Epoch 77/500
225s 451ms/step - loss: 0.7188 - acc: 0.8833 - val_loss: 0.7054 - val_acc: 0.8915
Epoch 78/500
225s 451ms/step - loss: 0.7147 - acc: 0.8835 - val_loss: 0.6905 - val_acc: 0.8957
Epoch 79/500
225s 451ms/step - loss: 0.7168 - acc: 0.8830 - val_loss: 0.6794 - val_acc: 0.9000
Epoch 80/500
225s 451ms/step - loss: 0.7150 - acc: 0.8829 - val_loss: 0.6843 - val_acc: 0.8957
Epoch 81/500
225s 451ms/step - loss: 0.7102 - acc: 0.8846 - val_loss: 0.6813 - val_acc: 0.8954
Epoch 82/500
226s 451ms/step - loss: 0.7093 - acc: 0.8844 - val_loss: 0.6944 - val_acc: 0.8913
Epoch 83/500
225s 451ms/step - loss: 0.7105 - acc: 0.8840 - val_loss: 0.6791 - val_acc: 0.8964
Epoch 84/500
225s 451ms/step - loss: 0.7068 - acc: 0.8872 - val_loss: 0.6921 - val_acc: 0.8905
Epoch 85/500
225s 451ms/step - loss: 0.7118 - acc: 0.8866 - val_loss: 0.6970 - val_acc: 0.8921
Epoch 86/500
225s 451ms/step - loss: 0.7108 - acc: 0.8842 - val_loss: 0.6891 - val_acc: 0.8955
Epoch 87/500
225s 451ms/step - loss: 0.7105 - acc: 0.8832 - val_loss: 0.6872 - val_acc: 0.8949
Epoch 88/500
225s 451ms/step - loss: 0.7133 - acc: 0.8846 - val_loss: 0.6777 - val_acc: 0.8978
Epoch 89/500
225s 451ms/step - loss: 0.7105 - acc: 0.8853 - val_loss: 0.6784 - val_acc: 0.8953
Epoch 90/500
225s 451ms/step - loss: 0.7031 - acc: 0.8884 - val_loss: 0.6937 - val_acc: 0.8952
Epoch 91/500
225s 451ms/step - loss: 0.7002 - acc: 0.8892 - val_loss: 0.6709 - val_acc: 0.9001
Epoch 92/500
225s 451ms/step - loss: 0.7098 - acc: 0.8863 - val_loss: 0.6674 - val_acc: 0.9002
Epoch 93/500
225s 451ms/step - loss: 0.7034 - acc: 0.8882 - val_loss: 0.7211 - val_acc: 0.8831
Epoch 94/500
225s 450ms/step - loss: 0.7056 - acc: 0.8870 - val_loss: 0.6597 - val_acc: 0.9043
Epoch 95/500
225s 450ms/step - loss: 0.7070 - acc: 0.8861 - val_loss: 0.6682 - val_acc: 0.9026
Epoch 96/500
221s 442ms/step - loss: 0.7015 - acc: 0.8893 - val_loss: 0.6766 - val_acc: 0.9009
Epoch 97/500
224s 448ms/step - loss: 0.7089 - acc: 0.8855 - val_loss: 0.6844 - val_acc: 0.8970
Epoch 98/500
225s 450ms/step - loss: 0.7052 - acc: 0.8885 - val_loss: 0.6668 - val_acc: 0.9040
Epoch 99/500
225s 451ms/step - loss: 0.7072 - acc: 0.8879 - val_loss: 0.6808 - val_acc: 0.8978
Epoch 100/500
225s 451ms/step - loss: 0.7016 - acc: 0.8891 - val_loss: 0.6898 - val_acc: 0.8935
Epoch 101/500
225s 451ms/step - loss: 0.7018 - acc: 0.8888 - val_loss: 0.6803 - val_acc: 0.8980
Epoch 102/500
225s 451ms/step - loss: 0.7099 - acc: 0.8865 - val_loss: 0.6773 - val_acc: 0.8986
Epoch 103/500
225s 451ms/step - loss: 0.7075 - acc: 0.8875 - val_loss: 0.6743 - val_acc: 0.9014
Epoch 104/500
225s 451ms/step - loss: 0.7048 - acc: 0.8881 - val_loss: 0.6627 - val_acc: 0.9064
Epoch 105/500
225s 451ms/step - loss: 0.7041 - acc: 0.8890 - val_loss: 0.6741 - val_acc: 0.9032
Epoch 106/500
226s 451ms/step - loss: 0.7036 - acc: 0.8884 - val_loss: 0.6736 - val_acc: 0.9037
Epoch 107/500
225s 451ms/step - loss: 0.7043 - acc: 0.8882 - val_loss: 0.6758 - val_acc: 0.9005
Epoch 108/500
225s 451ms/step - loss: 0.7024 - acc: 0.8891 - val_loss: 0.6812 - val_acc: 0.8990
Epoch 109/500
225s 450ms/step - loss: 0.7044 - acc: 0.8872 - val_loss: 0.6736 - val_acc: 0.9016
Epoch 110/500
225s 451ms/step - loss: 0.6999 - acc: 0.8913 - val_loss: 0.6756 - val_acc: 0.9007
Epoch 111/500
226s 451ms/step - loss: 0.6951 - acc: 0.8930 - val_loss: 0.6871 - val_acc: 0.8945
Epoch 112/500
226s 451ms/step - loss: 0.6970 - acc: 0.8898 - val_loss: 0.6875 - val_acc: 0.8950
Epoch 113/500
225s 450ms/step - loss: 0.7006 - acc: 0.8902 - val_loss: 0.6711 - val_acc: 0.9032
Epoch 114/500
225s 451ms/step - loss: 0.7000 - acc: 0.8896 - val_loss: 0.6824 - val_acc: 0.8962
Epoch 115/500
225s 450ms/step - loss: 0.6969 - acc: 0.8904 - val_loss: 0.6761 - val_acc: 0.8975
Epoch 116/500
225s 451ms/step - loss: 0.6939 - acc: 0.8913 - val_loss: 0.6924 - val_acc: 0.8974
Epoch 117/500
225s 451ms/step - loss: 0.7028 - acc: 0.8895 - val_loss: 0.6773 - val_acc: 0.9014
Epoch 118/500
225s 450ms/step - loss: 0.6994 - acc: 0.8906 - val_loss: 0.7111 - val_acc: 0.8884
Epoch 119/500
225s 451ms/step - loss: 0.7059 - acc: 0.8889 - val_loss: 0.6947 - val_acc: 0.8955
Epoch 120/500
226s 451ms/step - loss: 0.7000 - acc: 0.8902 - val_loss: 0.6832 - val_acc: 0.8976
Epoch 121/500
225s 451ms/step - loss: 0.6976 - acc: 0.8911 - val_loss: 0.6770 - val_acc: 0.9027
Epoch 122/500
226s 451ms/step - loss: 0.6962 - acc: 0.8918 - val_loss: 0.7034 - val_acc: 0.8925
Epoch 123/500
225s 451ms/step - loss: 0.6908 - acc: 0.8940 - val_loss: 0.6872 - val_acc: 0.8974
Epoch 124/500
225s 450ms/step - loss: 0.7004 - acc: 0.8896 - val_loss: 0.6788 - val_acc: 0.8979
Epoch 125/500
225s 451ms/step - loss: 0.6953 - acc: 0.8924 - val_loss: 0.6973 - val_acc: 0.8933
Epoch 126/500
225s 451ms/step - loss: 0.6998 - acc: 0.8913 - val_loss: 0.6845 - val_acc: 0.8960
Epoch 127/500
225s 451ms/step - loss: 0.7004 - acc: 0.8908 - val_loss: 0.6787 - val_acc: 0.9009
Epoch 128/500
225s 451ms/step - loss: 0.7020 - acc: 0.8898 - val_loss: 0.6899 - val_acc: 0.8970
Epoch 129/500
225s 451ms/step - loss: 0.6948 - acc: 0.8928 - val_loss: 0.6748 - val_acc: 0.9026
Epoch 130/500
221s 443ms/step - loss: 0.6958 - acc: 0.8922 - val_loss: 0.6656 - val_acc: 0.9032
Epoch 131/500
220s 440ms/step - loss: 0.6950 - acc: 0.8929 - val_loss: 0.7022 - val_acc: 0.8930
Epoch 132/500
220s 441ms/step - loss: 0.6973 - acc: 0.8918 - val_loss: 0.6895 - val_acc: 0.8919
Epoch 133/500
220s 441ms/step - loss: 0.6927 - acc: 0.8932 - val_loss: 0.6894 - val_acc: 0.8936
Epoch 134/500
220s 441ms/step - loss: 0.6966 - acc: 0.8910 - val_loss: 0.6820 - val_acc: 0.8989
Epoch 135/500
220s 441ms/step - loss: 0.7001 - acc: 0.8912 - val_loss: 0.6699 - val_acc: 0.9020
Epoch 136/500
221s 441ms/step - loss: 0.6937 - acc: 0.8935 - val_loss: 0.6767 - val_acc: 0.8963
Epoch 137/500
221s 442ms/step - loss: 0.6915 - acc: 0.8933 - val_loss: 0.6711 - val_acc: 0.9046
Epoch 138/500
221s 441ms/step - loss: 0.6897 - acc: 0.8931 - val_loss: 0.6808 - val_acc: 0.8977
Epoch 139/500
220s 440ms/step - loss: 0.6931 - acc: 0.8921 - val_loss: 0.6911 - val_acc: 0.8960
Epoch 140/500
221s 441ms/step - loss: 0.6956 - acc: 0.8927 - val_loss: 0.6699 - val_acc: 0.9041
Epoch 141/500
220s 441ms/step - loss: 0.6909 - acc: 0.8939 - val_loss: 0.6755 - val_acc: 0.8995
Epoch 142/500
221s 441ms/step - loss: 0.6919 - acc: 0.8943 - val_loss: 0.6701 - val_acc: 0.9018
Epoch 143/500
221s 442ms/step - loss: 0.6932 - acc: 0.8930 - val_loss: 0.6764 - val_acc: 0.9030
Epoch 144/500
221s 441ms/step - loss: 0.6964 - acc: 0.8930 - val_loss: 0.6952 - val_acc: 0.8951
Epoch 145/500
220s 441ms/step - loss: 0.6910 - acc: 0.8926 - val_loss: 0.6635 - val_acc: 0.9064
Epoch 146/500
220s 441ms/step - loss: 0.6973 - acc: 0.8925 - val_loss: 0.6861 - val_acc: 0.8976
Epoch 147/500
220s 440ms/step - loss: 0.6910 - acc: 0.8927 - val_loss: 0.6739 - val_acc: 0.9041
Epoch 148/500
220s 441ms/step - loss: 0.6919 - acc: 0.8936 - val_loss: 0.6705 - val_acc: 0.9049
Epoch 149/500
220s 441ms/step - loss: 0.6925 - acc: 0.8936 - val_loss: 0.6694 - val_acc: 0.9025
Epoch 150/500
220s 441ms/step - loss: 0.6944 - acc: 0.8928 - val_loss: 0.6793 - val_acc: 0.8986
Epoch 151/500
lr changed to 0.010000000149011612
220s 441ms/step - loss: 0.5821 - acc: 0.9317 - val_loss: 0.5776 - val_acc: 0.9323
Epoch 152/500
220s 441ms/step - loss: 0.5235 - acc: 0.9495 - val_loss: 0.5587 - val_acc: 0.9370
Epoch 153/500
220s 441ms/step - loss: 0.5024 - acc: 0.9543 - val_loss: 0.5500 - val_acc: 0.9381
Epoch 154/500
221s 441ms/step - loss: 0.4852 - acc: 0.9583 - val_loss: 0.5434 - val_acc: 0.9393
Epoch 155/500
220s 440ms/step - loss: 0.4739 - acc: 0.9607 - val_loss: 0.5420 - val_acc: 0.9374
Epoch 156/500
220s 440ms/step - loss: 0.4595 - acc: 0.9631 - val_loss: 0.5295 - val_acc: 0.9397
Epoch 157/500
221s 441ms/step - loss: 0.4497 - acc: 0.9647 - val_loss: 0.5211 - val_acc: 0.9406
Epoch 158/500
220s 441ms/step - loss: 0.4421 - acc: 0.9653 - val_loss: 0.5143 - val_acc: 0.9411
Epoch 159/500
221s 441ms/step - loss: 0.4317 - acc: 0.9660 - val_loss: 0.5100 - val_acc: 0.9416
Epoch 160/500
221s 441ms/step - loss: 0.4200 - acc: 0.9692 - val_loss: 0.5001 - val_acc: 0.9459
Epoch 161/500
221s 441ms/step - loss: 0.4136 - acc: 0.9686 - val_loss: 0.4992 - val_acc: 0.9447
Epoch 162/500
220s 440ms/step - loss: 0.4050 - acc: 0.9708 - val_loss: 0.4958 - val_acc: 0.9420
Epoch 163/500
220s 441ms/step - loss: 0.4000 - acc: 0.9709 - val_loss: 0.4927 - val_acc: 0.9432
Epoch 164/500
220s 441ms/step - loss: 0.3903 - acc: 0.9721 - val_loss: 0.4920 - val_acc: 0.9431
Epoch 165/500
220s 441ms/step - loss: 0.3828 - acc: 0.9730 - val_loss: 0.4873 - val_acc: 0.9426
Epoch 166/500
220s 441ms/step - loss: 0.3785 - acc: 0.9733 - val_loss: 0.4890 - val_acc: 0.9394
Epoch 167/500
220s 441ms/step - loss: 0.3722 - acc: 0.9735 - val_loss: 0.4821 - val_acc: 0.9408
Epoch 168/500
221s 441ms/step - loss: 0.3642 - acc: 0.9755 - val_loss: 0.4671 - val_acc: 0.9428
Epoch 169/500
220s 441ms/step - loss: 0.3602 - acc: 0.9751 - val_loss: 0.4627 - val_acc: 0.9444
Epoch 170/500
220s 441ms/step - loss: 0.3544 - acc: 0.9756 - val_loss: 0.4749 - val_acc: 0.9396
Epoch 171/500
221s 441ms/step - loss: 0.3498 - acc: 0.9758 - val_loss: 0.4694 - val_acc: 0.9420
Epoch 172/500
221s 442ms/step - loss: 0.3465 - acc: 0.9761 - val_loss: 0.4702 - val_acc: 0.9391
Epoch 173/500
220s 441ms/step - loss: 0.3428 - acc: 0.9761 - val_loss: 0.4564 - val_acc: 0.9429
Epoch 174/500
224s 447ms/step - loss: 0.3382 - acc: 0.9763 - val_loss: 0.4583 - val_acc: 0.9406
Epoch 175/500
221s 442ms/step - loss: 0.3277 - acc: 0.9789 - val_loss: 0.4522 - val_acc: 0.9418
Epoch 176/500
224s 448ms/step - loss: 0.3287 - acc: 0.9769 - val_loss: 0.4466 - val_acc: 0.9420
Epoch 177/500
225s 450ms/step - loss: 0.3249 - acc: 0.9766 - val_loss: 0.4433 - val_acc: 0.9435
Epoch 178/500
225s 450ms/step - loss: 0.3182 - acc: 0.9785 - val_loss: 0.4391 - val_acc: 0.9420
Epoch 179/500
223s 447ms/step - loss: 0.3148 - acc: 0.9781 - val_loss: 0.4420 - val_acc: 0.9387
Epoch 180/500
221s 442ms/step - loss: 0.3119 - acc: 0.9781 - val_loss: 0.4455 - val_acc: 0.9399
Epoch 181/500
222s 443ms/step - loss: 0.3051 - acc: 0.9795 - val_loss: 0.4441 - val_acc: 0.9388
Epoch 182/500
225s 450ms/step - loss: 0.3056 - acc: 0.9792 - val_loss: 0.4391 - val_acc: 0.9390
Epoch 183/500
224s 448ms/step - loss: 0.3042 - acc: 0.9779 - val_loss: 0.4373 - val_acc: 0.9397
Epoch 184/500
221s 442ms/step - loss: 0.2972 - acc: 0.9791 - val_loss: 0.4329 - val_acc: 0.9398
Epoch 185/500
225s 450ms/step - loss: 0.2929 - acc: 0.9800 - val_loss: 0.4325 - val_acc: 0.9401
Epoch 186/500
225s 450ms/step - loss: 0.2905 - acc: 0.9807 - val_loss: 0.4300 - val_acc: 0.9392
Epoch 187/500
225s 450ms/step - loss: 0.2876 - acc: 0.9803 - val_loss: 0.4242 - val_acc: 0.9405
Epoch 188/500
225s 450ms/step - loss: 0.2842 - acc: 0.9806 - val_loss: 0.4245 - val_acc: 0.9396
Epoch 189/500
225s 450ms/step - loss: 0.2856 - acc: 0.9785 - val_loss: 0.4227 - val_acc: 0.9406
Epoch 190/500
224s 448ms/step - loss: 0.2823 - acc: 0.9794 - val_loss: 0.4057 - val_acc: 0.9425
Epoch 191/500
221s 442ms/step - loss: 0.2757 - acc: 0.9811 - val_loss: 0.4065 - val_acc: 0.9422
Epoch 192/500
221s 442ms/step - loss: 0.2775 - acc: 0.9799 - val_loss: 0.4066 - val_acc: 0.9430
Epoch 193/500
222s 445ms/step - loss: 0.2756 - acc: 0.9796 - val_loss: 0.4032 - val_acc: 0.9419
Epoch 194/500
225s 450ms/step - loss: 0.2733 - acc: 0.9795 - val_loss: 0.4105 - val_acc: 0.9391
Epoch 195/500
225s 451ms/step - loss: 0.2689 - acc: 0.9809 - val_loss: 0.4044 - val_acc: 0.9418
Epoch 196/500
225s 450ms/step - loss: 0.2678 - acc: 0.9802 - val_loss: 0.3969 - val_acc: 0.9425
Epoch 197/500
225s 450ms/step - loss: 0.2612 - acc: 0.9825 - val_loss: 0.3984 - val_acc: 0.9437
Epoch 198/500
225s 450ms/step - loss: 0.2686 - acc: 0.9783 - val_loss: 0.4037 - val_acc: 0.9374
Epoch 199/500
225s 450ms/step - loss: 0.2650 - acc: 0.9796 - val_loss: 0.3946 - val_acc: 0.9383
Epoch 200/500
225s 450ms/step - loss: 0.2595 - acc: 0.9804 - val_loss: 0.3933 - val_acc: 0.9401
Epoch 201/500
225s 450ms/step - loss: 0.2575 - acc: 0.9813 - val_loss: 0.3920 - val_acc: 0.9396
Epoch 202/500
225s 450ms/step - loss: 0.2610 - acc: 0.9788 - val_loss: 0.3916 - val_acc: 0.9383
Epoch 203/500
224s 448ms/step - loss: 0.2591 - acc: 0.9796 - val_loss: 0.4071 - val_acc: 0.9366
Epoch 204/500
221s 442ms/step - loss: 0.2575 - acc: 0.9794 - val_loss: 0.3900 - val_acc: 0.9390
Epoch 205/500
224s 448ms/step - loss: 0.2534 - acc: 0.9801 - val_loss: 0.3909 - val_acc: 0.9394
Epoch 206/500
225s 450ms/step - loss: 0.2505 - acc: 0.9813 - val_loss: 0.3957 - val_acc: 0.9407
Epoch 207/500
222s 443ms/step - loss: 0.2473 - acc: 0.9808 - val_loss: 0.3851 - val_acc: 0.9403
Epoch 208/500
220s 441ms/step - loss: 0.2490 - acc: 0.9803 - val_loss: 0.3753 - val_acc: 0.9435
Epoch 209/500
220s 441ms/step - loss: 0.2467 - acc: 0.9808 - val_loss: 0.3765 - val_acc: 0.9431
Epoch 210/500
220s 441ms/step - loss: 0.2457 - acc: 0.9805 - val_loss: 0.3830 - val_acc: 0.9407
Epoch 211/500
220s 441ms/step - loss: 0.2430 - acc: 0.9815 - val_loss: 0.3849 - val_acc: 0.9414
Epoch 212/500
221s 441ms/step - loss: 0.2483 - acc: 0.9789 - val_loss: 0.3818 - val_acc: 0.9407
Epoch 213/500
221s 441ms/step - loss: 0.2394 - acc: 0.9812 - val_loss: 0.3814 - val_acc: 0.9384
Epoch 214/500
220s 441ms/step - loss: 0.2425 - acc: 0.9797 - val_loss: 0.3818 - val_acc: 0.9400
Epoch 215/500
221s 441ms/step - loss: 0.2417 - acc: 0.9802 - val_loss: 0.3813 - val_acc: 0.9381
Epoch 216/500
220s 441ms/step - loss: 0.2433 - acc: 0.9790 - val_loss: 0.3790 - val_acc: 0.9382
Epoch 217/500
220s 441ms/step - loss: 0.2413 - acc: 0.9800 - val_loss: 0.3832 - val_acc: 0.9388
Epoch 218/500
221s 442ms/step - loss: 0.2386 - acc: 0.9798 - val_loss: 0.3793 - val_acc: 0.9377
Epoch 219/500
220s 441ms/step - loss: 0.2396 - acc: 0.9797 - val_loss: 0.3909 - val_acc: 0.9340
Epoch 220/500
221s 441ms/step - loss: 0.2376 - acc: 0.9796 - val_loss: 0.3930 - val_acc: 0.9364
Epoch 221/500
221s 441ms/step - loss: 0.2365 - acc: 0.9806 - val_loss: 0.3738 - val_acc: 0.9370
Epoch 222/500
220s 441ms/step - loss: 0.2398 - acc: 0.9785 - val_loss: 0.3940 - val_acc: 0.9340
Epoch 223/500
220s 441ms/step - loss: 0.2359 - acc: 0.9800 - val_loss: 0.3768 - val_acc: 0.9411
Epoch 224/500
221s 441ms/step - loss: 0.2365 - acc: 0.9795 - val_loss: 0.3841 - val_acc: 0.9354
Epoch 225/500
221s 442ms/step - loss: 0.2353 - acc: 0.9802 - val_loss: 0.3856 - val_acc: 0.9374
Epoch 226/500
221s 441ms/step - loss: 0.2389 - acc: 0.9783 - val_loss: 0.3753 - val_acc: 0.9379
Epoch 227/500
220s 441ms/step - loss: 0.2312 - acc: 0.9809 - val_loss: 0.3766 - val_acc: 0.9403
Epoch 228/500
220s 441ms/step - loss: 0.2394 - acc: 0.9772 - val_loss: 0.3825 - val_acc: 0.9374
Epoch 229/500
220s 440ms/step - loss: 0.2333 - acc: 0.9795 - val_loss: 0.3886 - val_acc: 0.9352
Epoch 230/500
220s 441ms/step - loss: 0.2290 - acc: 0.9804 - val_loss: 0.3754 - val_acc: 0.9375
Epoch 231/500
221s 441ms/step - loss: 0.2297 - acc: 0.9804 - val_loss: 0.3832 - val_acc: 0.9370
Epoch 232/500
221s 442ms/step - loss: 0.2333 - acc: 0.9790 - val_loss: 0.3736 - val_acc: 0.9388
Epoch 233/500
221s 442ms/step - loss: 0.2344 - acc: 0.9781 - val_loss: 0.3842 - val_acc: 0.9363
Epoch 234/500
220s 441ms/step - loss: 0.2314 - acc: 0.9797 - val_loss: 0.3821 - val_acc: 0.9355
Epoch 235/500
220s 440ms/step - loss: 0.2304 - acc: 0.9794 - val_loss: 0.3787 - val_acc: 0.9368
Epoch 236/500
221s 442ms/step - loss: 0.2330 - acc: 0.9784 - val_loss: 0.3721 - val_acc: 0.9369
Epoch 237/500
220s 440ms/step - loss: 0.2317 - acc: 0.9788 - val_loss: 0.3697 - val_acc: 0.9387
Epoch 238/500
221s 441ms/step - loss: 0.2286 - acc: 0.9792 - val_loss: 0.3800 - val_acc: 0.9375
Epoch 239/500
220s 441ms/step - loss: 0.2312 - acc: 0.9788 - val_loss: 0.3691 - val_acc: 0.9399
Epoch 240/500
220s 441ms/step - loss: 0.2300 - acc: 0.9790 - val_loss: 0.3751 - val_acc: 0.9399
Epoch 241/500
220s 441ms/step - loss: 0.2266 - acc: 0.9799 - val_loss: 0.3759 - val_acc: 0.9363
Epoch 242/500
220s 441ms/step - loss: 0.2308 - acc: 0.9785 - val_loss: 0.3801 - val_acc: 0.9365
Epoch 243/500
220s 440ms/step - loss: 0.2270 - acc: 0.9796 - val_loss: 0.3688 - val_acc: 0.9390
Epoch 244/500
220s 441ms/step - loss: 0.2259 - acc: 0.9799 - val_loss: 0.3671 - val_acc: 0.9404
Epoch 245/500
220s 441ms/step - loss: 0.2261 - acc: 0.9811 - val_loss: 0.3679 - val_acc: 0.9365
Epoch 246/500
220s 441ms/step - loss: 0.2266 - acc: 0.9792 - val_loss: 0.3778 - val_acc: 0.9353
Epoch 247/500
221s 441ms/step - loss: 0.2276 - acc: 0.9788 - val_loss: 0.3714 - val_acc: 0.9368
Epoch 248/500
221s 441ms/step - loss: 0.2247 - acc: 0.9798 - val_loss: 0.3816 - val_acc: 0.9332
Epoch 249/500
220s 441ms/step - loss: 0.2263 - acc: 0.9793 - val_loss: 0.3611 - val_acc: 0.9409
Epoch 250/500
220s 441ms/step - loss: 0.2289 - acc: 0.9784 - val_loss: 0.3810 - val_acc: 0.9349
Epoch 251/500
220s 440ms/step - loss: 0.2283 - acc: 0.9776 - val_loss: 0.3684 - val_acc: 0.9353
Epoch 252/500
220s 440ms/step - loss: 0.2269 - acc: 0.9789 - val_loss: 0.3777 - val_acc: 0.9352
Epoch 253/500
220s 441ms/step - loss: 0.2251 - acc: 0.9795 - val_loss: 0.3760 - val_acc: 0.9355
Epoch 254/500
220s 441ms/step - loss: 0.2305 - acc: 0.9773 - val_loss: 0.3834 - val_acc: 0.9354
Epoch 255/500
221s 441ms/step - loss: 0.2241 - acc: 0.9790 - val_loss: 0.3709 - val_acc: 0.9379
Epoch 256/500
220s 440ms/step - loss: 0.2255 - acc: 0.9788 - val_loss: 0.3664 - val_acc: 0.9367
Epoch 257/500
220s 441ms/step - loss: 0.2235 - acc: 0.9799 - val_loss: 0.3739 - val_acc: 0.9364
Epoch 258/500
221s 441ms/step - loss: 0.2268 - acc: 0.9788 - val_loss: 0.3718 - val_acc: 0.9358
Epoch 259/500
220s 440ms/step - loss: 0.2211 - acc: 0.9799 - val_loss: 0.3787 - val_acc: 0.9360
Epoch 260/500
221s 441ms/step - loss: 0.2253 - acc: 0.9784 - val_loss: 0.3616 - val_acc: 0.9384
Epoch 261/500
220s 441ms/step - loss: 0.2215 - acc: 0.9803 - val_loss: 0.3872 - val_acc: 0.9318
Epoch 262/500
220s 441ms/step - loss: 0.2277 - acc: 0.9779 - val_loss: 0.3808 - val_acc: 0.9360
Epoch 263/500
221s 442ms/step - loss: 0.2268 - acc: 0.9779 - val_loss: 0.3859 - val_acc: 0.9343
Epoch 264/500
220s 441ms/step - loss: 0.2246 - acc: 0.9791 - val_loss: 0.3848 - val_acc: 0.9330
Epoch 265/500
221s 441ms/step - loss: 0.2246 - acc: 0.9783 - val_loss: 0.3800 - val_acc: 0.9354
Epoch 266/500
220s 441ms/step - loss: 0.2260 - acc: 0.9786 - val_loss: 0.3780 - val_acc: 0.9337
Epoch 267/500
221s 441ms/step - loss: 0.2216 - acc: 0.9804 - val_loss: 0.3744 - val_acc: 0.9373
Epoch 268/500
221s 441ms/step - loss: 0.2208 - acc: 0.9807 - val_loss: 0.3647 - val_acc: 0.9394
Epoch 269/500
221s 441ms/step - loss: 0.2247 - acc: 0.9789 - val_loss: 0.3728 - val_acc: 0.9348
Epoch 270/500
221s 441ms/step - loss: 0.2190 - acc: 0.9804 - val_loss: 0.3703 - val_acc: 0.9366
Epoch 271/500
221s 441ms/step - loss: 0.2213 - acc: 0.9798 - val_loss: 0.3617 - val_acc: 0.9370
Epoch 272/500
221s 441ms/step - loss: 0.2255 - acc: 0.9776 - val_loss: 0.3695 - val_acc: 0.9377
Epoch 273/500
220s 441ms/step - loss: 0.2245 - acc: 0.9781 - val_loss: 0.3775 - val_acc: 0.9349
Epoch 274/500
221s 441ms/step - loss: 0.2225 - acc: 0.9785 - val_loss: 0.3806 - val_acc: 0.9345
Epoch 275/500
221s 441ms/step - loss: 0.2229 - acc: 0.9794 - val_loss: 0.3718 - val_acc: 0.9373
Epoch 276/500
221s 441ms/step - loss: 0.2195 - acc: 0.9806 - val_loss: 0.3849 - val_acc: 0.9339
Epoch 277/500
221s 441ms/step - loss: 0.2204 - acc: 0.9796 - val_loss: 0.3656 - val_acc: 0.9390
Epoch 278/500
221s 441ms/step - loss: 0.2195 - acc: 0.9800 - val_loss: 0.3760 - val_acc: 0.9374
Epoch 279/500
220s 441ms/step - loss: 0.2240 - acc: 0.9790 - val_loss: 0.3694 - val_acc: 0.9344
Epoch 280/500
220s 441ms/step - loss: 0.2203 - acc: 0.9800 - val_loss: 0.3602 - val_acc: 0.9386
Epoch 281/500
221s 441ms/step - loss: 0.2201 - acc: 0.9801 - val_loss: 0.3794 - val_acc: 0.9354
Epoch 282/500
221s 441ms/step - loss: 0.2208 - acc: 0.9802 - val_loss: 0.3660 - val_acc: 0.9377
Epoch 283/500
220s 441ms/step - loss: 0.2164 - acc: 0.9808 - val_loss: 0.3827 - val_acc: 0.9327
Epoch 284/500
221s 441ms/step - loss: 0.2227 - acc: 0.9785 - val_loss: 0.3633 - val_acc: 0.9401
Epoch 285/500
221s 441ms/step - loss: 0.2184 - acc: 0.9808 - val_loss: 0.3862 - val_acc: 0.9309
Epoch 286/500
221s 441ms/step - loss: 0.2159 - acc: 0.9814 - val_loss: 0.3762 - val_acc: 0.9375
Epoch 287/500
221s 442ms/step - loss: 0.2238 - acc: 0.9775 - val_loss: 0.3692 - val_acc: 0.9336
Epoch 288/500
221s 441ms/step - loss: 0.2228 - acc: 0.9786 - val_loss: 0.3746 - val_acc: 0.9354
Epoch 289/500
220s 441ms/step - loss: 0.2197 - acc: 0.9805 - val_loss: 0.3581 - val_acc: 0.9392
Epoch 290/500
222s 444ms/step - loss: 0.2174 - acc: 0.9806 - val_loss: 0.3626 - val_acc: 0.9376
Epoch 291/500
221s 441ms/step - loss: 0.2201 - acc: 0.9796 - val_loss: 0.3834 - val_acc: 0.9323
Epoch 292/500
221s 442ms/step - loss: 0.2217 - acc: 0.9788 - val_loss: 0.3770 - val_acc: 0.9356
Epoch 293/500
221s 442ms/step - loss: 0.2214 - acc: 0.9791 - val_loss: 0.3685 - val_acc: 0.9359
Epoch 294/500
221s 442ms/step - loss: 0.2186 - acc: 0.9795 - val_loss: 0.3708 - val_acc: 0.9375
Epoch 295/500
221s 442ms/step - loss: 0.2191 - acc: 0.9798 - val_loss: 0.3763 - val_acc: 0.9367
Epoch 296/500
221s 442ms/step - loss: 0.2200 - acc: 0.9803 - val_loss: 0.3730 - val_acc: 0.9362
Epoch 297/500
221s 442ms/step - loss: 0.2207 - acc: 0.9795 - val_loss: 0.3731 - val_acc: 0.9350
Epoch 298/500
221s 441ms/step - loss: 0.2197 - acc: 0.9793 - val_loss: 0.3533 - val_acc: 0.9387
Epoch 299/500
221s 441ms/step - loss: 0.2201 - acc: 0.9797 - val_loss: 0.3747 - val_acc: 0.9365
Epoch 300/500
221s 441ms/step - loss: 0.2160 - acc: 0.9809 - val_loss: 0.3678 - val_acc: 0.9386
Epoch 301/500
lr changed to 0.0009999999776482583
221s 442ms/step - loss: 0.2016 - acc: 0.9862 - val_loss: 0.3429 - val_acc: 0.9460
Epoch 302/500
221s 442ms/step - loss: 0.1867 - acc: 0.9912 - val_loss: 0.3401 - val_acc: 0.9479
Epoch 303/500
221s 442ms/step - loss: 0.1819 - acc: 0.9931 - val_loss: 0.3386 - val_acc: 0.9472
Epoch 304/500
221s 441ms/step - loss: 0.1794 - acc: 0.9943 - val_loss: 0.3365 - val_acc: 0.9486
Epoch 305/500
221s 442ms/step - loss: 0.1787 - acc: 0.9938 - val_loss: 0.3357 - val_acc: 0.9490
Epoch 306/500
221s 442ms/step - loss: 0.1760 - acc: 0.9951 - val_loss: 0.3340 - val_acc: 0.9482
Epoch 307/500
221s 442ms/step - loss: 0.1769 - acc: 0.9943 - val_loss: 0.3335 - val_acc: 0.9489
Epoch 308/500
221s 442ms/step - loss: 0.1746 - acc: 0.9952 - val_loss: 0.3342 - val_acc: 0.9501
Epoch 309/500
220s 441ms/step - loss: 0.1731 - acc: 0.9955 - val_loss: 0.3358 - val_acc: 0.9494
Epoch 310/500
221s 441ms/step - loss: 0.1727 - acc: 0.9957 - val_loss: 0.3339 - val_acc: 0.9501
Epoch 311/500
221s 441ms/step - loss: 0.1720 - acc: 0.9958 - val_loss: 0.3305 - val_acc: 0.9511
Epoch 312/500
221s 442ms/step - loss: 0.1715 - acc: 0.9960 - val_loss: 0.3325 - val_acc: 0.9510
Epoch 313/500
221s 441ms/step - loss: 0.1713 - acc: 0.9956 - val_loss: 0.3348 - val_acc: 0.9495
Epoch 314/500
221s 442ms/step - loss: 0.1697 - acc: 0.9963 - val_loss: 0.3338 - val_acc: 0.9500
Epoch 315/500
221s 441ms/step - loss: 0.1693 - acc: 0.9965 - val_loss: 0.3344 - val_acc: 0.9500
Epoch 316/500
221s 442ms/step - loss: 0.1687 - acc: 0.9960 - val_loss: 0.3332 - val_acc: 0.9507
Epoch 317/500
221s 442ms/step - loss: 0.1673 - acc: 0.9967 - val_loss: 0.3317 - val_acc: 0.9504
Epoch 318/500
221s 442ms/step - loss: 0.1678 - acc: 0.9965 - val_loss: 0.3321 - val_acc: 0.9502
Epoch 319/500
221s 442ms/step - loss: 0.1668 - acc: 0.9968 - val_loss: 0.3320 - val_acc: 0.9495
Epoch 320/500
221s 442ms/step - loss: 0.1671 - acc: 0.9965 - val_loss: 0.3326 - val_acc: 0.9493
Epoch 321/500
221s 442ms/step - loss: 0.1651 - acc: 0.9973 - val_loss: 0.3311 - val_acc: 0.9510
Epoch 322/500
221s 442ms/step - loss: 0.1659 - acc: 0.9967 - val_loss: 0.3320 - val_acc: 0.9498
Epoch 323/500
221s 441ms/step - loss: 0.1659 - acc: 0.9965 - val_loss: 0.3319 - val_acc: 0.9506
Epoch 324/500
221s 441ms/step - loss: 0.1648 - acc: 0.9968 - val_loss: 0.3337 - val_acc: 0.9505
Epoch 325/500
221s 442ms/step - loss: 0.1645 - acc: 0.9967 - val_loss: 0.3342 - val_acc: 0.9495
Epoch 326/500
221s 442ms/step - loss: 0.1640 - acc: 0.9971 - val_loss: 0.3324 - val_acc: 0.9495
Epoch 327/500
221s 442ms/step - loss: 0.1630 - acc: 0.9972 - val_loss: 0.3289 - val_acc: 0.9507
Epoch 328/500
221s 442ms/step - loss: 0.1630 - acc: 0.9972 - val_loss: 0.3306 - val_acc: 0.9512
Epoch 329/500
221s 442ms/step - loss: 0.1636 - acc: 0.9967 - val_loss: 0.3330 - val_acc: 0.9507
Epoch 330/500
221s 442ms/step - loss: 0.1622 - acc: 0.9973 - val_loss: 0.3326 - val_acc: 0.9501
Epoch 331/500
221s 441ms/step - loss: 0.1612 - acc: 0.9975 - val_loss: 0.3305 - val_acc: 0.9514
Epoch 332/500
221s 441ms/step - loss: 0.1600 - acc: 0.9979 - val_loss: 0.3299 - val_acc: 0.9517
Epoch 333/500
221s 441ms/step - loss: 0.1597 - acc: 0.9980 - val_loss: 0.3313 - val_acc: 0.9511
Epoch 334/500
221s 441ms/step - loss: 0.1607 - acc: 0.9972 - val_loss: 0.3278 - val_acc: 0.9517
Epoch 335/500
221s 443ms/step - loss: 0.1607 - acc: 0.9974 - val_loss: 0.3277 - val_acc: 0.9527
Epoch 336/500
221s 441ms/step - loss: 0.1588 - acc: 0.9980 - val_loss: 0.3276 - val_acc: 0.9527
Epoch 337/500
221s 442ms/step - loss: 0.1595 - acc: 0.9973 - val_loss: 0.3257 - val_acc: 0.9520
Epoch 338/500
221s 441ms/step - loss: 0.1585 - acc: 0.9976 - val_loss: 0.3274 - val_acc: 0.9521
Epoch 339/500
221s 442ms/step - loss: 0.1594 - acc: 0.9974 - val_loss: 0.3298 - val_acc: 0.9524
Epoch 340/500
221s 441ms/step - loss: 0.1582 - acc: 0.9977 - val_loss: 0.3282 - val_acc: 0.9530
Epoch 341/500
221s 441ms/step - loss: 0.1590 - acc: 0.9972 - val_loss: 0.3273 - val_acc: 0.9527
Epoch 342/500
221s 441ms/step - loss: 0.1575 - acc: 0.9977 - val_loss: 0.3262 - val_acc: 0.9518
Epoch 343/500
221s 442ms/step - loss: 0.1570 - acc: 0.9979 - val_loss: 0.3263 - val_acc: 0.9515
Epoch 344/500
221s 442ms/step - loss: 0.1574 - acc: 0.9977 - val_loss: 0.3259 - val_acc: 0.9511
Epoch 345/500
221s 442ms/step - loss: 0.1568 - acc: 0.9979 - val_loss: 0.3262 - val_acc: 0.9518
Epoch 346/500
221s 442ms/step - loss: 0.1565 - acc: 0.9978 - val_loss: 0.3262 - val_acc: 0.9531
Epoch 347/500
221s 442ms/step - loss: 0.1558 - acc: 0.9979 - val_loss: 0.3248 - val_acc: 0.9538
...
Epoch 425/500
221s 442ms/step - loss: 0.1356 - acc: 0.9984 - val_loss: 0.3168 - val_acc: 0.9518
Epoch 426/500
221s 442ms/step - loss: 0.1353 - acc: 0.9986 - val_loss: 0.3137 - val_acc: 0.9528
Epoch 427/500
221s 442ms/step - loss: 0.1352 - acc: 0.9983 - val_loss: 0.3135 - val_acc: 0.9522
Epoch 428/500
221s 442ms/step - loss: 0.1349 - acc: 0.9983 - val_loss: 0.3173 - val_acc: 0.9515
Epoch 429/500
221s 441ms/step - loss: 0.1343 - acc: 0.9986 - val_loss: 0.3213 - val_acc: 0.9507
Epoch 430/500
221s 442ms/step - loss: 0.1352 - acc: 0.9982 - val_loss: 0.3126 - val_acc: 0.9522
Epoch 431/500
221s 442ms/step - loss: 0.1349 - acc: 0.9981 - val_loss: 0.3169 - val_acc: 0.9505
Epoch 432/500
221s 441ms/step - loss: 0.1336 - acc: 0.9985 - val_loss: 0.3198 - val_acc: 0.9501
Epoch 433/500
221s 442ms/step - loss: 0.1343 - acc: 0.9982 - val_loss: 0.3190 - val_acc: 0.9510
Epoch 434/500
221s 441ms/step - loss: 0.1337 - acc: 0.9984 - val_loss: 0.3182 - val_acc: 0.9504
Epoch 435/500
221s 442ms/step - loss: 0.1331 - acc: 0.9984 - val_loss: 0.3177 - val_acc: 0.9507
Epoch 436/500
221s 441ms/step - loss: 0.1328 - acc: 0.9986 - val_loss: 0.3173 - val_acc: 0.9512
Epoch 437/500
221s 442ms/step - loss: 0.1331 - acc: 0.9983 - val_loss: 0.3203 - val_acc: 0.9508
Epoch 438/500
221s 442ms/step - loss: 0.1327 - acc: 0.9983 - val_loss: 0.3148 - val_acc: 0.9518
Epoch 439/500
221s 442ms/step - loss: 0.1330 - acc: 0.9983 - val_loss: 0.3128 - val_acc: 0.9517
Epoch 440/500
221s 441ms/step - loss: 0.1329 - acc: 0.9984 - val_loss: 0.3160 - val_acc: 0.9508
Epoch 441/500
221s 442ms/step - loss: 0.1324 - acc: 0.9984 - val_loss: 0.3167 - val_acc: 0.9507
Epoch 442/500
221s 441ms/step - loss: 0.1318 - acc: 0.9986 - val_loss: 0.3176 - val_acc: 0.9513
Epoch 443/500
221s 441ms/step - loss: 0.1317 - acc: 0.9983 - val_loss: 0.3188 - val_acc: 0.9527
Epoch 444/500
221s 442ms/step - loss: 0.1310 - acc: 0.9986 - val_loss: 0.3166 - val_acc: 0.9513
Epoch 445/500
221s 441ms/step - loss: 0.1315 - acc: 0.9982 - val_loss: 0.3168 - val_acc: 0.9513
Epoch 446/500
221s 441ms/step - loss: 0.1311 - acc: 0.9984 - val_loss: 0.3179 - val_acc: 0.9506
Epoch 447/500
221s 442ms/step - loss: 0.1313 - acc: 0.9984 - val_loss: 0.3192 - val_acc: 0.9504
Epoch 448/500
221s 441ms/step - loss: 0.1306 - acc: 0.9985 - val_loss: 0.3191 - val_acc: 0.9512
Epoch 449/500
221s 442ms/step - loss: 0.1302 - acc: 0.9987 - val_loss: 0.3182 - val_acc: 0.9511
Epoch 450/500
221s 442ms/step - loss: 0.1303 - acc: 0.9984 - val_loss: 0.3147 - val_acc: 0.9518
Epoch 451/500
lr changed to 9.999999310821295e-05
221s 441ms/step - loss: 0.1303 - acc: 0.9983 - val_loss: 0.3143 - val_acc: 0.9514
Epoch 452/500
221s 441ms/step - loss: 0.1305 - acc: 0.9984 - val_loss: 0.3135 - val_acc: 0.9513
Epoch 453/500
221s 442ms/step - loss: 0.1298 - acc: 0.9986 - val_loss: 0.3131 - val_acc: 0.9516
Epoch 454/500
221s 442ms/step - loss: 0.1301 - acc: 0.9984 - val_loss: 0.3129 - val_acc: 0.9519
Epoch 455/500
221s 442ms/step - loss: 0.1293 - acc: 0.9986 - val_loss: 0.3130 - val_acc: 0.9518
Epoch 456/500
221s 442ms/step - loss: 0.1295 - acc: 0.9988 - val_loss: 0.3125 - val_acc: 0.9519
Epoch 457/500
221s 441ms/step - loss: 0.1295 - acc: 0.9986 - val_loss: 0.3125 - val_acc: 0.9525
Epoch 458/500
221s 442ms/step - loss: 0.1289 - acc: 0.9989 - val_loss: 0.3124 - val_acc: 0.9525
Epoch 459/500
221s 441ms/step - loss: 0.1294 - acc: 0.9988 - val_loss: 0.3126 - val_acc: 0.9526
Epoch 460/500
221s 441ms/step - loss: 0.1296 - acc: 0.9987 - val_loss: 0.3129 - val_acc: 0.9524
Epoch 461/500
221s 442ms/step - loss: 0.1301 - acc: 0.9985 - val_loss: 0.3133 - val_acc: 0.9526
Epoch 462/500
221s 441ms/step - loss: 0.1293 - acc: 0.9987 - val_loss: 0.3133 - val_acc: 0.9526
Epoch 463/500
221s 442ms/step - loss: 0.1290 - acc: 0.9988 - val_loss: 0.3130 - val_acc: 0.9527
Epoch 464/500
221s 441ms/step - loss: 0.1298 - acc: 0.9984 - val_loss: 0.3126 - val_acc: 0.9530
Epoch 465/500
221s 441ms/step - loss: 0.1290 - acc: 0.9986 - val_loss: 0.3122 - val_acc: 0.9525
Epoch 466/500
221s 442ms/step - loss: 0.1292 - acc: 0.9986 - val_loss: 0.3121 - val_acc: 0.9526
Epoch 467/500
221s 441ms/step - loss: 0.1286 - acc: 0.9989 - val_loss: 0.3122 - val_acc: 0.9524
Epoch 468/500
221s 442ms/step - loss: 0.1288 - acc: 0.9989 - val_loss: 0.3123 - val_acc: 0.9526
Epoch 469/500
221s 441ms/step - loss: 0.1284 - acc: 0.9989 - val_loss: 0.3130 - val_acc: 0.9522
Epoch 470/500
223s 445ms/step - loss: 0.1284 - acc: 0.9989 - val_loss: 0.3136 - val_acc: 0.9522
Epoch 471/500
221s 442ms/step - loss: 0.1282 - acc: 0.9990 - val_loss: 0.3138 - val_acc: 0.9517
Epoch 472/500
221s 442ms/step - loss: 0.1291 - acc: 0.9988 - val_loss: 0.3133 - val_acc: 0.9523
Epoch 473/500
221s 441ms/step - loss: 0.1296 - acc: 0.9984 - val_loss: 0.3130 - val_acc: 0.9524
Epoch 474/500
221s 441ms/step - loss: 0.1284 - acc: 0.9988 - val_loss: 0.3128 - val_acc: 0.9527
Epoch 475/500
221s 441ms/step - loss: 0.1283 - acc: 0.9989 - val_loss: 0.3126 - val_acc: 0.9523
Epoch 476/500
221s 442ms/step - loss: 0.1290 - acc: 0.9987 - val_loss: 0.3125 - val_acc: 0.9524
Epoch 477/500
221s 442ms/step - loss: 0.1287 - acc: 0.9988 - val_loss: 0.3121 - val_acc: 0.9521
Epoch 478/500
221s 442ms/step - loss: 0.1291 - acc: 0.9986 - val_loss: 0.3123 - val_acc: 0.9521
Epoch 479/500
221s 442ms/step - loss: 0.1292 - acc: 0.9986 - val_loss: 0.3124 - val_acc: 0.9522
Epoch 480/500
221s 442ms/step - loss: 0.1291 - acc: 0.9986 - val_loss: 0.3123 - val_acc: 0.9519
Epoch 481/500
221s 442ms/step - loss: 0.1282 - acc: 0.9989 - val_loss: 0.3125 - val_acc: 0.9521
Epoch 482/500
221s 442ms/step - loss: 0.1291 - acc: 0.9988 - val_loss: 0.3125 - val_acc: 0.9522
Epoch 483/500
221s 442ms/step - loss: 0.1286 - acc: 0.9988 - val_loss: 0.3125 - val_acc: 0.9516
Epoch 484/500
220s 441ms/step - loss: 0.1280 - acc: 0.9991 - val_loss: 0.3123 - val_acc: 0.9518
Epoch 485/500
220s 441ms/step - loss: 0.1281 - acc: 0.9989 - val_loss: 0.3128 - val_acc: 0.9519
Epoch 486/500
221s 441ms/step - loss: 0.1281 - acc: 0.9990 - val_loss: 0.3127 - val_acc: 0.9520
Epoch 487/500
221s 441ms/step - loss: 0.1282 - acc: 0.9990 - val_loss: 0.3127 - val_acc: 0.9520
Epoch 488/500
221s 441ms/step - loss: 0.1283 - acc: 0.9988 - val_loss: 0.3129 - val_acc: 0.9520
Epoch 489/500
221s 442ms/step - loss: 0.1282 - acc: 0.9988 - val_loss: 0.3131 - val_acc: 0.9521
Epoch 490/500
221s 441ms/step - loss: 0.1283 - acc: 0.9987 - val_loss: 0.3131 - val_acc: 0.9522
Epoch 491/500
221s 441ms/step - loss: 0.1280 - acc: 0.9990 - val_loss: 0.3133 - val_acc: 0.9526
Epoch 492/500
221s 442ms/step - loss: 0.1281 - acc: 0.9989 - val_loss: 0.3132 - val_acc: 0.9524
Epoch 493/500
221s 441ms/step - loss: 0.1282 - acc: 0.9988 - val_loss: 0.3126 - val_acc: 0.9527
Epoch 494/500
221s 441ms/step - loss: 0.1281 - acc: 0.9989 - val_loss: 0.3125 - val_acc: 0.9522
Epoch 495/500
221s 441ms/step - loss: 0.1278 - acc: 0.9991 - val_loss: 0.3118 - val_acc: 0.9524
Epoch 496/500
221s 441ms/step - loss: 0.1280 - acc: 0.9990 - val_loss: 0.3118 - val_acc: 0.9522
Epoch 497/500
221s 442ms/step - loss: 0.1279 - acc: 0.9988 - val_loss: 0.3118 - val_acc: 0.9527
Epoch 498/500
222s 443ms/step - loss: 0.1275 - acc: 0.9991 - val_loss: 0.3113 - val_acc: 0.9523
Epoch 499/500
221s 443ms/step - loss: 0.1275 - acc: 0.9990 - val_loss: 0.3115 - val_acc: 0.9524
Epoch 500/500
221s 443ms/step - loss: 0.1278 - acc: 0.9989 - val_loss: 0.3119 - val_acc: 0.9525
Train loss: 0.12599779877066614
Train accuracy: 0.9992800006866455
Test loss: 0.3119203564524651
Test accuracy: 0.9525000017881393

Compared with the 95.12% achieved in Tuning Log 21, the test accuracy improved by only 0.13%.

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458, Date of Publication: 13 February 2020

https://ieeexplore.ieee.org/document/8998530
