[Harbin Institute of Technology] Dynamic ReLU: Adaptively Parametric ReLU with Keras Code (Tuning Record 19), CIFAR-10 ~93.96%


This article introduces a Dynamic ReLU activation function proposed by a team at Harbin Institute of Technology, the Adaptively Parametric ReLU (APReLU). It was originally applied to fault diagnosis based on one-dimensional vibration signals, and it lets each sample learn its own ReLU parameters. The paper was submitted to IEEE Transactions on Industrial Electronics on 3 May 2019, accepted on 24 January 2020, and published on the IEEE website on 13 February 2020.

Since tuning record 18 still showed overfitting, this post reduces the number of neurons in the last layer of the APReLU activation function to 1 (as sketched below) and continues to test the deep residual network with APReLU on the CIFAR-10 dataset.
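The change itself is one line in the scaling branch of aprelu(). A minimal sketch of just that tail, assuming (not confirmed here) that tuning record 18 produced one coefficient per channel with Dense(channels), while this record produces a single shared coefficient with Dense(1) as in the full code below (batch normalization omitted for brevity; the function names are hypothetical):

from keras.layers import Dense, Activation, Reshape

def scaling_tail_record18(scales, channels):
    # assumed earlier setting: one sigmoid-gated coefficient per channel, per sample
    scales = Dense(channels, activation='linear')(scales)
    scales = Activation('sigmoid')(scales)
    return Reshape((1, 1, channels))(scales)

def scaling_tail_record19(scales):
    # this record: a single coefficient per sample, broadcast over all channels
    scales = Dense(1, activation='linear')(scales)
    scales = Activation('sigmoid')(scales)
    return Reshape((1, 1, 1))(scales)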

At the same time, the number of training epochs is reduced from the 5000 used in tuning record 18 to 500, because 5000 epochs simply take too long: the run needs roughly four days to finish.

The basic principle of the APReLU activation function is as follows:

(Figure: the Adaptively Parametric ReLU, a type of Dynamic ReLU activation function)
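The original post shows the principle as a figure. As a compact text substitute (notation mine, matching the aprelu() function below): for an input feature map x, the APReLU computes

y = max(x, 0) + α · min(x, 0)

where the per-sample coefficient α in (0, 1) is produced by a small sub-network: the negative and positive parts of x are each globally average-pooled, concatenated, passed through a Dense layer, batch normalization and a ReLU, then another Dense layer, batch normalization and a sigmoid. In this record the final Dense layer has a single neuron, so α is one scalar per sample shared across all channels.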

The Keras code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate, multiplied by 0.1 every 150 epochs
def scheduler(epoch):
    if epoch % 150 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs,zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels//16, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('relu')(scales)
    scales = Dense(1, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,1))(scales)
    # apply the parametric relu (scale the negative part)
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero_padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 32, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 8, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(net)
net = aprelu(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range (degrees, 0 to 30)
    rotation_range=30,
    # Range for random zoom
    zoom_range = 0.2,
    # shear angle in counter-clockwise direction in degrees
    shear_range = 30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=500, 
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])

The experimental results are as follows:

Using TensorFlow backend.
x_train shape: (50000, 32, 32, 3)
50000 train samples
10000 test samples
Epoch 1/500
107s 215ms/step - loss: 2.3702 - acc: 0.3922 - val_loss: 1.9601 - val_acc: 0.5235
Epoch 2/500
77s 154ms/step - loss: 1.9532 - acc: 0.5157 - val_loss: 1.6734 - val_acc: 0.5998
Epoch 3/500
77s 154ms/step - loss: 1.6989 - acc: 0.5797 - val_loss: 1.4728 - val_acc: 0.6495
Epoch 4/500
77s 154ms/step - loss: 1.5366 - acc: 0.6184 - val_loss: 1.3253 - val_acc: 0.6888
Epoch 5/500
77s 154ms/step - loss: 1.4110 - acc: 0.6444 - val_loss: 1.2022 - val_acc: 0.7197
Epoch 6/500
77s 154ms/step - loss: 1.3059 - acc: 0.6707 - val_loss: 1.1398 - val_acc: 0.7236
Epoch 7/500
77s 154ms/step - loss: 1.2295 - acc: 0.6873 - val_loss: 1.0509 - val_acc: 0.7515
Epoch 8/500
77s 154ms/step - loss: 1.1568 - acc: 0.7041 - val_loss: 0.9907 - val_acc: 0.7686
Epoch 9/500
77s 154ms/step - loss: 1.1016 - acc: 0.7207 - val_loss: 0.9470 - val_acc: 0.7863
Epoch 10/500
77s 154ms/step - loss: 1.0521 - acc: 0.7346 - val_loss: 0.9005 - val_acc: 0.7911
Epoch 11/500
77s 154ms/step - loss: 1.0246 - acc: 0.7423 - val_loss: 0.8991 - val_acc: 0.7881
Epoch 12/500
77s 154ms/step - loss: 0.9941 - acc: 0.7506 - val_loss: 0.8390 - val_acc: 0.8093
Epoch 13/500
77s 154ms/step - loss: 0.9642 - acc: 0.7602 - val_loss: 0.8239 - val_acc: 0.8147
Epoch 14/500
77s 154ms/step - loss: 0.9465 - acc: 0.7652 - val_loss: 0.8057 - val_acc: 0.8170
Epoch 15/500
77s 154ms/step - loss: 0.9296 - acc: 0.7701 - val_loss: 0.8180 - val_acc: 0.8114
Epoch 16/500
77s 154ms/step - loss: 0.9103 - acc: 0.7767 - val_loss: 0.7975 - val_acc: 0.8207
Epoch 17/500
77s 154ms/step - loss: 0.9027 - acc: 0.7801 - val_loss: 0.8048 - val_acc: 0.8186
Epoch 18/500
77s 154ms/step - loss: 0.8904 - acc: 0.7848 - val_loss: 0.7542 - val_acc: 0.8376
Epoch 19/500
77s 154ms/step - loss: 0.8765 - acc: 0.7889 - val_loss: 0.7633 - val_acc: 0.8313
Epoch 20/500
77s 154ms/step - loss: 0.8739 - acc: 0.7913 - val_loss: 0.7411 - val_acc: 0.8432
Epoch 21/500
77s 154ms/step - loss: 0.8587 - acc: 0.7976 - val_loss: 0.7357 - val_acc: 0.8466
Epoch 22/500
77s 154ms/step - loss: 0.8505 - acc: 0.7982 - val_loss: 0.7369 - val_acc: 0.8437
Epoch 23/500
77s 154ms/step - loss: 0.8495 - acc: 0.8014 - val_loss: 0.7507 - val_acc: 0.8415
Epoch 24/500
77s 154ms/step - loss: 0.8382 - acc: 0.8070 - val_loss: 0.7494 - val_acc: 0.8423
Epoch 25/500
77s 154ms/step - loss: 0.8339 - acc: 0.8097 - val_loss: 0.7374 - val_acc: 0.8441
Epoch 26/500
77s 154ms/step - loss: 0.8284 - acc: 0.8105 - val_loss: 0.7195 - val_acc: 0.8517
Epoch 27/500
77s 154ms/step - loss: 0.8244 - acc: 0.8139 - val_loss: 0.7054 - val_acc: 0.8611
Epoch 28/500
77s 154ms/step - loss: 0.8242 - acc: 0.8143 - val_loss: 0.6997 - val_acc: 0.8614
Epoch 29/500
77s 154ms/step - loss: 0.8145 - acc: 0.8186 - val_loss: 0.6966 - val_acc: 0.8598
Epoch 30/500
77s 154ms/step - loss: 0.8092 - acc: 0.8197 - val_loss: 0.7344 - val_acc: 0.8498
Epoch 31/500
77s 154ms/step - loss: 0.8048 - acc: 0.8219 - val_loss: 0.7232 - val_acc: 0.8574
Epoch 32/500
77s 154ms/step - loss: 0.8054 - acc: 0.8244 - val_loss: 0.6888 - val_acc: 0.8652
Epoch 33/500
77s 154ms/step - loss: 0.8000 - acc: 0.8231 - val_loss: 0.7236 - val_acc: 0.8533
Epoch 34/500
77s 154ms/step - loss: 0.7994 - acc: 0.8258 - val_loss: 0.7096 - val_acc: 0.8584
Epoch 35/500
77s 154ms/step - loss: 0.7933 - acc: 0.8291 - val_loss: 0.7063 - val_acc: 0.8602
Epoch 36/500
77s 154ms/step - loss: 0.7955 - acc: 0.8275 - val_loss: 0.7124 - val_acc: 0.8599
Epoch 37/500
77s 154ms/step - loss: 0.7961 - acc: 0.8280 - val_loss: 0.7020 - val_acc: 0.8650
Epoch 38/500
77s 154ms/step - loss: 0.7864 - acc: 0.8332 - val_loss: 0.7201 - val_acc: 0.8573
Epoch 39/500
77s 154ms/step - loss: 0.7949 - acc: 0.8303 - val_loss: 0.7009 - val_acc: 0.8648
Epoch 40/500
77s 154ms/step - loss: 0.7781 - acc: 0.8349 - val_loss: 0.6954 - val_acc: 0.8636
Epoch 41/500
77s 154ms/step - loss: 0.7821 - acc: 0.8352 - val_loss: 0.6819 - val_acc: 0.8736
Epoch 42/500
77s 154ms/step - loss: 0.7805 - acc: 0.8345 - val_loss: 0.7347 - val_acc: 0.8550
Epoch 43/500
77s 154ms/step - loss: 0.7749 - acc: 0.8384 - val_loss: 0.7029 - val_acc: 0.8642
Epoch 44/500
77s 154ms/step - loss: 0.7777 - acc: 0.8368 - val_loss: 0.6967 - val_acc: 0.8676
Epoch 45/500
77s 154ms/step - loss: 0.7725 - acc: 0.8393 - val_loss: 0.6867 - val_acc: 0.8722
Epoch 46/500
77s 154ms/step - loss: 0.7737 - acc: 0.8408 - val_loss: 0.7075 - val_acc: 0.8644
Epoch 47/500
77s 154ms/step - loss: 0.7734 - acc: 0.8395 - val_loss: 0.6958 - val_acc: 0.8667
Epoch 48/500
77s 154ms/step - loss: 0.7750 - acc: 0.8404 - val_loss: 0.6956 - val_acc: 0.8701
Epoch 49/500
77s 154ms/step - loss: 0.7691 - acc: 0.8417 - val_loss: 0.6977 - val_acc: 0.8677
Epoch 50/500
77s 154ms/step - loss: 0.7661 - acc: 0.8433 - val_loss: 0.7094 - val_acc: 0.8683
Epoch 51/500
77s 154ms/step - loss: 0.7638 - acc: 0.8469 - val_loss: 0.6972 - val_acc: 0.8678
Epoch 52/500
77s 154ms/step - loss: 0.7613 - acc: 0.8455 - val_loss: 0.7113 - val_acc: 0.8676
Epoch 53/500
77s 154ms/step - loss: 0.7647 - acc: 0.8460 - val_loss: 0.6946 - val_acc: 0.8692
Epoch 54/500
77s 154ms/step - loss: 0.7572 - acc: 0.8468 - val_loss: 0.7242 - val_acc: 0.8628
Epoch 55/500
77s 154ms/step - loss: 0.7560 - acc: 0.8504 - val_loss: 0.7084 - val_acc: 0.8671
Epoch 56/500
77s 154ms/step - loss: 0.7578 - acc: 0.8473 - val_loss: 0.6979 - val_acc: 0.8724
Epoch 57/500
77s 154ms/step - loss: 0.7635 - acc: 0.8468 - val_loss: 0.6928 - val_acc: 0.8722
Epoch 58/500
77s 154ms/step - loss: 0.7563 - acc: 0.8489 - val_loss: 0.6907 - val_acc: 0.8736
Epoch 59/500
77s 154ms/step - loss: 0.7578 - acc: 0.8495 - val_loss: 0.6854 - val_acc: 0.8757
Epoch 60/500
77s 154ms/step - loss: 0.7565 - acc: 0.8482 - val_loss: 0.6837 - val_acc: 0.8743
Epoch 61/500
77s 154ms/step - loss: 0.7570 - acc: 0.8499 - val_loss: 0.6821 - val_acc: 0.8742
Epoch 62/500
77s 154ms/step - loss: 0.7595 - acc: 0.8484 - val_loss: 0.6889 - val_acc: 0.8722
Epoch 63/500
77s 154ms/step - loss: 0.7536 - acc: 0.8512 - val_loss: 0.6748 - val_acc: 0.8800
Epoch 64/500
77s 154ms/step - loss: 0.7539 - acc: 0.8514 - val_loss: 0.6508 - val_acc: 0.8901
Epoch 65/500
77s 154ms/step - loss: 0.7483 - acc: 0.8535 - val_loss: 0.6852 - val_acc: 0.8777
Epoch 66/500
77s 154ms/step - loss: 0.7496 - acc: 0.8535 - val_loss: 0.6940 - val_acc: 0.8756
Epoch 67/500
77s 154ms/step - loss: 0.7568 - acc: 0.8505 - val_loss: 0.6830 - val_acc: 0.8805
Epoch 68/500
77s 154ms/step - loss: 0.7549 - acc: 0.8508 - val_loss: 0.6732 - val_acc: 0.8840
Epoch 69/500
77s 154ms/step - loss: 0.7479 - acc: 0.8549 - val_loss: 0.6955 - val_acc: 0.8744
Epoch 70/500
77s 154ms/step - loss: 0.7468 - acc: 0.8551 - val_loss: 0.6964 - val_acc: 0.8746
Epoch 71/500
77s 154ms/step - loss: 0.7499 - acc: 0.8553 - val_loss: 0.6850 - val_acc: 0.8784
Epoch 72/500
77s 154ms/step - loss: 0.7462 - acc: 0.8553 - val_loss: 0.6937 - val_acc: 0.8771
Epoch 73/500
77s 154ms/step - loss: 0.7467 - acc: 0.8559 - val_loss: 0.6876 - val_acc: 0.8761
Epoch 74/500
77s 154ms/step - loss: 0.7467 - acc: 0.8559 - val_loss: 0.7029 - val_acc: 0.8715
Epoch 75/500
77s 154ms/step - loss: 0.7435 - acc: 0.8561 - val_loss: 0.7184 - val_acc: 0.8663
Epoch 76/500
77s 154ms/step - loss: 0.7467 - acc: 0.8558 - val_loss: 0.6751 - val_acc: 0.8808
Epoch 77/500
77s 154ms/step - loss: 0.7398 - acc: 0.8575 - val_loss: 0.6843 - val_acc: 0.8812
Epoch 78/500
77s 154ms/step - loss: 0.7463 - acc: 0.8571 - val_loss: 0.6802 - val_acc: 0.8800
Epoch 79/500
77s 154ms/step - loss: 0.7395 - acc: 0.8568 - val_loss: 0.6877 - val_acc: 0.8769
Epoch 80/500
77s 154ms/step - loss: 0.7403 - acc: 0.8580 - val_loss: 0.6912 - val_acc: 0.8792
Epoch 81/500
77s 154ms/step - loss: 0.7429 - acc: 0.8555 - val_loss: 0.6887 - val_acc: 0.8787
Epoch 82/500
77s 154ms/step - loss: 0.7408 - acc: 0.8572 - val_loss: 0.7134 - val_acc: 0.8709
Epoch 83/500
77s 154ms/step - loss: 0.7413 - acc: 0.8573 - val_loss: 0.6921 - val_acc: 0.8776
Epoch 84/500
77s 154ms/step - loss: 0.7393 - acc: 0.8588 - val_loss: 0.6965 - val_acc: 0.8737
Epoch 85/500
77s 154ms/step - loss: 0.7440 - acc: 0.8568 - val_loss: 0.6806 - val_acc: 0.8803
Epoch 86/500
77s 154ms/step - loss: 0.7407 - acc: 0.8589 - val_loss: 0.6658 - val_acc: 0.8871
Epoch 87/500
77s 154ms/step - loss: 0.7366 - acc: 0.8587 - val_loss: 0.6804 - val_acc: 0.8812
Epoch 88/500
77s 154ms/step - loss: 0.7406 - acc: 0.8582 - val_loss: 0.6686 - val_acc: 0.8869
Epoch 89/500
77s 154ms/step - loss: 0.7345 - acc: 0.8611 - val_loss: 0.6744 - val_acc: 0.8836
Epoch 90/500
77s 154ms/step - loss: 0.7318 - acc: 0.8614 - val_loss: 0.6715 - val_acc: 0.8852
Epoch 91/500
77s 154ms/step - loss: 0.7376 - acc: 0.8600 - val_loss: 0.6939 - val_acc: 0.8737
Epoch 92/500
77s 154ms/step - loss: 0.7420 - acc: 0.8586 - val_loss: 0.6890 - val_acc: 0.8763
Epoch 93/500
77s 154ms/step - loss: 0.7315 - acc: 0.8631 - val_loss: 0.6761 - val_acc: 0.8821
Epoch 94/500
77s 154ms/step - loss: 0.7341 - acc: 0.8610 - val_loss: 0.6902 - val_acc: 0.8801
Epoch 95/500
77s 154ms/step - loss: 0.7370 - acc: 0.8604 - val_loss: 0.6938 - val_acc: 0.8742
Epoch 96/500
77s 154ms/step - loss: 0.7345 - acc: 0.8619 - val_loss: 0.6785 - val_acc: 0.8803
Epoch 97/500
77s 154ms/step - loss: 0.7356 - acc: 0.8598 - val_loss: 0.6974 - val_acc: 0.8753
Epoch 98/500
77s 154ms/step - loss: 0.7340 - acc: 0.8622 - val_loss: 0.6847 - val_acc: 0.8821
Epoch 99/500
77s 154ms/step - loss: 0.7321 - acc: 0.8632 - val_loss: 0.6772 - val_acc: 0.8883
Epoch 100/500
77s 154ms/step - loss: 0.7301 - acc: 0.8650 - val_loss: 0.6659 - val_acc: 0.8881
Epoch 101/500
77s 154ms/step - loss: 0.7364 - acc: 0.8625 - val_loss: 0.7062 - val_acc: 0.8735
Epoch 102/500
77s 154ms/step - loss: 0.7360 - acc: 0.8613 - val_loss: 0.6749 - val_acc: 0.8819
Epoch 103/500
77s 154ms/step - loss: 0.7305 - acc: 0.8628 - val_loss: 0.6853 - val_acc: 0.8840
Epoch 104/500
77s 154ms/step - loss: 0.7333 - acc: 0.8638 - val_loss: 0.6813 - val_acc: 0.8800
Epoch 105/500
77s 154ms/step - loss: 0.7308 - acc: 0.8631 - val_loss: 0.6599 - val_acc: 0.8892
Epoch 106/500
77s 154ms/step - loss: 0.7355 - acc: 0.8643 - val_loss: 0.6833 - val_acc: 0.8816
Epoch 107/500
77s 154ms/step - loss: 0.7286 - acc: 0.8654 - val_loss: 0.6744 - val_acc: 0.8830
Epoch 108/500
77s 154ms/step - loss: 0.7278 - acc: 0.8653 - val_loss: 0.6870 - val_acc: 0.8807
Epoch 109/500
77s 154ms/step - loss: 0.7270 - acc: 0.8652 - val_loss: 0.6901 - val_acc: 0.8821
Epoch 110/500
77s 154ms/step - loss: 0.7260 - acc: 0.8646 - val_loss: 0.6908 - val_acc: 0.8820
Epoch 111/500
77s 154ms/step - loss: 0.7290 - acc: 0.8645 - val_loss: 0.6973 - val_acc: 0.8755
Epoch 112/500
77s 154ms/step - loss: 0.7336 - acc: 0.8615 - val_loss: 0.6845 - val_acc: 0.8812
Epoch 113/500
77s 154ms/step - loss: 0.7296 - acc: 0.8635 - val_loss: 0.6835 - val_acc: 0.8811
Epoch 114/500
77s 154ms/step - loss: 0.7310 - acc: 0.8647 - val_loss: 0.6822 - val_acc: 0.8820
Epoch 115/500
77s 154ms/step - loss: 0.7251 - acc: 0.8660 - val_loss: 0.6822 - val_acc: 0.8803
Epoch 116/500
77s 154ms/step - loss: 0.7313 - acc: 0.8633 - val_loss: 0.6572 - val_acc: 0.8908
Epoch 117/500
77s 154ms/step - loss: 0.7289 - acc: 0.8636 - val_loss: 0.6956 - val_acc: 0.8817
Epoch 118/500
77s 154ms/step - loss: 0.7233 - acc: 0.8670 - val_loss: 0.7052 - val_acc: 0.8738
Epoch 119/500
77s 154ms/step - loss: 0.7243 - acc: 0.8667 - val_loss: 0.6675 - val_acc: 0.8891
Epoch 120/500
77s 154ms/step - loss: 0.7269 - acc: 0.8658 - val_loss: 0.6815 - val_acc: 0.8834
Epoch 121/500
77s 154ms/step - loss: 0.7248 - acc: 0.8656 - val_loss: 0.6670 - val_acc: 0.8878
Epoch 122/500
77s 154ms/step - loss: 0.7223 - acc: 0.8690 - val_loss: 0.6658 - val_acc: 0.8892
Epoch 123/500
77s 154ms/step - loss: 0.7248 - acc: 0.8675 - val_loss: 0.6889 - val_acc: 0.8798
Epoch 124/500
77s 154ms/step - loss: 0.7209 - acc: 0.8675 - val_loss: 0.6703 - val_acc: 0.8857
Epoch 125/500
77s 154ms/step - loss: 0.7276 - acc: 0.8668 - val_loss: 0.6875 - val_acc: 0.8791
Epoch 126/500
77s 154ms/step - loss: 0.7251 - acc: 0.8659 - val_loss: 0.6836 - val_acc: 0.8829
Epoch 127/500
77s 154ms/step - loss: 0.7280 - acc: 0.8668 - val_loss: 0.6832 - val_acc: 0.8836
Epoch 128/500
77s 154ms/step - loss: 0.7242 - acc: 0.8672 - val_loss: 0.6848 - val_acc: 0.8847
Epoch 129/500
77s 154ms/step - loss: 0.7267 - acc: 0.8663 - val_loss: 0.6778 - val_acc: 0.8852
Epoch 130/500
77s 154ms/step - loss: 0.7289 - acc: 0.8648 - val_loss: 0.6786 - val_acc: 0.8837
Epoch 131/500
77s 154ms/step - loss: 0.7219 - acc: 0.8685 - val_loss: 0.6562 - val_acc: 0.8899
Epoch 132/500
77s 154ms/step - loss: 0.7186 - acc: 0.8678 - val_loss: 0.6765 - val_acc: 0.8854
Epoch 133/500
77s 154ms/step - loss: 0.7199 - acc: 0.8688 - val_loss: 0.6697 - val_acc: 0.8887
Epoch 134/500
77s 154ms/step - loss: 0.7163 - acc: 0.8687 - val_loss: 0.6692 - val_acc: 0.8881
Epoch 135/500
77s 154ms/step - loss: 0.7208 - acc: 0.8671 - val_loss: 0.6777 - val_acc: 0.8818
Epoch 136/500
77s 154ms/step - loss: 0.7257 - acc: 0.8666 - val_loss: 0.6726 - val_acc: 0.8896
Epoch 137/500
77s 154ms/step - loss: 0.7224 - acc: 0.8658 - val_loss: 0.7068 - val_acc: 0.8746
Epoch 138/500
77s 154ms/step - loss: 0.7202 - acc: 0.8686 - val_loss: 0.6746 - val_acc: 0.8850
Epoch 139/500
77s 154ms/step - loss: 0.7253 - acc: 0.8672 - val_loss: 0.6856 - val_acc: 0.8843
Epoch 140/500
77s 154ms/step - loss: 0.7216 - acc: 0.8681 - val_loss: 0.6837 - val_acc: 0.8835
Epoch 141/500
77s 154ms/step - loss: 0.7251 - acc: 0.8686 - val_loss: 0.6652 - val_acc: 0.8893
Epoch 142/500
77s 154ms/step - loss: 0.7200 - acc: 0.8697 - val_loss: 0.6572 - val_acc: 0.8915
Epoch 143/500
77s 154ms/step - loss: 0.7208 - acc: 0.8682 - val_loss: 0.6792 - val_acc: 0.8858
Epoch 144/500
77s 154ms/step - loss: 0.7231 - acc: 0.8691 - val_loss: 0.6885 - val_acc: 0.8835
Epoch 145/500
77s 154ms/step - loss: 0.7191 - acc: 0.8704 - val_loss: 0.6828 - val_acc: 0.8862
Epoch 146/500
77s 153ms/step - loss: 0.7209 - acc: 0.8689 - val_loss: 0.6849 - val_acc: 0.8812
Epoch 147/500
77s 154ms/step - loss: 0.7243 - acc: 0.8688 - val_loss: 0.6824 - val_acc: 0.8838
Epoch 148/500
77s 154ms/step - loss: 0.7194 - acc: 0.8700 - val_loss: 0.6714 - val_acc: 0.8889
Epoch 149/500
77s 154ms/step - loss: 0.7220 - acc: 0.8691 - val_loss: 0.6686 - val_acc: 0.8902
Epoch 150/500
77s 154ms/step - loss: 0.7181 - acc: 0.8700 - val_loss: 0.6723 - val_acc: 0.8851
Epoch 151/500
lr changed to 0.010000000149011612
77s 154ms/step - loss: 0.6046 - acc: 0.9093 - val_loss: 0.5729 - val_acc: 0.9191
Epoch 152/500
77s 154ms/step - loss: 0.5434 - acc: 0.9281 - val_loss: 0.5547 - val_acc: 0.9222
Epoch 153/500
77s 154ms/step - loss: 0.5269 - acc: 0.9317 - val_loss: 0.5470 - val_acc: 0.9232
Epoch 154/500
77s 154ms/step - loss: 0.5083 - acc: 0.9357 - val_loss: 0.5377 - val_acc: 0.9255
Epoch 155/500
77s 154ms/step - loss: 0.4961 - acc: 0.9395 - val_loss: 0.5305 - val_acc: 0.9254
Epoch 156/500
77s 154ms/step - loss: 0.4827 - acc: 0.9411 - val_loss: 0.5238 - val_acc: 0.9269
Epoch 157/500
77s 154ms/step - loss: 0.4718 - acc: 0.9440 - val_loss: 0.5187 - val_acc: 0.9289
Epoch 158/500
77s 154ms/step - loss: 0.4637 - acc: 0.9443 - val_loss: 0.5135 - val_acc: 0.9290
Epoch 159/500
77s 154ms/step - loss: 0.4554 - acc: 0.9453 - val_loss: 0.5119 - val_acc: 0.9291
Epoch 160/500
77s 154ms/step - loss: 0.4475 - acc: 0.9456 - val_loss: 0.5078 - val_acc: 0.9271
Epoch 161/500
77s 154ms/step - loss: 0.4393 - acc: 0.9484 - val_loss: 0.4957 - val_acc: 0.9317
Epoch 162/500
77s 154ms/step - loss: 0.4290 - acc: 0.9491 - val_loss: 0.4937 - val_acc: 0.9283
Epoch 163/500
77s 154ms/step - loss: 0.4224 - acc: 0.9501 - val_loss: 0.4897 - val_acc: 0.9293
Epoch 164/500
77s 154ms/step - loss: 0.4194 - acc: 0.9498 - val_loss: 0.4830 - val_acc: 0.9312
Epoch 165/500
77s 154ms/step - loss: 0.4101 - acc: 0.9529 - val_loss: 0.4823 - val_acc: 0.9309
Epoch 166/500
77s 154ms/step - loss: 0.4087 - acc: 0.9508 - val_loss: 0.4761 - val_acc: 0.9302
Epoch 167/500
77s 154ms/step - loss: 0.3993 - acc: 0.9528 - val_loss: 0.4733 - val_acc: 0.9307
Epoch 168/500
77s 154ms/step - loss: 0.3958 - acc: 0.9528 - val_loss: 0.4612 - val_acc: 0.9310
Epoch 169/500
77s 154ms/step - loss: 0.3904 - acc: 0.9536 - val_loss: 0.4725 - val_acc: 0.9294
Epoch 170/500
77s 154ms/step - loss: 0.3820 - acc: 0.9552 - val_loss: 0.4625 - val_acc: 0.9293
Epoch 171/500
77s 154ms/step - loss: 0.3769 - acc: 0.9553 - val_loss: 0.4596 - val_acc: 0.9292
Epoch 172/500
77s 154ms/step - loss: 0.3732 - acc: 0.9567 - val_loss: 0.4686 - val_acc: 0.9271
Epoch 173/500
77s 154ms/step - loss: 0.3692 - acc: 0.9566 - val_loss: 0.4595 - val_acc: 0.9275
Epoch 174/500
77s 154ms/step - loss: 0.3697 - acc: 0.9547 - val_loss: 0.4510 - val_acc: 0.9305
Epoch 175/500
77s 154ms/step - loss: 0.3592 - acc: 0.9577 - val_loss: 0.4485 - val_acc: 0.9294
Epoch 176/500
77s 154ms/step - loss: 0.3553 - acc: 0.9583 - val_loss: 0.4527 - val_acc: 0.9276
Epoch 177/500
77s 154ms/step - loss: 0.3519 - acc: 0.9588 - val_loss: 0.4501 - val_acc: 0.9269
Epoch 178/500
77s 154ms/step - loss: 0.3508 - acc: 0.9571 - val_loss: 0.4489 - val_acc: 0.9253
Epoch 179/500
77s 154ms/step - loss: 0.3461 - acc: 0.9577 - val_loss: 0.4484 - val_acc: 0.9260
Epoch 180/500
77s 154ms/step - loss: 0.3446 - acc: 0.9583 - val_loss: 0.4392 - val_acc: 0.9274
Epoch 181/500
77s 154ms/step - loss: 0.3375 - acc: 0.9591 - val_loss: 0.4435 - val_acc: 0.9287
Epoch 182/500
77s 154ms/step - loss: 0.3375 - acc: 0.9584 - val_loss: 0.4446 - val_acc: 0.9278
Epoch 183/500
77s 154ms/step - loss: 0.3358 - acc: 0.9586 - val_loss: 0.4434 - val_acc: 0.9268
Epoch 184/500
77s 154ms/step - loss: 0.3294 - acc: 0.9607 - val_loss: 0.4529 - val_acc: 0.9267
Epoch 185/500
77s 154ms/step - loss: 0.3352 - acc: 0.9571 - val_loss: 0.4392 - val_acc: 0.9272
Epoch 186/500
77s 154ms/step - loss: 0.3289 - acc: 0.9587 - val_loss: 0.4367 - val_acc: 0.9276
Epoch 187/500
77s 154ms/step - loss: 0.3267 - acc: 0.9595 - val_loss: 0.4333 - val_acc: 0.9257
Epoch 188/500
77s 154ms/step - loss: 0.3191 - acc: 0.9600 - val_loss: 0.4392 - val_acc: 0.9257
Epoch 189/500
77s 154ms/step - loss: 0.3169 - acc: 0.9608 - val_loss: 0.4366 - val_acc: 0.9261
Epoch 190/500
77s 154ms/step - loss: 0.3180 - acc: 0.9594 - val_loss: 0.4283 - val_acc: 0.9274
Epoch 191/500
77s 154ms/step - loss: 0.3128 - acc: 0.9605 - val_loss: 0.4351 - val_acc: 0.9228
Epoch 192/500
77s 154ms/step - loss: 0.3105 - acc: 0.9610 - val_loss: 0.4294 - val_acc: 0.9255
Epoch 193/500
77s 154ms/step - loss: 0.3096 - acc: 0.9605 - val_loss: 0.4258 - val_acc: 0.9272
Epoch 194/500
77s 154ms/step - loss: 0.3074 - acc: 0.9614 - val_loss: 0.4288 - val_acc: 0.9248
Epoch 195/500
77s 154ms/step - loss: 0.3120 - acc: 0.9587 - val_loss: 0.4296 - val_acc: 0.9237
Epoch 196/500
77s 154ms/step - loss: 0.3029 - acc: 0.9617 - val_loss: 0.4240 - val_acc: 0.9248
Epoch 197/500
77s 154ms/step - loss: 0.3020 - acc: 0.9620 - val_loss: 0.4250 - val_acc: 0.9236
Epoch 198/500
77s 154ms/step - loss: 0.2999 - acc: 0.9615 - val_loss: 0.4228 - val_acc: 0.9248
Epoch 199/500
77s 154ms/step - loss: 0.3062 - acc: 0.9591 - val_loss: 0.4214 - val_acc: 0.9238
Epoch 200/500
77s 154ms/step - loss: 0.2965 - acc: 0.9610 - val_loss: 0.4208 - val_acc: 0.9223
Epoch 201/500
77s 154ms/step - loss: 0.2991 - acc: 0.9598 - val_loss: 0.4235 - val_acc: 0.9234
Epoch 202/500
77s 154ms/step - loss: 0.2970 - acc: 0.9607 - val_loss: 0.4145 - val_acc: 0.9254
Epoch 203/500
77s 154ms/step - loss: 0.2957 - acc: 0.9615 - val_loss: 0.4259 - val_acc: 0.9258
Epoch 204/500
77s 154ms/step - loss: 0.2985 - acc: 0.9593 - val_loss: 0.4215 - val_acc: 0.9255
Epoch 205/500
77s 154ms/step - loss: 0.2997 - acc: 0.9586 - val_loss: 0.4152 - val_acc: 0.9226
Epoch 206/500
77s 154ms/step - loss: 0.2937 - acc: 0.9603 - val_loss: 0.4019 - val_acc: 0.9318
Epoch 207/500
77s 154ms/step - loss: 0.2948 - acc: 0.9596 - val_loss: 0.4118 - val_acc: 0.9257
Epoch 208/500
77s 154ms/step - loss: 0.2952 - acc: 0.9597 - val_loss: 0.4051 - val_acc: 0.9306
Epoch 209/500
77s 154ms/step - loss: 0.2870 - acc: 0.9616 - val_loss: 0.4115 - val_acc: 0.9262
Epoch 210/500
77s 154ms/step - loss: 0.2926 - acc: 0.9596 - val_loss: 0.4055 - val_acc: 0.9272
Epoch 211/500
77s 154ms/step - loss: 0.2872 - acc: 0.9613 - val_loss: 0.4165 - val_acc: 0.9229
Epoch 212/500
77s 154ms/step - loss: 0.2909 - acc: 0.9597 - val_loss: 0.4018 - val_acc: 0.9249
Epoch 213/500
77s 154ms/step - loss: 0.2857 - acc: 0.9614 - val_loss: 0.4119 - val_acc: 0.9219
Epoch 214/500
77s 154ms/step - loss: 0.2858 - acc: 0.9603 - val_loss: 0.4023 - val_acc: 0.9258
Epoch 215/500
77s 154ms/step - loss: 0.2858 - acc: 0.9609 - val_loss: 0.4176 - val_acc: 0.9231
Epoch 216/500
77s 154ms/step - loss: 0.2861 - acc: 0.9601 - val_loss: 0.4137 - val_acc: 0.9246
Epoch 217/500
77s 154ms/step - loss: 0.2869 - acc: 0.9604 - val_loss: 0.4088 - val_acc: 0.9245
Epoch 218/500
77s 154ms/step - loss: 0.2828 - acc: 0.9609 - val_loss: 0.4092 - val_acc: 0.9234
Epoch 219/500
77s 154ms/step - loss: 0.2807 - acc: 0.9616 - val_loss: 0.4026 - val_acc: 0.9278
Epoch 220/500
77s 154ms/step - loss: 0.2810 - acc: 0.9608 - val_loss: 0.4045 - val_acc: 0.9275
Epoch 221/500
77s 154ms/step - loss: 0.2804 - acc: 0.9612 - val_loss: 0.4012 - val_acc: 0.9247
Epoch 222/500
77s 154ms/step - loss: 0.2819 - acc: 0.9588 - val_loss: 0.4046 - val_acc: 0.9219
Epoch 223/500
77s 154ms/step - loss: 0.2805 - acc: 0.9599 - val_loss: 0.4007 - val_acc: 0.9247
Epoch 224/500
77s 154ms/step - loss: 0.2785 - acc: 0.9608 - val_loss: 0.4117 - val_acc: 0.9224
Epoch 225/500
77s 154ms/step - loss: 0.2783 - acc: 0.9610 - val_loss: 0.4073 - val_acc: 0.9204
Epoch 226/500
77s 154ms/step - loss: 0.2830 - acc: 0.9599 - val_loss: 0.4135 - val_acc: 0.9203
Epoch 227/500
77s 154ms/step - loss: 0.2798 - acc: 0.9601 - val_loss: 0.3977 - val_acc: 0.9254
Epoch 228/500
77s 154ms/step - loss: 0.2780 - acc: 0.9602 - val_loss: 0.3916 - val_acc: 0.9254
Epoch 229/500
77s 154ms/step - loss: 0.2812 - acc: 0.9589 - val_loss: 0.4020 - val_acc: 0.9254
Epoch 230/500
77s 154ms/step - loss: 0.2786 - acc: 0.9592 - val_loss: 0.3981 - val_acc: 0.9258
Epoch 231/500
77s 154ms/step - loss: 0.2787 - acc: 0.9603 - val_loss: 0.4021 - val_acc: 0.9221
Epoch 232/500
77s 154ms/step - loss: 0.2775 - acc: 0.9607 - val_loss: 0.3934 - val_acc: 0.9268
Epoch 233/500
77s 154ms/step - loss: 0.2787 - acc: 0.9592 - val_loss: 0.3829 - val_acc: 0.9275
Epoch 234/500
77s 154ms/step - loss: 0.2748 - acc: 0.9609 - val_loss: 0.3967 - val_acc: 0.9274
Epoch 235/500
77s 154ms/step - loss: 0.2781 - acc: 0.9589 - val_loss: 0.3909 - val_acc: 0.9275
Epoch 236/500
77s 154ms/step - loss: 0.2758 - acc: 0.9607 - val_loss: 0.3941 - val_acc: 0.9270
Epoch 237/500
77s 154ms/step - loss: 0.2767 - acc: 0.9600 - val_loss: 0.4121 - val_acc: 0.9195
Epoch 238/500
77s 154ms/step - loss: 0.2754 - acc: 0.9608 - val_loss: 0.3978 - val_acc: 0.9221
Epoch 239/500
77s 154ms/step - loss: 0.2722 - acc: 0.9616 - val_loss: 0.4039 - val_acc: 0.9238
Epoch 240/500
77s 154ms/step - loss: 0.2685 - acc: 0.9618 - val_loss: 0.3889 - val_acc: 0.9277
Epoch 241/500
77s 154ms/step - loss: 0.2751 - acc: 0.9596 - val_loss: 0.3960 - val_acc: 0.9274
Epoch 242/500
77s 154ms/step - loss: 0.2686 - acc: 0.9619 - val_loss: 0.3881 - val_acc: 0.9288
Epoch 243/500
77s 154ms/step - loss: 0.2744 - acc: 0.9600 - val_loss: 0.3929 - val_acc: 0.9235
Epoch 244/500
77s 154ms/step - loss: 0.2741 - acc: 0.9596 - val_loss: 0.3775 - val_acc: 0.9274
Epoch 245/500
77s 154ms/step - loss: 0.2694 - acc: 0.9619 - val_loss: 0.4006 - val_acc: 0.9213
Epoch 246/500
77s 154ms/step - loss: 0.2760 - acc: 0.9586 - val_loss: 0.3956 - val_acc: 0.9236
Epoch 247/500
77s 154ms/step - loss: 0.2694 - acc: 0.9612 - val_loss: 0.3956 - val_acc: 0.9232
Epoch 248/500
77s 154ms/step - loss: 0.2714 - acc: 0.9603 - val_loss: 0.3947 - val_acc: 0.9253
Epoch 249/500
77s 154ms/step - loss: 0.2718 - acc: 0.9608 - val_loss: 0.4027 - val_acc: 0.9232
Epoch 250/500
77s 154ms/step - loss: 0.2665 - acc: 0.9624 - val_loss: 0.3955 - val_acc: 0.9243
Epoch 251/500
77s 154ms/step - loss: 0.2666 - acc: 0.9632 - val_loss: 0.4009 - val_acc: 0.9219
Epoch 252/500
77s 154ms/step - loss: 0.2733 - acc: 0.9592 - val_loss: 0.4097 - val_acc: 0.9204
Epoch 253/500
77s 154ms/step - loss: 0.2702 - acc: 0.9602 - val_loss: 0.3962 - val_acc: 0.9213
Epoch 254/500
77s 154ms/step - loss: 0.2718 - acc: 0.9603 - val_loss: 0.3998 - val_acc: 0.9235
Epoch 255/500
77s 155ms/step - loss: 0.2679 - acc: 0.9613 - val_loss: 0.4113 - val_acc: 0.9217
Epoch 256/500
77s 155ms/step - loss: 0.2702 - acc: 0.9605 - val_loss: 0.3947 - val_acc: 0.9203
Epoch 257/500
77s 155ms/step - loss: 0.2728 - acc: 0.9593 - val_loss: 0.4031 - val_acc: 0.9234
Epoch 258/500
77s 154ms/step - loss: 0.2719 - acc: 0.9593 - val_loss: 0.3979 - val_acc: 0.9250
Epoch 259/500
77s 154ms/step - loss: 0.2683 - acc: 0.9620 - val_loss: 0.3881 - val_acc: 0.9264
Epoch 260/500
77s 154ms/step - loss: 0.2730 - acc: 0.9599 - val_loss: 0.3837 - val_acc: 0.9264
Epoch 261/500
77s 154ms/step - loss: 0.2681 - acc: 0.9614 - val_loss: 0.3945 - val_acc: 0.9251
Epoch 262/500
77s 154ms/step - loss: 0.2722 - acc: 0.9595 - val_loss: 0.3893 - val_acc: 0.9248
Epoch 263/500
77s 154ms/step - loss: 0.2695 - acc: 0.9613 - val_loss: 0.3948 - val_acc: 0.9241
Epoch 264/500
77s 154ms/step - loss: 0.2691 - acc: 0.9616 - val_loss: 0.3995 - val_acc: 0.9251
Epoch 265/500
77s 154ms/step - loss: 0.2722 - acc: 0.9601 - val_loss: 0.3898 - val_acc: 0.9248
Epoch 266/500
77s 154ms/step - loss: 0.2673 - acc: 0.9601 - val_loss: 0.3847 - val_acc: 0.9269
Epoch 267/500
77s 154ms/step - loss: 0.2641 - acc: 0.9629 - val_loss: 0.3892 - val_acc: 0.9258
Epoch 268/500
77s 154ms/step - loss: 0.2642 - acc: 0.9622 - val_loss: 0.3875 - val_acc: 0.9266
Epoch 269/500
77s 154ms/step - loss: 0.2709 - acc: 0.9604 - val_loss: 0.3991 - val_acc: 0.9236
Epoch 270/500
77s 154ms/step - loss: 0.2675 - acc: 0.9607 - val_loss: 0.3841 - val_acc: 0.9275
Epoch 271/500
77s 154ms/step - loss: 0.2672 - acc: 0.9618 - val_loss: 0.3863 - val_acc: 0.9254
Epoch 272/500
77s 154ms/step - loss: 0.2651 - acc: 0.9629 - val_loss: 0.3993 - val_acc: 0.9249
Epoch 273/500
77s 154ms/step - loss: 0.2675 - acc: 0.9618 - val_loss: 0.3959 - val_acc: 0.9230
Epoch 274/500
77s 154ms/step - loss: 0.2650 - acc: 0.9625 - val_loss: 0.3901 - val_acc: 0.9248
Epoch 275/500
77s 154ms/step - loss: 0.2685 - acc: 0.9611 - val_loss: 0.3998 - val_acc: 0.9206
Epoch 276/500
77s 154ms/step - loss: 0.2645 - acc: 0.9630 - val_loss: 0.3983 - val_acc: 0.9244
Epoch 277/500
77s 154ms/step - loss: 0.2675 - acc: 0.9614 - val_loss: 0.4014 - val_acc: 0.9227
Epoch 278/500
77s 154ms/step - loss: 0.2648 - acc: 0.9628 - val_loss: 0.3990 - val_acc: 0.9239
Epoch 279/500
77s 154ms/step - loss: 0.2648 - acc: 0.9624 - val_loss: 0.4027 - val_acc: 0.9215
Epoch 280/500
77s 154ms/step - loss: 0.2630 - acc: 0.9634 - val_loss: 0.4132 - val_acc: 0.9198
Epoch 281/500
77s 154ms/step - loss: 0.2673 - acc: 0.9620 - val_loss: 0.4117 - val_acc: 0.9200
Epoch 282/500
77s 154ms/step - loss: 0.2654 - acc: 0.9617 - val_loss: 0.4133 - val_acc: 0.9187
Epoch 283/500
77s 154ms/step - loss: 0.2661 - acc: 0.9621 - val_loss: 0.3970 - val_acc: 0.9250
Epoch 284/500
77s 154ms/step - loss: 0.2653 - acc: 0.9613 - val_loss: 0.3930 - val_acc: 0.9256
Epoch 285/500
77s 154ms/step - loss: 0.2615 - acc: 0.9632 - val_loss: 0.4035 - val_acc: 0.9252
Epoch 286/500
77s 154ms/step - loss: 0.2701 - acc: 0.9606 - val_loss: 0.3921 - val_acc: 0.9264
Epoch 287/500
77s 154ms/step - loss: 0.2663 - acc: 0.9619 - val_loss: 0.3858 - val_acc: 0.9287
Epoch 288/500
77s 154ms/step - loss: 0.2623 - acc: 0.9632 - val_loss: 0.3950 - val_acc: 0.9235
Epoch 289/500
77s 154ms/step - loss: 0.2636 - acc: 0.9625 - val_loss: 0.3869 - val_acc: 0.9281
Epoch 290/500
77s 154ms/step - loss: 0.2625 - acc: 0.9636 - val_loss: 0.3885 - val_acc: 0.9272
Epoch 291/500
77s 154ms/step - loss: 0.2623 - acc: 0.9630 - val_loss: 0.3900 - val_acc: 0.9252
Epoch 292/500
77s 154ms/step - loss: 0.2650 - acc: 0.9615 - val_loss: 0.3916 - val_acc: 0.9264
Epoch 293/500
77s 154ms/step - loss: 0.2627 - acc: 0.9624 - val_loss: 0.3935 - val_acc: 0.9259
Epoch 294/500
77s 154ms/step - loss: 0.2664 - acc: 0.9621 - val_loss: 0.3898 - val_acc: 0.9264
Epoch 295/500
77s 154ms/step - loss: 0.2624 - acc: 0.9629 - val_loss: 0.3937 - val_acc: 0.9264
Epoch 296/500
77s 154ms/step - loss: 0.2606 - acc: 0.9633 - val_loss: 0.3959 - val_acc: 0.9252
Epoch 297/500
77s 154ms/step - loss: 0.2621 - acc: 0.9626 - val_loss: 0.3978 - val_acc: 0.9245
Epoch 298/500
77s 153ms/step - loss: 0.2616 - acc: 0.9624 - val_loss: 0.3976 - val_acc: 0.9245
Epoch 299/500
77s 154ms/step - loss: 0.2610 - acc: 0.9626 - val_loss: 0.3952 - val_acc: 0.9239
Epoch 300/500
77s 154ms/step - loss: 0.2659 - acc: 0.9620 - val_loss: 0.4040 - val_acc: 0.9214
Epoch 301/500
lr changed to 0.0009999999776482583
77s 154ms/step - loss: 0.2382 - acc: 0.9722 - val_loss: 0.3640 - val_acc: 0.9314
Epoch 302/500
77s 154ms/step - loss: 0.2191 - acc: 0.9797 - val_loss: 0.3560 - val_acc: 0.9333
Epoch 303/500
77s 154ms/step - loss: 0.2131 - acc: 0.9809 - val_loss: 0.3548 - val_acc: 0.9346
Epoch 304/500
77s 154ms/step - loss: 0.2059 - acc: 0.9836 - val_loss: 0.3561 - val_acc: 0.9340
Epoch 305/500
77s 154ms/step - loss: 0.2049 - acc: 0.9843 - val_loss: 0.3526 - val_acc: 0.9366
Epoch 306/500
77s 154ms/step - loss: 0.2019 - acc: 0.9852 - val_loss: 0.3509 - val_acc: 0.9373
Epoch 307/500
77s 154ms/step - loss: 0.2028 - acc: 0.9843 - val_loss: 0.3527 - val_acc: 0.9362
Epoch 308/500
77s 154ms/step - loss: 0.2023 - acc: 0.9846 - val_loss: 0.3534 - val_acc: 0.9363
Epoch 309/500
77s 154ms/step - loss: 0.1995 - acc: 0.9855 - val_loss: 0.3533 - val_acc: 0.9367
Epoch 310/500
77s 154ms/step - loss: 0.1957 - acc: 0.9871 - val_loss: 0.3547 - val_acc: 0.9369
Epoch 311/500
77s 154ms/step - loss: 0.1947 - acc: 0.9877 - val_loss: 0.3532 - val_acc: 0.9380
Epoch 312/500
77s 154ms/step - loss: 0.1928 - acc: 0.9878 - val_loss: 0.3533 - val_acc: 0.9380
Epoch 313/500
77s 154ms/step - loss: 0.1920 - acc: 0.9879 - val_loss: 0.3522 - val_acc: 0.9391
Epoch 314/500
77s 154ms/step - loss: 0.1923 - acc: 0.9871 - val_loss: 0.3523 - val_acc: 0.9385
Epoch 315/500
77s 154ms/step - loss: 0.1901 - acc: 0.9885 - val_loss: 0.3517 - val_acc: 0.9375
Epoch 316/500
77s 154ms/step - loss: 0.1903 - acc: 0.9879 - val_loss: 0.3518 - val_acc: 0.9391
Epoch 317/500
77s 154ms/step - loss: 0.1883 - acc: 0.9885 - val_loss: 0.3539 - val_acc: 0.9384
Epoch 318/500
77s 154ms/step - loss: 0.1884 - acc: 0.9883 - val_loss: 0.3568 - val_acc: 0.9376
Epoch 319/500
77s 154ms/step - loss: 0.1888 - acc: 0.9887 - val_loss: 0.3560 - val_acc: 0.9382
Epoch 320/500
77s 154ms/step - loss: 0.1862 - acc: 0.9893 - val_loss: 0.3573 - val_acc: 0.9371
Epoch 321/500
77s 154ms/step - loss: 0.1874 - acc: 0.9880 - val_loss: 0.3561 - val_acc: 0.9386
Epoch 322/500
77s 154ms/step - loss: 0.1855 - acc: 0.9895 - val_loss: 0.3553 - val_acc: 0.9395
Epoch 323/500
77s 154ms/step - loss: 0.1846 - acc: 0.9897 - val_loss: 0.3543 - val_acc: 0.9396
Epoch 324/500
77s 154ms/step - loss: 0.1860 - acc: 0.9890 - val_loss: 0.3560 - val_acc: 0.9382
Epoch 325/500
77s 154ms/step - loss: 0.1834 - acc: 0.9894 - val_loss: 0.3539 - val_acc: 0.9383
Epoch 326/500
77s 154ms/step - loss: 0.1847 - acc: 0.9894 - val_loss: 0.3555 - val_acc: 0.9375
Epoch 327/500
77s 154ms/step - loss: 0.1830 - acc: 0.9896 - val_loss: 0.3559 - val_acc: 0.9370
Epoch 328/500
77s 154ms/step - loss: 0.1839 - acc: 0.9894 - val_loss: 0.3584 - val_acc: 0.9363
Epoch 329/500
77s 154ms/step - loss: 0.1808 - acc: 0.9902 - val_loss: 0.3571 - val_acc: 0.9381
Epoch 330/500
77s 154ms/step - loss: 0.1818 - acc: 0.9899 - val_loss: 0.3556 - val_acc: 0.9377
Epoch 331/500
77s 154ms/step - loss: 0.1800 - acc: 0.9903 - val_loss: 0.3584 - val_acc: 0.9380
Epoch 332/500
77s 154ms/step - loss: 0.1811 - acc: 0.9898 - val_loss: 0.3571 - val_acc: 0.9399
Epoch 333/500
77s 154ms/step - loss: 0.1801 - acc: 0.9900 - val_loss: 0.3574 - val_acc: 0.9390
Epoch 334/500
77s 154ms/step - loss: 0.1802 - acc: 0.9900 - val_loss: 0.3582 - val_acc: 0.9381
Epoch 335/500
77s 154ms/step - loss: 0.1797 - acc: 0.9902 - val_loss: 0.3629 - val_acc: 0.9369
Epoch 336/500
77s 154ms/step - loss: 0.1783 - acc: 0.9908 - val_loss: 0.3563 - val_acc: 0.9390
Epoch 337/500
77s 154ms/step - loss: 0.1787 - acc: 0.9901 - val_loss: 0.3549 - val_acc: 0.9380
Epoch 338/500
77s 154ms/step - loss: 0.1780 - acc: 0.9907 - val_loss: 0.3594 - val_acc: 0.9368
Epoch 339/500
77s 154ms/step - loss: 0.1776 - acc: 0.9905 - val_loss: 0.3556 - val_acc: 0.9384
Epoch 340/500
77s 154ms/step - loss: 0.1763 - acc: 0.9912 - val_loss: 0.3543 - val_acc: 0.9397
Epoch 341/500
77s 154ms/step - loss: 0.1760 - acc: 0.9911 - val_loss: 0.3552 - val_acc: 0.9380
Epoch 342/500
77s 154ms/step - loss: 0.1754 - acc: 0.9911 - val_loss: 0.3567 - val_acc: 0.9387
Epoch 343/500
77s 154ms/step - loss: 0.1762 - acc: 0.9908 - val_loss: 0.3547 - val_acc: 0.9386
Epoch 344/500
77s 154ms/step - loss: 0.1746 - acc: 0.9913 - val_loss: 0.3569 - val_acc: 0.9377
Epoch 345/500
77s 154ms/step - loss: 0.1736 - acc: 0.9920 - val_loss: 0.3596 - val_acc: 0.9381
Epoch 346/500
77s 154ms/step - loss: 0.1730 - acc: 0.9915 - val_loss: 0.3580 - val_acc: 0.9382
Epoch 347/500
77s 154ms/step - loss: 0.1727 - acc: 0.9910 - val_loss: 0.3569 - val_acc: 0.9386
Epoch 348/500
77s 154ms/step - loss: 0.1750 - acc: 0.9907 - val_loss: 0.3595 - val_acc: 0.9379
Epoch 349/500
77s 154ms/step - loss: 0.1736 - acc: 0.9911 - val_loss: 0.3579 - val_acc: 0.9382
Epoch 350/500
77s 154ms/step - loss: 0.1740 - acc: 0.9910 - val_loss: 0.3560 - val_acc: 0.9394
Epoch 351/500
77s 154ms/step - loss: 0.1710 - acc: 0.9919 - val_loss: 0.3584 - val_acc: 0.9381
Epoch 352/500
77s 154ms/step - loss: 0.1724 - acc: 0.9914 - val_loss: 0.3606 - val_acc: 0.9377
Epoch 353/500
77s 154ms/step - loss: 0.1704 - acc: 0.9922 - val_loss: 0.3589 - val_acc: 0.9369
Epoch 354/500
77s 154ms/step - loss: 0.1696 - acc: 0.9928 - val_loss: 0.3549 - val_acc: 0.9397
Epoch 355/500
77s 154ms/step - loss: 0.1710 - acc: 0.9914 - val_loss: 0.3568 - val_acc: 0.9397
Epoch 356/500
77s 154ms/step - loss: 0.1690 - acc: 0.9919 - val_loss: 0.3574 - val_acc: 0.9390
Epoch 357/500
77s 154ms/step - loss: 0.1692 - acc: 0.9919 - val_loss: 0.3604 - val_acc: 0.9363
Epoch 358/500
77s 154ms/step - loss: 0.1689 - acc: 0.9920 - val_loss: 0.3587 - val_acc: 0.9385
Epoch 359/500
77s 154ms/step - loss: 0.1680 - acc: 0.9922 - val_loss: 0.3629 - val_acc: 0.9363
Epoch 360/500
77s 154ms/step - loss: 0.1690 - acc: 0.9923 - val_loss: 0.3575 - val_acc: 0.9385
Epoch 361/500
77s 154ms/step - loss: 0.1683 - acc: 0.9920 - val_loss: 0.3560 - val_acc: 0.9389
Epoch 362/500
77s 154ms/step - loss: 0.1699 - acc: 0.9913 - val_loss: 0.3572 - val_acc: 0.9366
Epoch 363/500
77s 154ms/step - loss: 0.1657 - acc: 0.9925 - val_loss: 0.3546 - val_acc: 0.9385
Epoch 364/500
77s 154ms/step - loss: 0.1675 - acc: 0.9920 - val_loss: 0.3581 - val_acc: 0.9381
Epoch 365/500
77s 154ms/step - loss: 0.1661 - acc: 0.9926 - val_loss: 0.3595 - val_acc: 0.9390
Epoch 366/500
77s 154ms/step - loss: 0.1664 - acc: 0.9927 - val_loss: 0.3576 - val_acc: 0.9391
Epoch 367/500
77s 154ms/step - loss: 0.1659 - acc: 0.9920 - val_loss: 0.3575 - val_acc: 0.9395
Epoch 368/500
77s 154ms/step - loss: 0.1662 - acc: 0.9920 - val_loss: 0.3577 - val_acc: 0.9383
Epoch 369/500
77s 154ms/step - loss: 0.1658 - acc: 0.9923 - val_loss: 0.3596 - val_acc: 0.9383
Epoch 370/500
77s 154ms/step - loss: 0.1634 - acc: 0.9933 - val_loss: 0.3575 - val_acc: 0.9386
...
Epoch 440/500
77s 153ms/step - loss: 0.1467 - acc: 0.9932 - val_loss: 0.3487 - val_acc: 0.9381
Epoch 441/500
76s 153ms/step - loss: 0.1455 - acc: 0.9939 - val_loss: 0.3450 - val_acc: 0.9383
Epoch 442/500
77s 153ms/step - loss: 0.1453 - acc: 0.9941 - val_loss: 0.3518 - val_acc: 0.9370
Epoch 443/500
76s 153ms/step - loss: 0.1450 - acc: 0.9939 - val_loss: 0.3510 - val_acc: 0.9360
Epoch 444/500
76s 153ms/step - loss: 0.1458 - acc: 0.9942 - val_loss: 0.3553 - val_acc: 0.9366
Epoch 445/500
76s 153ms/step - loss: 0.1447 - acc: 0.9942 - val_loss: 0.3484 - val_acc: 0.9375
Epoch 446/500
77s 153ms/step - loss: 0.1431 - acc: 0.9945 - val_loss: 0.3522 - val_acc: 0.9386
Epoch 447/500
76s 153ms/step - loss: 0.1448 - acc: 0.9939 - val_loss: 0.3548 - val_acc: 0.9359
Epoch 448/500
76s 153ms/step - loss: 0.1434 - acc: 0.9939 - val_loss: 0.3514 - val_acc: 0.9379
Epoch 449/500
76s 153ms/step - loss: 0.1439 - acc: 0.9939 - val_loss: 0.3488 - val_acc: 0.9389
Epoch 450/500
76s 153ms/step - loss: 0.1445 - acc: 0.9937 - val_loss: 0.3523 - val_acc: 0.9380
Epoch 451/500
lr changed to 9.999999310821295e-05
76s 153ms/step - loss: 0.1442 - acc: 0.9938 - val_loss: 0.3510 - val_acc: 0.9385
Epoch 452/500
77s 154ms/step - loss: 0.1436 - acc: 0.9941 - val_loss: 0.3507 - val_acc: 0.9394
Epoch 453/500
76s 153ms/step - loss: 0.1423 - acc: 0.9948 - val_loss: 0.3503 - val_acc: 0.9387
Epoch 454/500
77s 153ms/step - loss: 0.1430 - acc: 0.9942 - val_loss: 0.3499 - val_acc: 0.9391
Epoch 455/500
77s 153ms/step - loss: 0.1430 - acc: 0.9941 - val_loss: 0.3489 - val_acc: 0.9396
Epoch 456/500
77s 153ms/step - loss: 0.1418 - acc: 0.9947 - val_loss: 0.3487 - val_acc: 0.9393
Epoch 457/500
77s 153ms/step - loss: 0.1417 - acc: 0.9949 - val_loss: 0.3478 - val_acc: 0.9395
Epoch 458/500
76s 153ms/step - loss: 0.1423 - acc: 0.9946 - val_loss: 0.3476 - val_acc: 0.9393
Epoch 459/500
76s 153ms/step - loss: 0.1408 - acc: 0.9949 - val_loss: 0.3474 - val_acc: 0.9398
Epoch 460/500
77s 153ms/step - loss: 0.1415 - acc: 0.9947 - val_loss: 0.3482 - val_acc: 0.9397
Epoch 461/500
76s 153ms/step - loss: 0.1406 - acc: 0.9956 - val_loss: 0.3482 - val_acc: 0.9394
Epoch 462/500
77s 153ms/step - loss: 0.1409 - acc: 0.9951 - val_loss: 0.3475 - val_acc: 0.9396
Epoch 463/500
76s 153ms/step - loss: 0.1406 - acc: 0.9950 - val_loss: 0.3471 - val_acc: 0.9395
Epoch 464/500
76s 153ms/step - loss: 0.1406 - acc: 0.9951 - val_loss: 0.3474 - val_acc: 0.9390
Epoch 465/500
76s 153ms/step - loss: 0.1408 - acc: 0.9950 - val_loss: 0.3477 - val_acc: 0.9396
...
Epoch 490/500
77s 153ms/step - loss: 0.1399 - acc: 0.9953 - val_loss: 0.3482 - val_acc: 0.9390
Epoch 491/500
76s 153ms/step - loss: 0.1397 - acc: 0.9950 - val_loss: 0.3487 - val_acc: 0.9389
Epoch 492/500
77s 153ms/step - loss: 0.1395 - acc: 0.9952 - val_loss: 0.3487 - val_acc: 0.9388
Epoch 493/500
77s 153ms/step - loss: 0.1391 - acc: 0.9956 - val_loss: 0.3491 - val_acc: 0.9390
Epoch 494/500
76s 153ms/step - loss: 0.1396 - acc: 0.9952 - val_loss: 0.3481 - val_acc: 0.9389
Epoch 495/500
76s 153ms/step - loss: 0.1395 - acc: 0.9952 - val_loss: 0.3479 - val_acc: 0.9387
Epoch 496/500
76s 153ms/step - loss: 0.1389 - acc: 0.9953 - val_loss: 0.3482 - val_acc: 0.9386
Epoch 497/500
76s 153ms/step - loss: 0.1385 - acc: 0.9960 - val_loss: 0.3487 - val_acc: 0.9388
Epoch 498/500
76s 153ms/step - loss: 0.1386 - acc: 0.9957 - val_loss: 0.3483 - val_acc: 0.9388
Epoch 499/500
76s 153ms/step - loss: 0.1391 - acc: 0.9955 - val_loss: 0.3482 - val_acc: 0.9390
Epoch 500/500
76s 153ms/step - loss: 0.1397 - acc: 0.9951 - val_loss: 0.3481 - val_acc: 0.9396
Train loss: 0.1304543605595827
Train accuracy: 0.9980800018310547
Test loss: 0.3480722904205322
Test accuracy: 0.9396000015735626

Compared with tuning record 18, both the training accuracy and the test accuracy dropped slightly. The training accuracy is still about 6% higher than the test accuracy (0.9981 vs. 0.9396), which indicates that overfitting is still present.

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458, Date of Publication: 13 February 2020

https://ieeexplore.ieee.org/document/8998530
