
Deep Residual Network + Adaptively Parametric ReLU Activation Function (Tuning Log 11)

User 6915903
Modified 2020-05-06 17:54:07
Published in the column: Deep Neural Networks

Building on tuning log 10, this article increases the number of residual modules from 27 to 60 and tests a deep residual network with the Adaptively Parametric ReLU (APReLU) activation function on the CIFAR-10 image dataset.
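To recall the form of the activation: APReLU keeps the positive part of the input unchanged and multiplies the negative part by channel-wise scaling coefficients that are learned from the input itself. A minimal NumPy sketch of the forward computation (in the full model the coefficients are produced by a small GAP → Dense → sigmoid sub-network; here they are supplied directly for illustration):

```python
import numpy as np

def aprelu_forward(x, scales):
    """APReLU forward pass: y = max(x, 0) + scales * min(x, 0).

    x:      feature map of shape (H, W, C)
    scales: channel-wise coefficients in (0, 1), shape (C,);
            learned by a sub-network in the real model,
            passed in directly in this sketch.
    """
    pos = np.maximum(x, 0.0)   # positive features pass through unchanged
    neg = np.minimum(x, 0.0)   # negative features are attenuated channel-wise
    return pos + scales * neg

x = np.array([[[-2.0, 1.0], [4.0, -1.0]]])   # shape (1, 2, 2)
scales = np.array([0.5, 0.25])
y = aprelu_forward(x, scales)
print(y)
```

With `scales` all equal to 1 this reduces to the identity on negative inputs (i.e. a linear unit); with `scales` all equal to 0 it reduces to the ordinary ReLU.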

The adaptively parametric ReLU (APReLU) activation function

The Keras code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, 2020,  DOI: 10.1109/TIE.2020.2972458 

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Normalize: scale to [0, 1] and subtract the training-set mean
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply it by 0.1 every 300 epochs
def scheduler(epoch):
    if epoch % 300 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs,zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,channels))(scales)
    # apply the parametric relu: positive part plus scaled negative part
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero-padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 20, 16, downsample=False)
net = residual_block(net,  1, 32, downsample=True)
net = residual_block(net, 19, 32, downsample=False)
net = residual_block(net,  1, 64, downsample=True)
net = residual_block(net, 19, 64, downsample=False)
net = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images by up to 30 degrees
    rotation_range=30,
    # shear angle in counter-clockwise direction in degrees
    shear_range = 30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=1000, 
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])

The experimental results are as follows (training is very slow; I am not sure whether it will run to completion):

Using TensorFlow backend.
x_train shape: (50000, 32, 32, 3)
50000 train samples
10000 test samples
Epoch 1/1000
500/500 [==============================] - 216s 433ms/step - loss: 5.3303 - acc: 0.3881 - val_loss: 4.6744 - val_acc: 0.5067
Epoch 2/1000
500/500 [==============================] - 142s 284ms/step - loss: 4.3438 - acc: 0.5292 - val_loss: 3.8578 - val_acc: 0.6084
Epoch 3/1000
500/500 [==============================] - 142s 284ms/step - loss: 3.6504 - acc: 0.5949 - val_loss: 3.2425 - val_acc: 0.6673
Epoch 4/1000
500/500 [==============================] - 142s 284ms/step - loss: 3.1230 - acc: 0.6384 - val_loss: 2.8284 - val_acc: 0.6826
Epoch 5/1000
500/500 [==============================] - 142s 284ms/step - loss: 2.7009 - acc: 0.6656 - val_loss: 2.4285 - val_acc: 0.7164
Epoch 6/1000
500/500 [==============================] - 142s 284ms/step - loss: 2.3806 - acc: 0.6838 - val_loss: 2.1267 - val_acc: 0.7293
Epoch 7/1000
500/500 [==============================] - 142s 284ms/step - loss: 2.1009 - acc: 0.7026 - val_loss: 1.9077 - val_acc: 0.7389
Epoch 8/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.8769 - acc: 0.7181 - val_loss: 1.7067 - val_acc: 0.7544
Epoch 9/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.6922 - acc: 0.7336 - val_loss: 1.5801 - val_acc: 0.7518
Epoch 10/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.5452 - acc: 0.7440 - val_loss: 1.4281 - val_acc: 0.7685
Epoch 11/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.4296 - acc: 0.7495 - val_loss: 1.3131 - val_acc: 0.7802
Epoch 12/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.3341 - acc: 0.7572 - val_loss: 1.2388 - val_acc: 0.7803
Epoch 13/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.2588 - acc: 0.7623 - val_loss: 1.1707 - val_acc: 0.7887
Epoch 14/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.1930 - acc: 0.7688 - val_loss: 1.0920 - val_acc: 0.8042
Epoch 15/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.1506 - acc: 0.7699 - val_loss: 1.0500 - val_acc: 0.8034
Epoch 16/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.1056 - acc: 0.7766 - val_loss: 1.0199 - val_acc: 0.8052
Epoch 17/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.0735 - acc: 0.7772 - val_loss: 0.9737 - val_acc: 0.8178
Epoch 18/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.0420 - acc: 0.7833 - val_loss: 0.9912 - val_acc: 0.8025
Epoch 19/1000
500/500 [==============================] - 142s 284ms/step - loss: 1.0156 - acc: 0.7860 - val_loss: 0.9525 - val_acc: 0.8041
Epoch 20/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.9980 - acc: 0.7892 - val_loss: 0.9304 - val_acc: 0.8140
Epoch 21/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.9773 - acc: 0.7910 - val_loss: 0.9240 - val_acc: 0.8116
Epoch 22/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.9600 - acc: 0.7931 - val_loss: 0.8714 - val_acc: 0.8248
Epoch 23/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.9449 - acc: 0.7969 - val_loss: 0.8751 - val_acc: 0.8234
Epoch 24/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.9424 - acc: 0.7958 - val_loss: 0.8551 - val_acc: 0.8261
Epoch 25/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.9224 - acc: 0.8039 - val_loss: 0.8438 - val_acc: 0.8336
Epoch 26/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.9131 - acc: 0.8023 - val_loss: 0.8542 - val_acc: 0.8272
Epoch 27/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8975 - acc: 0.8069 - val_loss: 0.8719 - val_acc: 0.8196
Epoch 28/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8987 - acc: 0.8085 - val_loss: 0.8269 - val_acc: 0.8355
Epoch 29/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8824 - acc: 0.8122 - val_loss: 0.8305 - val_acc: 0.8324
Epoch 30/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8837 - acc: 0.8102 - val_loss: 0.8332 - val_acc: 0.8247
Epoch 31/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8727 - acc: 0.8130 - val_loss: 0.8075 - val_acc: 0.8386
Epoch 32/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8686 - acc: 0.8154 - val_loss: 0.8198 - val_acc: 0.8350
Epoch 33/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8608 - acc: 0.8150 - val_loss: 0.8006 - val_acc: 0.8396
Epoch 34/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8553 - acc: 0.8188 - val_loss: 0.8249 - val_acc: 0.8324
Epoch 35/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8474 - acc: 0.8197 - val_loss: 0.7876 - val_acc: 0.8437
Epoch 36/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8473 - acc: 0.8218 - val_loss: 0.7648 - val_acc: 0.8555
Epoch 37/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8410 - acc: 0.8235 - val_loss: 0.7866 - val_acc: 0.8432
Epoch 38/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.8334 - acc: 0.8245 - val_loss: 0.7785 - val_acc: 0.8473
Epoch 39/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8336 - acc: 0.8263 - val_loss: 0.7783 - val_acc: 0.8486
Epoch 40/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8337 - acc: 0.8245 - val_loss: 0.7782 - val_acc: 0.8461
Epoch 41/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8292 - acc: 0.8257 - val_loss: 0.7696 - val_acc: 0.8498
Epoch 42/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8203 - acc: 0.8298 - val_loss: 0.7618 - val_acc: 0.8511
Epoch 43/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8209 - acc: 0.8303 - val_loss: 0.7634 - val_acc: 0.8551
Epoch 44/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8163 - acc: 0.8327 - val_loss: 0.7719 - val_acc: 0.8449
Epoch 45/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.8072 - acc: 0.8328 - val_loss: 0.7635 - val_acc: 0.8493
Epoch 46/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8127 - acc: 0.8324 - val_loss: 0.7725 - val_acc: 0.8495
Epoch 47/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.8081 - acc: 0.8343 - val_loss: 0.7576 - val_acc: 0.8537
Epoch 48/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8090 - acc: 0.8322 - val_loss: 0.7421 - val_acc: 0.8603
Epoch 49/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.8041 - acc: 0.8344 - val_loss: 0.7422 - val_acc: 0.8576
Epoch 50/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8008 - acc: 0.8361 - val_loss: 0.7472 - val_acc: 0.8566
Epoch 51/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.8013 - acc: 0.8379 - val_loss: 0.7385 - val_acc: 0.8585
Epoch 52/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.7964 - acc: 0.8381 - val_loss: 0.7805 - val_acc: 0.8453
Epoch 53/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.7929 - acc: 0.8387 - val_loss: 0.7597 - val_acc: 0.8516
Epoch 54/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.7945 - acc: 0.8388 - val_loss: 0.7596 - val_acc: 0.8529
Epoch 55/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.7904 - acc: 0.8407 - val_loss: 0.7376 - val_acc: 0.8594
Epoch 56/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.7806 - acc: 0.8443 - val_loss: 0.7478 - val_acc: 0.8551
Epoch 57/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.7807 - acc: 0.8444 - val_loss: 0.7536 - val_acc: 0.8547
Epoch 58/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.7838 - acc: 0.8440 - val_loss: 0.7164 - val_acc: 0.8686
Epoch 59/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.7777 - acc: 0.8444 - val_loss: 0.7441 - val_acc: 0.8601
Epoch 60/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.7786 - acc: 0.8461 - val_loss: 0.7339 - val_acc: 0.8603
...
Epoch 261/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6880 - acc: 0.8771 - val_loss: 0.6855 - val_acc: 0.8788
Epoch 262/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6839 - acc: 0.8777 - val_loss: 0.6723 - val_acc: 0.8851
Epoch 263/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6819 - acc: 0.8783 - val_loss: 0.6738 - val_acc: 0.8845
Epoch 264/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6867 - acc: 0.8784 - val_loss: 0.6809 - val_acc: 0.8790
Epoch 265/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6805 - acc: 0.8810 - val_loss: 0.6750 - val_acc: 0.8846
Epoch 266/1000
500/500 [==============================] - 141s 283ms/step - loss: 0.6809 - acc: 0.8781 - val_loss: 0.6584 - val_acc: 0.8878
Epoch 267/1000
500/500 [==============================] - 142s 283ms/step - loss: 0.6944 - acc: 0.8722 - val_loss: 0.6598 - val_acc: 0.8875
Epoch 268/1000
500/500 [==============================] - 141s 283ms/step - loss: 0.6847 - acc: 0.8779 - val_loss: 0.6825 - val_acc: 0.8817
Epoch 269/1000
500/500 [==============================] - 141s 283ms/step - loss: 0.6824 - acc: 0.8786 - val_loss: 0.6552 - val_acc: 0.8908
Epoch 270/1000
500/500 [==============================] - 141s 283ms/step - loss: 0.6830 - acc: 0.8783 - val_loss: 0.6820 - val_acc: 0.8767
Epoch 271/1000
500/500 [==============================] - 141s 283ms/step - loss: 0.6903 - acc: 0.8752 - val_loss: 0.6685 - val_acc: 0.8855
Epoch 272/1000
500/500 [==============================] - 141s 283ms/step - loss: 0.6861 - acc: 0.8760 - val_loss: 0.6707 - val_acc: 0.8873
Epoch 273/1000
500/500 [==============================] - 142s 283ms/step - loss: 0.6823 - acc: 0.8782 - val_loss: 0.6721 - val_acc: 0.8864
Epoch 274/1000
500/500 [==============================] - 141s 283ms/step - loss: 0.6862 - acc: 0.8769 - val_loss: 0.6764 - val_acc: 0.8866
Epoch 275/1000
500/500 [==============================] - 141s 283ms/step - loss: 0.6825 - acc: 0.8785 - val_loss: 0.6673 - val_acc: 0.8861
Epoch 276/1000
500/500 [==============================] - 142s 283ms/step - loss: 0.6842 - acc: 0.8771 - val_loss: 0.6757 - val_acc: 0.8835
Epoch 277/1000
500/500 [==============================] - 142s 283ms/step - loss: 0.6855 - acc: 0.8777 - val_loss: 0.6769 - val_acc: 0.8814
Epoch 278/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6793 - acc: 0.8802 - val_loss: 0.6618 - val_acc: 0.8883
Epoch 279/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6854 - acc: 0.8766 - val_loss: 0.6965 - val_acc: 0.8743
Epoch 280/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6824 - acc: 0.8792 - val_loss: 0.6720 - val_acc: 0.8842
Epoch 281/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6786 - acc: 0.8790 - val_loss: 0.6589 - val_acc: 0.8883
Epoch 282/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6781 - acc: 0.8797 - val_loss: 0.6620 - val_acc: 0.8862
Epoch 283/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6845 - acc: 0.8786 - val_loss: 0.6936 - val_acc: 0.8802
Epoch 284/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6866 - acc: 0.8772 - val_loss: 0.6678 - val_acc: 0.8890
Epoch 285/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6829 - acc: 0.8787 - val_loss: 0.6630 - val_acc: 0.8866
Epoch 286/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6763 - acc: 0.8796 - val_loss: 0.6597 - val_acc: 0.8893
Epoch 287/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6833 - acc: 0.8774 - val_loss: 0.6752 - val_acc: 0.8866
Epoch 288/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6858 - acc: 0.8768 - val_loss: 0.6617 - val_acc: 0.8902
Epoch 289/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6784 - acc: 0.8799 - val_loss: 0.6634 - val_acc: 0.8872
Epoch 290/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6807 - acc: 0.8778 - val_loss: 0.6564 - val_acc: 0.8896
Epoch 291/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6835 - acc: 0.8769 - val_loss: 0.6628 - val_acc: 0.8877
Epoch 292/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6783 - acc: 0.8798 - val_loss: 0.6887 - val_acc: 0.8813
Epoch 293/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6795 - acc: 0.8810 - val_loss: 0.6590 - val_acc: 0.8899
Epoch 294/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6799 - acc: 0.8798 - val_loss: 0.6599 - val_acc: 0.8873
Epoch 295/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6856 - acc: 0.8792 - val_loss: 0.6636 - val_acc: 0.8880
Epoch 296/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6832 - acc: 0.8802 - val_loss: 0.6513 - val_acc: 0.8926
Epoch 297/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6785 - acc: 0.8794 - val_loss: 0.6568 - val_acc: 0.8886
Epoch 298/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6832 - acc: 0.8782 - val_loss: 0.6697 - val_acc: 0.8872
Epoch 299/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.6771 - acc: 0.8813 - val_loss: 0.6714 - val_acc: 0.8825
Epoch 300/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.6814 - acc: 0.8784 - val_loss: 0.6857 - val_acc: 0.8821
Epoch 301/1000
lr changed to 0.010000000149011612
500/500 [==============================] - 142s 284ms/step - loss: 0.5714 - acc: 0.9156 - val_loss: 0.5648 - val_acc: 0.9171
Epoch 302/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.5073 - acc: 0.9362 - val_loss: 0.5481 - val_acc: 0.9236
Epoch 303/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.4913 - acc: 0.9412 - val_loss: 0.5391 - val_acc: 0.9228
Epoch 304/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.4714 - acc: 0.9455 - val_loss: 0.5304 - val_acc: 0.9255
Epoch 305/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.4592 - acc: 0.9481 - val_loss: 0.5223 - val_acc: 0.9253
Epoch 306/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.4452 - acc: 0.9512 - val_loss: 0.5173 - val_acc: 0.9271
Epoch 307/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.4350 - acc: 0.9520 - val_loss: 0.5130 - val_acc: 0.9272
Epoch 308/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.4268 - acc: 0.9528 - val_loss: 0.5095 - val_acc: 0.9247
Epoch 309/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.4178 - acc: 0.9562 - val_loss: 0.5078 - val_acc: 0.9272
Epoch 310/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.4143 - acc: 0.9540 - val_loss: 0.5075 - val_acc: 0.9279
Epoch 311/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.4027 - acc: 0.9576 - val_loss: 0.4964 - val_acc: 0.9266
Epoch 312/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3964 - acc: 0.9572 - val_loss: 0.4957 - val_acc: 0.9264
Epoch 313/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.3920 - acc: 0.9581 - val_loss: 0.4919 - val_acc: 0.9276
Epoch 314/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3829 - acc: 0.9602 - val_loss: 0.4879 - val_acc: 0.9271
Epoch 315/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3751 - acc: 0.9609 - val_loss: 0.4864 - val_acc: 0.9285
Epoch 316/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3736 - acc: 0.9605 - val_loss: 0.4832 - val_acc: 0.9264
Epoch 317/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3669 - acc: 0.9609 - val_loss: 0.4763 - val_acc: 0.9280
Epoch 318/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3610 - acc: 0.9625 - val_loss: 0.4739 - val_acc: 0.9295
Epoch 319/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3561 - acc: 0.9625 - val_loss: 0.4756 - val_acc: 0.9261
Epoch 320/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3481 - acc: 0.9651 - val_loss: 0.4765 - val_acc: 0.9231
Epoch 321/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3461 - acc: 0.9642 - val_loss: 0.4618 - val_acc: 0.9273
Epoch 322/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3402 - acc: 0.9651 - val_loss: 0.4673 - val_acc: 0.9279
Epoch 323/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3358 - acc: 0.9649 - val_loss: 0.4659 - val_acc: 0.9260
Epoch 324/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3332 - acc: 0.9652 - val_loss: 0.4602 - val_acc: 0.9262
Epoch 325/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3281 - acc: 0.9656 - val_loss: 0.4609 - val_acc: 0.9271
Epoch 326/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3319 - acc: 0.9644 - val_loss: 0.4555 - val_acc: 0.9273
Epoch 327/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.3191 - acc: 0.9671 - val_loss: 0.4475 - val_acc: 0.9287
Epoch 328/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3222 - acc: 0.9650 - val_loss: 0.4560 - val_acc: 0.9272
Epoch 329/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3150 - acc: 0.9675 - val_loss: 0.4516 - val_acc: 0.9292
Epoch 330/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3102 - acc: 0.9680 - val_loss: 0.4533 - val_acc: 0.9281
Epoch 331/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3094 - acc: 0.9679 - val_loss: 0.4549 - val_acc: 0.9222
Epoch 332/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3062 - acc: 0.9679 - val_loss: 0.4557 - val_acc: 0.9229
Epoch 333/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3038 - acc: 0.9678 - val_loss: 0.4443 - val_acc: 0.9241
Epoch 334/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.3034 - acc: 0.9676 - val_loss: 0.4446 - val_acc: 0.9273
Epoch 335/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.2986 - acc: 0.9674 - val_loss: 0.4556 - val_acc: 0.9241
Epoch 336/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.2977 - acc: 0.9674 - val_loss: 0.4528 - val_acc: 0.9221
Epoch 337/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2976 - acc: 0.9676 - val_loss: 0.4348 - val_acc: 0.9251
Epoch 338/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.2919 - acc: 0.9681 - val_loss: 0.4443 - val_acc: 0.9257
Epoch 339/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2897 - acc: 0.9686 - val_loss: 0.4394 - val_acc: 0.9268
...
500/500 [==============================] - 142s 284ms/step - loss: 0.2314 - acc: 0.9732 - val_loss: 0.4116 - val_acc: 0.9241
Epoch 530/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.2334 - acc: 0.9717 - val_loss: 0.4208 - val_acc: 0.9202
Epoch 531/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2307 - acc: 0.9724 - val_loss: 0.4264 - val_acc: 0.9214
Epoch 532/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2347 - acc: 0.9712 - val_loss: 0.4253 - val_acc: 0.9213
Epoch 533/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2318 - acc: 0.9730 - val_loss: 0.4152 - val_acc: 0.9239
Epoch 534/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2376 - acc: 0.9710 - val_loss: 0.4279 - val_acc: 0.9204
Epoch 535/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2320 - acc: 0.9725 - val_loss: 0.4242 - val_acc: 0.9180
Epoch 536/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2344 - acc: 0.9713 - val_loss: 0.4106 - val_acc: 0.9234
Epoch 537/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2350 - acc: 0.9710 - val_loss: 0.4169 - val_acc: 0.9216
Epoch 538/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2355 - acc: 0.9712 - val_loss: 0.4265 - val_acc: 0.9223
Epoch 539/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2333 - acc: 0.9719 - val_loss: 0.4132 - val_acc: 0.9220
Epoch 540/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2294 - acc: 0.9734 - val_loss: 0.4203 - val_acc: 0.9245
Epoch 541/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2343 - acc: 0.9715 - val_loss: 0.4176 - val_acc: 0.9236
Epoch 542/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2330 - acc: 0.9723 - val_loss: 0.4146 - val_acc: 0.9241
Epoch 543/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.2361 - acc: 0.9706 - val_loss: 0.4125 - val_acc: 0.9213
...
Epoch 644/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1541 - acc: 0.9969 - val_loss: 0.3741 - val_acc: 0.9375
Epoch 645/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1539 - acc: 0.9967 - val_loss: 0.3731 - val_acc: 0.9396
Epoch 646/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1526 - acc: 0.9973 - val_loss: 0.3739 - val_acc: 0.9387
Epoch 647/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1531 - acc: 0.9970 - val_loss: 0.3740 - val_acc: 0.9380
Epoch 648/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1535 - acc: 0.9967 - val_loss: 0.3743 - val_acc: 0.9390
Epoch 649/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1531 - acc: 0.9966 - val_loss: 0.3761 - val_acc: 0.9387
Epoch 650/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1512 - acc: 0.9976 - val_loss: 0.3743 - val_acc: 0.9388
Epoch 651/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1521 - acc: 0.9969 - val_loss: 0.3741 - val_acc: 0.9382
Epoch 652/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1522 - acc: 0.9969 - val_loss: 0.3747 - val_acc: 0.9381
Epoch 653/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1510 - acc: 0.9972 - val_loss: 0.3752 - val_acc: 0.9390
Epoch 654/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1503 - acc: 0.9974 - val_loss: 0.3745 - val_acc: 0.9377
Epoch 655/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1500 - acc: 0.9976 - val_loss: 0.3761 - val_acc: 0.9377
Epoch 656/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1500 - acc: 0.9974 - val_loss: 0.3752 - val_acc: 0.9377
Epoch 657/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1502 - acc: 0.9973 - val_loss: 0.3767 - val_acc: 0.9375
Epoch 658/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1490 - acc: 0.9974 - val_loss: 0.3744 - val_acc: 0.9397
Epoch 659/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1498 - acc: 0.9969 - val_loss: 0.3783 - val_acc: 0.9377
Epoch 660/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1486 - acc: 0.9973 - val_loss: 0.3764 - val_acc: 0.9377
Epoch 661/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1485 - acc: 0.9975 - val_loss: 0.3745 - val_acc: 0.9394
Epoch 662/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1481 - acc: 0.9975 - val_loss: 0.3750 - val_acc: 0.9385
Epoch 663/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1487 - acc: 0.9971 - val_loss: 0.3776 - val_acc: 0.9378
Epoch 664/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1483 - acc: 0.9972 - val_loss: 0.3758 - val_acc: 0.9400
Epoch 665/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1477 - acc: 0.9974 - val_loss: 0.3741 - val_acc: 0.9387
Epoch 666/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1471 - acc: 0.9974 - val_loss: 0.3771 - val_acc: 0.9392
Epoch 667/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1468 - acc: 0.9978 - val_loss: 0.3773 - val_acc: 0.9384
Epoch 668/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1465 - acc: 0.9975 - val_loss: 0.3782 - val_acc: 0.9388
Epoch 669/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1459 - acc: 0.9977 - val_loss: 0.3796 - val_acc: 0.9372
Epoch 670/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1462 - acc: 0.9973 - val_loss: 0.3774 - val_acc: 0.9377
Epoch 671/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1467 - acc: 0.9968 - val_loss: 0.3788 - val_acc: 0.9375
Epoch 672/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1465 - acc: 0.9972 - val_loss: 0.3784 - val_acc: 0.9373
Epoch 673/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1457 - acc: 0.9975 - val_loss: 0.3771 - val_acc: 0.9372
Epoch 674/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1445 - acc: 0.9977 - val_loss: 0.3774 - val_acc: 0.9372
Epoch 675/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1444 - acc: 0.9974 - val_loss: 0.3748 - val_acc: 0.9373
Epoch 676/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1447 - acc: 0.9973 - val_loss: 0.3768 - val_acc: 0.9374
Epoch 677/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1437 - acc: 0.9978 - val_loss: 0.3752 - val_acc: 0.9376
Epoch 678/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1439 - acc: 0.9974 - val_loss: 0.3731 - val_acc: 0.9394
Epoch 679/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1437 - acc: 0.9974 - val_loss: 0.3730 - val_acc: 0.9378
Epoch 680/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1433 - acc: 0.9978 - val_loss: 0.3761 - val_acc: 0.9368
Epoch 681/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1433 - acc: 0.9976 - val_loss: 0.3764 - val_acc: 0.9380
Epoch 682/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1427 - acc: 0.9976 - val_loss: 0.3746 - val_acc: 0.9372
Epoch 683/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1427 - acc: 0.9978 - val_loss: 0.3752 - val_acc: 0.9378
Epoch 684/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1424 - acc: 0.9976 - val_loss: 0.3754 - val_acc: 0.9380
Epoch 685/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1420 - acc: 0.9975 - val_loss: 0.3767 - val_acc: 0.9384
Epoch 686/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1410 - acc: 0.9980 - val_loss: 0.3773 - val_acc: 0.9378
Epoch 687/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1424 - acc: 0.9972 - val_loss: 0.3741 - val_acc: 0.9375
Epoch 688/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1417 - acc: 0.9975 - val_loss: 0.3757 - val_acc: 0.9377
Epoch 689/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1412 - acc: 0.9975 - val_loss: 0.3760 - val_acc: 0.9365
Epoch 690/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1414 - acc: 0.9974 - val_loss: 0.3730 - val_acc: 0.9389
Epoch 691/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1405 - acc: 0.9979 - val_loss: 0.3722 - val_acc: 0.9392
Epoch 692/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1409 - acc: 0.9975 - val_loss: 0.3744 - val_acc: 0.9375
Epoch 693/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1396 - acc: 0.9980 - val_loss: 0.3718 - val_acc: 0.9382
Epoch 694/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1393 - acc: 0.9979 - val_loss: 0.3700 - val_acc: 0.9390
Epoch 695/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1394 - acc: 0.9976 - val_loss: 0.3728 - val_acc: 0.9377
Epoch 696/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1402 - acc: 0.9973 - val_loss: 0.3712 - val_acc: 0.9389
Epoch 697/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1391 - acc: 0.9976 - val_loss: 0.3726 - val_acc: 0.9397
Epoch 698/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1384 - acc: 0.9981 - val_loss: 0.3763 - val_acc: 0.9380
Epoch 699/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1394 - acc: 0.9973 - val_loss: 0.3747 - val_acc: 0.9371
Epoch 700/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1379 - acc: 0.9979 - val_loss: 0.3742 - val_acc: 0.9378
...
Epoch 860/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1060 - acc: 0.9988 - val_loss: 0.3581 - val_acc: 0.9376
Epoch 861/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1070 - acc: 0.9982 - val_loss: 0.3575 - val_acc: 0.9367
Epoch 862/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1074 - acc: 0.9980 - val_loss: 0.3581 - val_acc: 0.9357
Epoch 863/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1070 - acc: 0.9982 - val_loss: 0.3527 - val_acc: 0.9374
Epoch 864/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1063 - acc: 0.9984 - val_loss: 0.3543 - val_acc: 0.9374
Epoch 865/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1057 - acc: 0.9986 - val_loss: 0.3533 - val_acc: 0.9377
Epoch 866/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1062 - acc: 0.9978 - val_loss: 0.3545 - val_acc: 0.9369
Epoch 867/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1054 - acc: 0.9984 - val_loss: 0.3542 - val_acc: 0.9355
Epoch 868/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1060 - acc: 0.9983 - val_loss: 0.3482 - val_acc: 0.9394
Epoch 869/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1054 - acc: 0.9984 - val_loss: 0.3560 - val_acc: 0.9375
Epoch 870/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1064 - acc: 0.9978 - val_loss: 0.3537 - val_acc: 0.9370
Epoch 871/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1050 - acc: 0.9984 - val_loss: 0.3555 - val_acc: 0.9374
Epoch 872/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1049 - acc: 0.9985 - val_loss: 0.3539 - val_acc: 0.9367
Epoch 873/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1050 - acc: 0.9984 - val_loss: 0.3574 - val_acc: 0.9373
Epoch 874/1000
500/500 [==============================] - 143s 285ms/step - loss: 0.1044 - acc: 0.9987 - val_loss: 0.3623 - val_acc: 0.9359
Epoch 875/1000
500/500 [==============================] - 142s 283ms/step - loss: 0.1048 - acc: 0.9982 - val_loss: 0.3600 - val_acc: 0.9370
Epoch 876/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1051 - acc: 0.9982 - val_loss: 0.3594 - val_acc: 0.9366
Epoch 877/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1042 - acc: 0.9985 - val_loss: 0.3558 - val_acc: 0.9357
Epoch 878/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1046 - acc: 0.9982 - val_loss: 0.3549 - val_acc: 0.9360
Epoch 879/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1042 - acc: 0.9984 - val_loss: 0.3520 - val_acc: 0.9385
Epoch 880/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1040 - acc: 0.9984 - val_loss: 0.3598 - val_acc: 0.9367
Epoch 881/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1036 - acc: 0.9984 - val_loss: 0.3550 - val_acc: 0.9364
Epoch 882/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1031 - acc: 0.9985 - val_loss: 0.3544 - val_acc: 0.9381
Epoch 883/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1042 - acc: 0.9981 - val_loss: 0.3513 - val_acc: 0.9380
Epoch 884/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1036 - acc: 0.9982 - val_loss: 0.3541 - val_acc: 0.9364
Epoch 885/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1033 - acc: 0.9985 - val_loss: 0.3532 - val_acc: 0.9376
Epoch 886/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1032 - acc: 0.9981 - val_loss: 0.3566 - val_acc: 0.9376
Epoch 887/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1033 - acc: 0.9981 - val_loss: 0.3518 - val_acc: 0.9368
Epoch 888/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1020 - acc: 0.9987 - val_loss: 0.3521 - val_acc: 0.9378
Epoch 889/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1020 - acc: 0.9984 - val_loss: 0.3524 - val_acc: 0.9368
Epoch 890/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1024 - acc: 0.9983 - val_loss: 0.3523 - val_acc: 0.9364
Epoch 891/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1029 - acc: 0.9983 - val_loss: 0.3582 - val_acc: 0.9355
Epoch 892/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1018 - acc: 0.9984 - val_loss: 0.3555 - val_acc: 0.9365
Epoch 893/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1021 - acc: 0.9985 - val_loss: 0.3559 - val_acc: 0.9367
Epoch 894/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1026 - acc: 0.9977 - val_loss: 0.3563 - val_acc: 0.9360
Epoch 895/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1027 - acc: 0.9980 - val_loss: 0.3575 - val_acc: 0.9365
Epoch 896/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1023 - acc: 0.9980 - val_loss: 0.3541 - val_acc: 0.9375
Epoch 897/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1016 - acc: 0.9982 - val_loss: 0.3518 - val_acc: 0.9372
Epoch 898/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1018 - acc: 0.9979 - val_loss: 0.3473 - val_acc: 0.9372
Epoch 899/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1014 - acc: 0.9986 - val_loss: 0.3507 - val_acc: 0.9376
Epoch 900/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1010 - acc: 0.9985 - val_loss: 0.3568 - val_acc: 0.9366
Epoch 901/1000
lr changed to 9.999999310821295e-05
500/500 [==============================] - 142s 284ms/step - loss: 0.1014 - acc: 0.9982 - val_loss: 0.3548 - val_acc: 0.9366
Epoch 902/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1009 - acc: 0.9983 - val_loss: 0.3535 - val_acc: 0.9372
Epoch 903/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1008 - acc: 0.9981 - val_loss: 0.3523 - val_acc: 0.9370
Epoch 904/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1002 - acc: 0.9986 - val_loss: 0.3526 - val_acc: 0.9375
Epoch 905/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.1000 - acc: 0.9987 - val_loss: 0.3519 - val_acc: 0.9372
Epoch 906/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0997 - acc: 0.9989 - val_loss: 0.3520 - val_acc: 0.9374
Epoch 907/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0999 - acc: 0.9989 - val_loss: 0.3520 - val_acc: 0.9377
Epoch 908/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0994 - acc: 0.9989 - val_loss: 0.3518 - val_acc: 0.9376
Epoch 909/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0991 - acc: 0.9990 - val_loss: 0.3520 - val_acc: 0.9378
Epoch 910/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0996 - acc: 0.9988 - val_loss: 0.3515 - val_acc: 0.9375
Epoch 911/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0990 - acc: 0.9990 - val_loss: 0.3513 - val_acc: 0.9372
Epoch 912/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0994 - acc: 0.9987 - val_loss: 0.3508 - val_acc: 0.9371
Epoch 913/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0997 - acc: 0.9988 - val_loss: 0.3510 - val_acc: 0.9373
Epoch 914/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0996 - acc: 0.9989 - val_loss: 0.3509 - val_acc: 0.9374
Epoch 915/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.1001 - acc: 0.9986 - val_loss: 0.3513 - val_acc: 0.9375
Epoch 916/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0991 - acc: 0.9990 - val_loss: 0.3508 - val_acc: 0.9388
Epoch 917/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0987 - acc: 0.9989 - val_loss: 0.3512 - val_acc: 0.9377
Epoch 918/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0990 - acc: 0.9988 - val_loss: 0.3510 - val_acc: 0.9381
Epoch 919/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0997 - acc: 0.9986 - val_loss: 0.3515 - val_acc: 0.9380
Epoch 920/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0993 - acc: 0.9987 - val_loss: 0.3519 - val_acc: 0.9379
Epoch 921/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0988 - acc: 0.9990 - val_loss: 0.3508 - val_acc: 0.9375
Epoch 922/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0993 - acc: 0.9988 - val_loss: 0.3497 - val_acc: 0.9376
Epoch 923/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0990 - acc: 0.9988 - val_loss: 0.3492 - val_acc: 0.9385
Epoch 924/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0993 - acc: 0.9988 - val_loss: 0.3494 - val_acc: 0.9384
Epoch 925/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0988 - acc: 0.9988 - val_loss: 0.3494 - val_acc: 0.9382
Epoch 926/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0992 - acc: 0.9988 - val_loss: 0.3499 - val_acc: 0.9383
Epoch 927/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0985 - acc: 0.9990 - val_loss: 0.3499 - val_acc: 0.9385
Epoch 928/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0989 - acc: 0.9987 - val_loss: 0.3495 - val_acc: 0.9379
Epoch 929/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0987 - acc: 0.9991 - val_loss: 0.3494 - val_acc: 0.9385
...
Epoch 980/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0980 - acc: 0.9987 - val_loss: 0.3509 - val_acc: 0.9380
Epoch 981/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0976 - acc: 0.9992 - val_loss: 0.3507 - val_acc: 0.9382
Epoch 982/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0977 - acc: 0.9990 - val_loss: 0.3512 - val_acc: 0.9391
Epoch 983/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0978 - acc: 0.9988 - val_loss: 0.3503 - val_acc: 0.9389
Epoch 984/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.0975 - acc: 0.9991 - val_loss: 0.3503 - val_acc: 0.9383
Epoch 985/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0977 - acc: 0.9989 - val_loss: 0.3497 - val_acc: 0.9388
Epoch 986/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0977 - acc: 0.9990 - val_loss: 0.3498 - val_acc: 0.9390
Epoch 987/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0972 - acc: 0.9992 - val_loss: 0.3502 - val_acc: 0.9382
Epoch 988/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0973 - acc: 0.9991 - val_loss: 0.3506 - val_acc: 0.9391
Epoch 989/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0977 - acc: 0.9989 - val_loss: 0.3504 - val_acc: 0.9396
Epoch 990/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0978 - acc: 0.9990 - val_loss: 0.3502 - val_acc: 0.9393
Epoch 991/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0976 - acc: 0.9988 - val_loss: 0.3501 - val_acc: 0.9391
Epoch 992/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0973 - acc: 0.9992 - val_loss: 0.3500 - val_acc: 0.9386
Epoch 993/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0970 - acc: 0.9992 - val_loss: 0.3497 - val_acc: 0.9387
Epoch 994/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0973 - acc: 0.9990 - val_loss: 0.3499 - val_acc: 0.9391
Epoch 995/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0978 - acc: 0.9988 - val_loss: 0.3500 - val_acc: 0.9397
Epoch 996/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0975 - acc: 0.9991 - val_loss: 0.3500 - val_acc: 0.9392
Epoch 997/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0975 - acc: 0.9990 - val_loss: 0.3495 - val_acc: 0.9393
Epoch 998/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.0980 - acc: 0.9988 - val_loss: 0.3498 - val_acc: 0.9386
Epoch 999/1000
500/500 [==============================] - 142s 284ms/step - loss: 0.0976 - acc: 0.9991 - val_loss: 0.3490 - val_acc: 0.9382
Epoch 1000/1000
500/500 [==============================] - 142s 285ms/step - loss: 0.0974 - acc: 0.9990 - val_loss: 0.3497 - val_acc: 0.9385
Train loss: 0.09523774388432503
Train accuracy: 0.9995200004577637
Test loss: 0.34968149244785307
Test accuracy: 0.938500000834465
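The "lr changed to 9.999999310821295e-05" message at epoch 901 comes from the schedule defined at the top of the script, which multiplies the learning rate by 0.1 every 300 epochs. Below is a standalone sketch of that arithmetic, assuming an initial rate of 0.1 (not shown in this excerpt, but consistent with ~1e-4 being reached after three drops); the float32 cast mirrors how Keras stores the rate, which is why the logged value is slightly below 1e-4:

```python
import numpy as np

def scheduler(epoch, lr):
    # Multiply the learning rate by 0.1 every 300 epochs,
    # as in the scheduler function at the top of the script.
    if epoch % 300 == 0 and epoch != 0:
        # Keras keeps the rate in float32, hence the 9.99...e-05 in the log
        lr = float(np.float32(lr * 0.1))
    return lr

lr = 0.1  # assumed initial rate
for epoch in range(1, 1001):
    lr = scheduler(epoch, lr)
# after the drops at epochs 300, 600 and 900, the rate is ~1e-4
```

The three drops at epochs 300, 600 and 900 line up with the loss plateaus visible in the log above.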

Accuracy improved only slightly, and at the cost of roughly doubling the number of residual modules (from 27 to 60); training took much longer, so this many layers seems unnecessary.

The network was already fairly complex, and stacking this many layers also makes it harder to train.
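For reference, the APReLU activation evaluated in this series replaces the fixed negative slope of a leaky/parametric ReLU with a slope predicted from the input itself: global average pools of the positive and negative parts of the feature map feed a small dense subnetwork whose sigmoid output yields one slope per channel. A minimal NumPy sketch of the forward pass (the function name and weight shapes are illustrative, not taken from the Keras script):

```python
import numpy as np

def aprelu_forward(x, w1, b1, w2, b2):
    """Forward pass of one APReLU activation on an (H, W, C) feature map.
    The negative-part slope is predicted from global average pools of the
    positive and negative parts of the input; w1/b1 and w2/b2 stand in for
    the two trained dense layers (shapes (2C, 2C), (2C,), (2C, C), (C,))."""
    pos = np.maximum(x, 0.0)
    neg = np.minimum(x, 0.0)
    # Global average pooling of both parts, concatenated into a (2C,) vector
    stats = np.concatenate([pos.mean(axis=(0, 1)), neg.mean(axis=(0, 1))])
    h = np.maximum(stats @ w1 + b1, 0.0)            # dense + ReLU
    alpha = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))    # dense + sigmoid, one slope per channel
    return pos + alpha * neg
```

In the Keras model this corresponds to the imported Minimum, GlobalAveragePooling2D, Concatenate, Dense and Reshape layers, with the learned slopes multiplying the negative part of each feature map.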

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458

https://ieeexplore.ieee.org/document/8998530

————————————————

Copyright notice: This article is an original work by the CSDN blogger "dangqing1988", released under the CC 4.0 BY-SA license; reproduction must include the original source link and this notice.

Original link: https://blog.csdn.net/dangqing1988/article/details/105728417

This article is a repost. If there is any infringement, please contact cloudcommunity@tencent.com for removal.
