Deep Residual Network + Adaptively Parametric ReLU Activation Function (Tuning Log 6)


Continuing from the previous post:

Deep Residual Network + Adaptively Parametric ReLU Activation Function (Tuning Log 5)

https://blog.csdn.net/dangqing1988/article/details/105627351

This post continues tuning hyperparameters to test the Adaptively Parametric ReLU (APReLU) activation function on the CIFAR-10 image dataset. The basic principle of APReLU is shown in the figure below:

(Figure: the adaptively parametric ReLU (APReLU) activation function)
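Written as a formula (this paraphrases the aprelu implementation in the code further below; it is not taken from the original figure): for an input feature map x, APReLU computes y = max(x, 0) + alpha * min(x, 0), where the per-channel coefficients alpha lie in (0, 1) and are produced by a small fully connected subnetwork with a sigmoid output, fed with global average pooled statistics of the positive and negative parts of x. A minimal NumPy sketch of the elementwise rule, with alpha assumed to be given:

import numpy as np

def aprelu_elementwise(x, alpha):
    # x: feature map of shape (H, W, C)
    # alpha: per-channel coefficients in (0, 1), shape (C,), broadcast over H and W
    return np.maximum(x, 0.0) + alpha * np.minimum(x, 0.0)

With alpha = 0 this reduces to an ordinary ReLU, and with alpha = 1 it becomes the identity, so the learned coefficients interpolate between the two.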

First, the earlier tuning runs showed that the loss drops sharply whenever the learning rate falls from 0.1 to 0.01 and again from 0.01 to 0.001. Those runs stopped once the learning rate reached 0.001, which raises the question: if the learning rate keeps dropping, will the loss keep dropping too?

Second, with the APReLU activation function the deep residual network has a more complex structure and is harder to train, so it may need more epochs.

Therefore, this test restores the run length to 1000 epochs and sets the learning rate for epochs 1-300, 301-600, 601-900, and 901-1000 to 0.1, 0.01, 0.001, and 0.0001, respectively, as the sketch below makes explicit.
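Written as an explicit lookup, the schedule is the following (a sketch for illustration only; the full script below implements the same schedule multiplicatively through a LearningRateScheduler callback, which is why the log later reports "lr changed to 9.999999310821295e-05" rather than exactly 0.0001: float32 rounding accumulates across the repeated multiplications):

def piecewise_lr(epoch):
    # Keras epochs are 0-indexed: 0-299, 300-599, 600-899, 900-999
    if epoch < 300:
        return 0.1
    elif epoch < 600:
        return 0.01
    elif epoch < 900:
        return 0.001
    return 0.0001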

Also, placing an APReLU right before the final global average pooling seems to hinder training, because APReLU uses a sigmoid function internally. The APReLU before global average pooling was therefore replaced with a plain ReLU. (For the APReLUs inside the residual blocks, the identity shortcuts mean that the extra training difficulty they introduce should be tolerable.)

The Keras code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, 2020,  DOI: 10.1109/TIE.2020.2972458 

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Normalize the data: scale to [0, 1], then subtract the training-set mean
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply it by 0.1 every 300 epochs
def scheduler(epoch):
    if epoch % 300 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs, zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,channels))(scales)
    # apply the parametric relu: positive part plus the scaled negative part
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])
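# Usage note (a sketch, not part of the original script): aprelu can replace a
# fixed activation on any 4-D feature map, e.g.
#     x = Conv2D(16, 3, padding='same')(inputs)
#     x = aprelu(x)
# With all scales at 0 it acts as a plain ReLU; with all scales at 1 it passes
# the input through unchanged.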

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero-padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 16, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 8, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = BatchNormalization()(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images by up to 30 degrees
    rotation_range=30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=1000, 
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])

The experimental results are as follows:

Epoch 800/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0725 - acc: 0.9997 - val_loss: 0.3784 - val_acc: 0.9329
Epoch 801/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0723 - acc: 0.9997 - val_loss: 0.3741 - val_acc: 0.9342
Epoch 802/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0723 - acc: 0.9996 - val_loss: 0.3772 - val_acc: 0.9332
Epoch 803/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0721 - acc: 0.9996 - val_loss: 0.3778 - val_acc: 0.9329
Epoch 804/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0722 - acc: 0.9995 - val_loss: 0.3759 - val_acc: 0.9337
Epoch 805/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0719 - acc: 0.9996 - val_loss: 0.3788 - val_acc: 0.9335
Epoch 806/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0719 - acc: 0.9995 - val_loss: 0.3815 - val_acc: 0.9332
Epoch 807/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0716 - acc: 0.9997 - val_loss: 0.3774 - val_acc: 0.9321
Epoch 808/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0714 - acc: 0.9997 - val_loss: 0.3774 - val_acc: 0.9337
Epoch 809/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0714 - acc: 0.9997 - val_loss: 0.3786 - val_acc: 0.9320
Epoch 810/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0713 - acc: 0.9996 - val_loss: 0.3776 - val_acc: 0.9322
Epoch 811/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0712 - acc: 0.9996 - val_loss: 0.3782 - val_acc: 0.9332
Epoch 812/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0709 - acc: 0.9997 - val_loss: 0.3837 - val_acc: 0.9322
Epoch 813/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0705 - acc: 0.9998 - val_loss: 0.3839 - val_acc: 0.9322
Epoch 814/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0707 - acc: 0.9996 - val_loss: 0.3820 - val_acc: 0.9318
Epoch 815/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0705 - acc: 0.9997 - val_loss: 0.3829 - val_acc: 0.9309
Epoch 816/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0703 - acc: 0.9996 - val_loss: 0.3810 - val_acc: 0.9318
Epoch 817/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0700 - acc: 0.9998 - val_loss: 0.3799 - val_acc: 0.9316
Epoch 818/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0701 - acc: 0.9996 - val_loss: 0.3789 - val_acc: 0.9314
Epoch 819/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0698 - acc: 0.9997 - val_loss: 0.3802 - val_acc: 0.9326
Epoch 820/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0699 - acc: 0.9996 - val_loss: 0.3837 - val_acc: 0.9301
Epoch 821/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0697 - acc: 0.9996 - val_loss: 0.3833 - val_acc: 0.9317
Epoch 822/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0698 - acc: 0.9995 - val_loss: 0.3851 - val_acc: 0.9305
Epoch 823/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0694 - acc: 0.9997 - val_loss: 0.3824 - val_acc: 0.9311
Epoch 824/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0693 - acc: 0.9995 - val_loss: 0.3830 - val_acc: 0.9303
Epoch 825/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0690 - acc: 0.9998 - val_loss: 0.3802 - val_acc: 0.9298
Epoch 826/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0689 - acc: 0.9996 - val_loss: 0.3810 - val_acc: 0.9305
Epoch 827/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0689 - acc: 0.9997 - val_loss: 0.3813 - val_acc: 0.9309
Epoch 828/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0688 - acc: 0.9996 - val_loss: 0.3799 - val_acc: 0.9316
Epoch 829/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0687 - acc: 0.9996 - val_loss: 0.3766 - val_acc: 0.9322
Epoch 830/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0688 - acc: 0.9995 - val_loss: 0.3764 - val_acc: 0.9329
Epoch 831/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0684 - acc: 0.9996 - val_loss: 0.3750 - val_acc: 0.9324
Epoch 832/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0685 - acc: 0.9996 - val_loss: 0.3781 - val_acc: 0.9314
Epoch 833/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0682 - acc: 0.9997 - val_loss: 0.3741 - val_acc: 0.9313
Epoch 834/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0682 - acc: 0.9996 - val_loss: 0.3738 - val_acc: 0.9312
Epoch 835/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0680 - acc: 0.9996 - val_loss: 0.3753 - val_acc: 0.9319
Epoch 836/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0679 - acc: 0.9997 - val_loss: 0.3753 - val_acc: 0.9317
Epoch 837/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0677 - acc: 0.9996 - val_loss: 0.3769 - val_acc: 0.9320
Epoch 838/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0674 - acc: 0.9997 - val_loss: 0.3775 - val_acc: 0.9317
Epoch 839/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0678 - acc: 0.9995 - val_loss: 0.3779 - val_acc: 0.9327
Epoch 840/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0673 - acc: 0.9995 - val_loss: 0.3773 - val_acc: 0.9319
Epoch 841/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0672 - acc: 0.9996 - val_loss: 0.3764 - val_acc: 0.9333
Epoch 842/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0670 - acc: 0.9997 - val_loss: 0.3741 - val_acc: 0.9323
Epoch 843/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0667 - acc: 0.9997 - val_loss: 0.3723 - val_acc: 0.9321
Epoch 844/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0667 - acc: 0.9997 - val_loss: 0.3731 - val_acc: 0.9315
Epoch 845/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0668 - acc: 0.9996 - val_loss: 0.3733 - val_acc: 0.9320
Epoch 846/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0666 - acc: 0.9996 - val_loss: 0.3722 - val_acc: 0.9315
Epoch 847/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0664 - acc: 0.9996 - val_loss: 0.3719 - val_acc: 0.9327
Epoch 848/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0662 - acc: 0.9997 - val_loss: 0.3720 - val_acc: 0.9309
Epoch 849/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0660 - acc: 0.9996 - val_loss: 0.3716 - val_acc: 0.9316
Epoch 850/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0660 - acc: 0.9996 - val_loss: 0.3732 - val_acc: 0.9310
Epoch 851/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0661 - acc: 0.9995 - val_loss: 0.3718 - val_acc: 0.9314
Epoch 852/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0657 - acc: 0.9997 - val_loss: 0.3748 - val_acc: 0.9300
Epoch 853/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0654 - acc: 0.9997 - val_loss: 0.3724 - val_acc: 0.9314
Epoch 854/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0658 - acc: 0.9995 - val_loss: 0.3750 - val_acc: 0.9283
Epoch 855/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0652 - acc: 0.9998 - val_loss: 0.3719 - val_acc: 0.9314
Epoch 856/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0652 - acc: 0.9998 - val_loss: 0.3724 - val_acc: 0.9314
Epoch 857/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0652 - acc: 0.9995 - val_loss: 0.3732 - val_acc: 0.9300
Epoch 858/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0648 - acc: 0.9997 - val_loss: 0.3714 - val_acc: 0.9307
Epoch 859/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0654 - acc: 0.9994 - val_loss: 0.3719 - val_acc: 0.9315
Epoch 860/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0645 - acc: 0.9997 - val_loss: 0.3726 - val_acc: 0.9308
Epoch 861/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0648 - acc: 0.9996 - val_loss: 0.3725 - val_acc: 0.9308
Epoch 862/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0645 - acc: 0.9997 - val_loss: 0.3698 - val_acc: 0.9312
Epoch 863/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0642 - acc: 0.9997 - val_loss: 0.3715 - val_acc: 0.9305
Epoch 864/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0643 - acc: 0.9997 - val_loss: 0.3724 - val_acc: 0.9302
Epoch 865/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0639 - acc: 0.9998 - val_loss: 0.3748 - val_acc: 0.9304
Epoch 866/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0641 - acc: 0.9995 - val_loss: 0.3751 - val_acc: 0.9315
Epoch 867/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0638 - acc: 0.9997 - val_loss: 0.3729 - val_acc: 0.9325
Epoch 868/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0637 - acc: 0.9997 - val_loss: 0.3750 - val_acc: 0.9320
Epoch 869/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0634 - acc: 0.9997 - val_loss: 0.3738 - val_acc: 0.9312
Epoch 870/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0634 - acc: 0.9996 - val_loss: 0.3731 - val_acc: 0.9313
Epoch 871/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0632 - acc: 0.9998 - val_loss: 0.3750 - val_acc: 0.9311
Epoch 872/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0633 - acc: 0.9997 - val_loss: 0.3784 - val_acc: 0.9313
Epoch 873/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0635 - acc: 0.9995 - val_loss: 0.3719 - val_acc: 0.9312
Epoch 874/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0631 - acc: 0.9996 - val_loss: 0.3706 - val_acc: 0.9330
Epoch 875/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0626 - acc: 0.9997 - val_loss: 0.3711 - val_acc: 0.9331
Epoch 876/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0628 - acc: 0.9996 - val_loss: 0.3730 - val_acc: 0.9332
Epoch 877/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0626 - acc: 0.9997 - val_loss: 0.3744 - val_acc: 0.9323
Epoch 878/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0623 - acc: 0.9997 - val_loss: 0.3724 - val_acc: 0.9321
Epoch 879/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0624 - acc: 0.9996 - val_loss: 0.3749 - val_acc: 0.9312
Epoch 880/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0621 - acc: 0.9997 - val_loss: 0.3728 - val_acc: 0.9314
Epoch 881/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0620 - acc: 0.9997 - val_loss: 0.3733 - val_acc: 0.9317
Epoch 882/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0623 - acc: 0.9996 - val_loss: 0.3779 - val_acc: 0.9298
Epoch 883/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0621 - acc: 0.9996 - val_loss: 0.3733 - val_acc: 0.9309
Epoch 884/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0617 - acc: 0.9997 - val_loss: 0.3714 - val_acc: 0.9312
Epoch 885/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0615 - acc: 0.9997 - val_loss: 0.3708 - val_acc: 0.9313
Epoch 886/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0615 - acc: 0.9997 - val_loss: 0.3727 - val_acc: 0.9305
Epoch 887/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0616 - acc: 0.9996 - val_loss: 0.3699 - val_acc: 0.9313
Epoch 888/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0611 - acc: 0.9997 - val_loss: 0.3709 - val_acc: 0.9310
Epoch 889/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0611 - acc: 0.9997 - val_loss: 0.3718 - val_acc: 0.9309
Epoch 890/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0611 - acc: 0.9997 - val_loss: 0.3721 - val_acc: 0.9315
Epoch 891/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0606 - acc: 0.9998 - val_loss: 0.3726 - val_acc: 0.9324
Epoch 892/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0606 - acc: 0.9997 - val_loss: 0.3737 - val_acc: 0.9321
Epoch 893/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0607 - acc: 0.9996 - val_loss: 0.3709 - val_acc: 0.9325
Epoch 894/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0602 - acc: 0.9999 - val_loss: 0.3701 - val_acc: 0.9325
Epoch 895/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0604 - acc: 0.9997 - val_loss: 0.3670 - val_acc: 0.9327
Epoch 896/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0606 - acc: 0.9995 - val_loss: 0.3646 - val_acc: 0.9325
Epoch 897/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0603 - acc: 0.9997 - val_loss: 0.3693 - val_acc: 0.9315
Epoch 898/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0602 - acc: 0.9996 - val_loss: 0.3705 - val_acc: 0.9312
Epoch 899/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0599 - acc: 0.9997 - val_loss: 0.3697 - val_acc: 0.9309
Epoch 900/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0600 - acc: 0.9997 - val_loss: 0.3694 - val_acc: 0.9313
Epoch 901/1000
lr changed to 9.999999310821295e-05
500/500 [==============================] - 62s 123ms/step - loss: 0.0597 - acc: 0.9998 - val_loss: 0.3694 - val_acc: 0.9313
Epoch 902/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0595 - acc: 0.9998 - val_loss: 0.3685 - val_acc: 0.9316
Epoch 903/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0597 - acc: 0.9998 - val_loss: 0.3685 - val_acc: 0.9314
Epoch 904/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0599 - acc: 0.9997 - val_loss: 0.3686 - val_acc: 0.9316
Epoch 905/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0598 - acc: 0.9997 - val_loss: 0.3684 - val_acc: 0.9316
Epoch 906/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0596 - acc: 0.9998 - val_loss: 0.3683 - val_acc: 0.9313
Epoch 907/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0596 - acc: 0.9998 - val_loss: 0.3681 - val_acc: 0.9314
Epoch 908/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0594 - acc: 0.9998 - val_loss: 0.3679 - val_acc: 0.9311
Epoch 909/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0597 - acc: 0.9997 - val_loss: 0.3676 - val_acc: 0.9309
Epoch 910/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0596 - acc: 0.9997 - val_loss: 0.3673 - val_acc: 0.9311
Epoch 911/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0597 - acc: 0.9997 - val_loss: 0.3675 - val_acc: 0.9311
Epoch 912/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0596 - acc: 0.9997 - val_loss: 0.3671 - val_acc: 0.9311
Epoch 913/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0595 - acc: 0.9997 - val_loss: 0.3666 - val_acc: 0.9314
Epoch 914/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0594 - acc: 0.9998 - val_loss: 0.3663 - val_acc: 0.9317
Epoch 915/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0598 - acc: 0.9996 - val_loss: 0.3660 - val_acc: 0.9318
Epoch 916/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0596 - acc: 0.9997 - val_loss: 0.3658 - val_acc: 0.9320
Epoch 917/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0595 - acc: 0.9997 - val_loss: 0.3658 - val_acc: 0.9318
Epoch 918/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0596 - acc: 0.9997 - val_loss: 0.3656 - val_acc: 0.9316
Epoch 919/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0595 - acc: 0.9997 - val_loss: 0.3654 - val_acc: 0.9316
Epoch 920/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0596 - acc: 0.9997 - val_loss: 0.3651 - val_acc: 0.9314
Epoch 921/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0593 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9314
Epoch 922/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0594 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9316
Epoch 923/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0593 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9316
Epoch 924/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0594 - acc: 0.9998 - val_loss: 0.3644 - val_acc: 0.9315
Epoch 925/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0592 - acc: 0.9998 - val_loss: 0.3648 - val_acc: 0.9315
Epoch 926/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0593 - acc: 0.9997 - val_loss: 0.3647 - val_acc: 0.9318
Epoch 927/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0594 - acc: 0.9998 - val_loss: 0.3646 - val_acc: 0.9313
Epoch 928/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0594 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9317
Epoch 929/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0592 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9320
Epoch 930/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0591 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9322
Epoch 931/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0592 - acc: 0.9999 - val_loss: 0.3647 - val_acc: 0.9318
Epoch 932/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0592 - acc: 0.9998 - val_loss: 0.3644 - val_acc: 0.9319
Epoch 933/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0594 - acc: 0.9997 - val_loss: 0.3642 - val_acc: 0.9319
Epoch 934/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0594 - acc: 0.9998 - val_loss: 0.3646 - val_acc: 0.9318
Epoch 935/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0594 - acc: 0.9997 - val_loss: 0.3641 - val_acc: 0.9318
Epoch 936/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0592 - acc: 0.9998 - val_loss: 0.3639 - val_acc: 0.9313
Epoch 937/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0592 - acc: 0.9998 - val_loss: 0.3637 - val_acc: 0.9321
Epoch 938/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0591 - acc: 0.9998 - val_loss: 0.3638 - val_acc: 0.9320
Epoch 939/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0590 - acc: 0.9999 - val_loss: 0.3638 - val_acc: 0.9320
Epoch 940/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0591 - acc: 0.9998 - val_loss: 0.3633 - val_acc: 0.9324
Epoch 941/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0591 - acc: 0.9998 - val_loss: 0.3635 - val_acc: 0.9325
Epoch 942/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0592 - acc: 0.9997 - val_loss: 0.3632 - val_acc: 0.9324
Epoch 943/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0591 - acc: 0.9997 - val_loss: 0.3637 - val_acc: 0.9330
Epoch 944/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0590 - acc: 0.9998 - val_loss: 0.3638 - val_acc: 0.9327
Epoch 945/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0591 - acc: 0.9997 - val_loss: 0.3641 - val_acc: 0.9329
Epoch 946/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0590 - acc: 0.9998 - val_loss: 0.3641 - val_acc: 0.9328
Epoch 947/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0590 - acc: 0.9997 - val_loss: 0.3645 - val_acc: 0.9328
Epoch 948/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0591 - acc: 0.9997 - val_loss: 0.3643 - val_acc: 0.9329
Epoch 949/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0591 - acc: 0.9997 - val_loss: 0.3643 - val_acc: 0.9328
Epoch 950/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0591 - acc: 0.9998 - val_loss: 0.3647 - val_acc: 0.9329
Epoch 951/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0589 - acc: 0.9998 - val_loss: 0.3646 - val_acc: 0.9330
Epoch 952/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9326
Epoch 953/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0591 - acc: 0.9997 - val_loss: 0.3645 - val_acc: 0.9326
Epoch 954/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0589 - acc: 0.9998 - val_loss: 0.3648 - val_acc: 0.9329
Epoch 955/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0587 - acc: 0.9999 - val_loss: 0.3648 - val_acc: 0.9327
Epoch 956/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3651 - val_acc: 0.9325
Epoch 957/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0589 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9324
Epoch 958/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9320
Epoch 959/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9322
Epoch 960/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0590 - acc: 0.9998 - val_loss: 0.3651 - val_acc: 0.9320
Epoch 961/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3652 - val_acc: 0.9325
Epoch 962/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0590 - acc: 0.9998 - val_loss: 0.3646 - val_acc: 0.9322
Epoch 963/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0590 - acc: 0.9997 - val_loss: 0.3649 - val_acc: 0.9324
Epoch 964/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0590 - acc: 0.9998 - val_loss: 0.3651 - val_acc: 0.9322
Epoch 965/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9324
Epoch 966/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0589 - acc: 0.9997 - val_loss: 0.3649 - val_acc: 0.9322
Epoch 967/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0590 - acc: 0.9996 - val_loss: 0.3649 - val_acc: 0.9328
Epoch 968/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0588 - acc: 0.9997 - val_loss: 0.3648 - val_acc: 0.9328
Epoch 969/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0589 - acc: 0.9998 - val_loss: 0.3647 - val_acc: 0.9325
Epoch 970/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0587 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9325
Epoch 971/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0587 - acc: 0.9997 - val_loss: 0.3646 - val_acc: 0.9328
Epoch 972/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9331
Epoch 973/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9331
Epoch 974/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0587 - acc: 0.9998 - val_loss: 0.3646 - val_acc: 0.9326
Epoch 975/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9322
Epoch 976/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0585 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9326
Epoch 977/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0591 - acc: 0.9996 - val_loss: 0.3646 - val_acc: 0.9323
Epoch 978/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0587 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9326
Epoch 979/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0588 - acc: 0.9997 - val_loss: 0.3646 - val_acc: 0.9320
Epoch 980/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0588 - acc: 0.9997 - val_loss: 0.3648 - val_acc: 0.9319
Epoch 981/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0587 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9317
Epoch 982/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0584 - acc: 0.9999 - val_loss: 0.3650 - val_acc: 0.9317
Epoch 983/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9323
Epoch 984/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3651 - val_acc: 0.9322
Epoch 985/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0587 - acc: 0.9997 - val_loss: 0.3651 - val_acc: 0.9321
Epoch 986/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0588 - acc: 0.9998 - val_loss: 0.3648 - val_acc: 0.9323
Epoch 987/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0586 - acc: 0.9997 - val_loss: 0.3644 - val_acc: 0.9318
Epoch 988/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0586 - acc: 0.9997 - val_loss: 0.3648 - val_acc: 0.9322
Epoch 989/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0585 - acc: 0.9998 - val_loss: 0.3650 - val_acc: 0.9322
Epoch 990/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0586 - acc: 0.9998 - val_loss: 0.3646 - val_acc: 0.9319
Epoch 991/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0584 - acc: 0.9998 - val_loss: 0.3647 - val_acc: 0.9323
Epoch 992/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0585 - acc: 0.9997 - val_loss: 0.3647 - val_acc: 0.9320
Epoch 993/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0587 - acc: 0.9997 - val_loss: 0.3646 - val_acc: 0.9318
Epoch 994/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0584 - acc: 0.9999 - val_loss: 0.3650 - val_acc: 0.9320
Epoch 995/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0586 - acc: 0.9997 - val_loss: 0.3650 - val_acc: 0.9315
Epoch 996/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0585 - acc: 0.9998 - val_loss: 0.3649 - val_acc: 0.9319
Epoch 997/1000
500/500 [==============================] - 62s 123ms/step - loss: 0.0585 - acc: 0.9998 - val_loss: 0.3645 - val_acc: 0.9318
Epoch 998/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0584 - acc: 0.9998 - val_loss: 0.3648 - val_acc: 0.9320
Epoch 999/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0584 - acc: 0.9999 - val_loss: 0.3646 - val_acc: 0.9320
Epoch 1000/1000
500/500 [==============================] - 62s 124ms/step - loss: 0.0581 - acc: 0.9999 - val_loss: 0.3646 - val_acc: 0.9323
Train loss: 0.062079589650034905
Train accuracy: 0.9986200013160705
Test loss: 0.3645842906832695
Test accuracy: 0.9323000019788742

The results for part of the run in the middle (epochs 701 through 754) were not recorded.

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458

https://ieeexplore.ieee.org/document/8998530

————————————————

Copyright notice: this article is an original post by CSDN blogger "dangqing1988" and is licensed under CC 4.0 BY-SA. Please include the original source link and this notice when reposting.

Original link: https://blog.csdn.net/dangqing1988/article/details/105628681
