[Harbin Institute of Technology Team] Dynamic ReLU: Adaptively Parametric ReLU with Keras Code (Tuning Record 13)

This article introduces a dynamic ReLU activation function proposed by a team at Harbin Institute of Technology: the Adaptively Parametric ReLU (APReLU). Originally applied to fault diagnosis based on one-dimensional vibration signals, it allows each sample to have its own unique ReLU parameters. The paper was submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020.

Judging from the previous tuning results, overfitting has been the most serious problem. Building on Tuning Record 12, this article reduces the depth to nine residual blocks (the 3 + 1 + 2 + 1 + 2 stacking in the code below) and tries again.

The principle of the Adaptively Parametric ReLU activation function is as follows:

[Figure: the Adaptively Parametric ReLU (APReLU), a dynamic ReLU activation function]
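
In brief, the APReLU keeps the positive part of its input unchanged and multiplies the negative part by a coefficient that is computed from the input itself, so that every sample gets its own negative-half slope. A compact summary of the structure as implemented in the code below (the notation here is ours, summarizing the code rather than quoting the paper):

y = \max(x, 0) + \alpha \cdot \min(x, 0), \qquad \alpha \in (0, 1)^{C}

where the per-channel coefficient vector \alpha is obtained by global average pooling the positive and negative parts of the feature map, concatenating the two pooled vectors, and passing the result through two fully connected layers (each followed by batch normalization), with a final sigmoid squashing \alpha into (0, 1).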

The Keras code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Normalize: scale to [0, 1] and subtract the training-set mean
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test - np.mean(x_train)
x_train = x_train - np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply the lr by 0.1 every 1500 epochs
def scheduler(epoch):
    if epoch % 1500 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs, zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,channels))(scales)
    # apply the parametric ReLU: keep the positive part, scale the negative part
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero-padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 3, 16, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 2, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 2, 64, downsample=False)
net = BatchNormalization(momentum=0.9, gamma_regularizer=l2(1e-4))(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range (degrees, 0 to 30)
    rotation_range=30,
    # Range for random zoom
    zoom_range = 0.2,
    # shear angle in counter-clockwise direction in degrees
    shear_range = 30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=625),
                    validation_data=(x_test, y_test), epochs=5000, 
                    verbose=1, callbacks=[reduce_lr], workers=10)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=625, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=625, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])
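
As a quick sanity check of the aprelu block defined above (a minimal sketch under the same Keras 2.2 / TensorFlow 1.x assumptions; the 16-channel test input is made up for illustration), one can wrap it in a tiny model and confirm that the output shape matches the input shape:

# Minimal shape check for the APReLU block (illustrative only)
test_in = Input(shape=(32, 32, 16))
test_out = aprelu(test_in)
test_model = Model(inputs=test_in, outputs=test_out)
dummy = np.random.rand(4, 32, 32, 16).astype('float32')
print(test_model.predict(dummy).shape)  # expected: (4, 32, 32, 16)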

The experimental results are as follows:

Epoch 2500/5000
12s 151ms/step - loss: 0.1258 - acc: 0.9867 - val_loss: 0.4697 - val_acc: 0.9024
Epoch 2501/5000
12s 151ms/step - loss: 0.1274 - acc: 0.9852 - val_loss: 0.4688 - val_acc: 0.9026
Epoch 2502/5000
12s 151ms/step - loss: 0.1260 - acc: 0.9861 - val_loss: 0.4585 - val_acc: 0.9040
Epoch 2503/5000
12s 152ms/step - loss: 0.1241 - acc: 0.9869 - val_loss: 0.4489 - val_acc: 0.9066
Epoch 2504/5000
12s 152ms/step - loss: 0.1236 - acc: 0.9869 - val_loss: 0.4469 - val_acc: 0.9106
Epoch 2505/5000
12s 151ms/step - loss: 0.1276 - acc: 0.9850 - val_loss: 0.4515 - val_acc: 0.9034
Epoch 2506/5000
12s 151ms/step - loss: 0.1252 - acc: 0.9864 - val_loss: 0.4586 - val_acc: 0.9074
Epoch 2507/5000
12s 151ms/step - loss: 0.1289 - acc: 0.9852 - val_loss: 0.4585 - val_acc: 0.9057
Epoch 2508/5000
12s 151ms/step - loss: 0.1285 - acc: 0.9853 - val_loss: 0.4485 - val_acc: 0.9077
Epoch 2509/5000
12s 151ms/step - loss: 0.1284 - acc: 0.9851 - val_loss: 0.4529 - val_acc: 0.9032
Epoch 2510/5000
12s 151ms/step - loss: 0.1287 - acc: 0.9855 - val_loss: 0.4567 - val_acc: 0.9040
Epoch 2511/5000
12s 151ms/step - loss: 0.1253 - acc: 0.9862 - val_loss: 0.4554 - val_acc: 0.9080
Epoch 2512/5000
12s 152ms/step - loss: 0.1262 - acc: 0.9859 - val_loss: 0.4477 - val_acc: 0.9086
Epoch 2513/5000
12s 151ms/step - loss: 0.1241 - acc: 0.9864 - val_loss: 0.4531 - val_acc: 0.9063
Epoch 2514/5000
12s 150ms/step - loss: 0.1247 - acc: 0.9866 - val_loss: 0.4484 - val_acc: 0.9073
Epoch 2515/5000
12s 151ms/step - loss: 0.1239 - acc: 0.9869 - val_loss: 0.4502 - val_acc: 0.9078
Epoch 2516/5000
12s 151ms/step - loss: 0.1275 - acc: 0.9857 - val_loss: 0.4790 - val_acc: 0.8981
Epoch 2517/5000
12s 152ms/step - loss: 0.1259 - acc: 0.9862 - val_loss: 0.4625 - val_acc: 0.9063
Epoch 2518/5000
12s 151ms/step - loss: 0.1278 - acc: 0.9853 - val_loss: 0.4751 - val_acc: 0.9009
Epoch 2519/5000
12s 151ms/step - loss: 0.1283 - acc: 0.9857 - val_loss: 0.4655 - val_acc: 0.9056
Epoch 2520/5000
12s 151ms/step - loss: 0.1275 - acc: 0.9859 - val_loss: 0.4386 - val_acc: 0.9085
Epoch 2521/5000
12s 151ms/step - loss: 0.1245 - acc: 0.9871 - val_loss: 0.4699 - val_acc: 0.9006
Epoch 2522/5000
12s 151ms/step - loss: 0.1278 - acc: 0.9860 - val_loss: 0.4520 - val_acc: 0.9050
Epoch 2523/5000
12s 151ms/step - loss: 0.1249 - acc: 0.9864 - val_loss: 0.4566 - val_acc: 0.9056
Epoch 2524/5000
12s 152ms/step - loss: 0.1278 - acc: 0.9855 - val_loss: 0.4650 - val_acc: 0.9018
Epoch 2525/5000
12s 151ms/step - loss: 0.1235 - acc: 0.9873 - val_loss: 0.4555 - val_acc: 0.9061
Epoch 2526/5000
12s 151ms/step - loss: 0.1260 - acc: 0.9862 - val_loss: 0.4556 - val_acc: 0.9061
Epoch 2527/5000
12s 152ms/step - loss: 0.1261 - acc: 0.9866 - val_loss: 0.4667 - val_acc: 0.9040
Epoch 2528/5000
12s 152ms/step - loss: 0.1240 - acc: 0.9874 - val_loss: 0.4539 - val_acc: 0.9083
Epoch 2529/5000
12s 152ms/step - loss: 0.1281 - acc: 0.9856 - val_loss: 0.4584 - val_acc: 0.9048
Epoch 2530/5000
12s 151ms/step - loss: 0.1234 - acc: 0.9871 - val_loss: 0.4538 - val_acc: 0.9048
Epoch 2531/5000
12s 151ms/step - loss: 0.1235 - acc: 0.9868 - val_loss: 0.4504 - val_acc: 0.9056
Epoch 2532/5000
12s 151ms/step - loss: 0.1247 - acc: 0.9871 - val_loss: 0.4529 - val_acc: 0.9053
Epoch 2533/5000
12s 150ms/step - loss: 0.1241 - acc: 0.9872 - val_loss: 0.4591 - val_acc: 0.9034
Epoch 2534/5000
12s 152ms/step - loss: 0.1255 - acc: 0.9865 - val_loss: 0.4502 - val_acc: 0.9058
Epoch 2535/5000
12s 151ms/step - loss: 0.1254 - acc: 0.9865 - val_loss: 0.4596 - val_acc: 0.9039
Epoch 2536/5000
12s 152ms/step - loss: 0.1239 - acc: 0.9872 - val_loss: 0.4488 - val_acc: 0.9040
Epoch 2537/5000
12s 151ms/step - loss: 0.1260 - acc: 0.9865 - val_loss: 0.4494 - val_acc: 0.9042
Epoch 2538/5000
12s 150ms/step - loss: 0.1288 - acc: 0.9851 - val_loss: 0.4621 - val_acc: 0.9039
Epoch 2539/5000
12s 152ms/step - loss: 0.1267 - acc: 0.9855 - val_loss: 0.4497 - val_acc: 0.9068
Epoch 2540/5000
12s 151ms/step - loss: 0.1250 - acc: 0.9869 - val_loss: 0.4626 - val_acc: 0.9024
Epoch 2541/5000
12s 152ms/step - loss: 0.1272 - acc: 0.9856 - val_loss: 0.4621 - val_acc: 0.9038
Epoch 2542/5000
12s 151ms/step - loss: 0.1258 - acc: 0.9862 - val_loss: 0.4738 - val_acc: 0.9044
Epoch 2543/5000
12s 152ms/step - loss: 0.1257 - acc: 0.9862 - val_loss: 0.4597 - val_acc: 0.9061
Epoch 2544/5000
12s 151ms/step - loss: 0.1271 - acc: 0.9854 - val_loss: 0.4571 - val_acc: 0.9008
Epoch 2545/5000
12s 151ms/step - loss: 0.1247 - acc: 0.9861 - val_loss: 0.4450 - val_acc: 0.9065
Epoch 2546/5000
12s 152ms/step - loss: 0.1273 - acc: 0.9860 - val_loss: 0.4568 - val_acc: 0.9031
Epoch 2547/5000
12s 151ms/step - loss: 0.1291 - acc: 0.9855 - val_loss: 0.4558 - val_acc: 0.9034
Epoch 2548/5000
12s 152ms/step - loss: 0.1280 - acc: 0.9849 - val_loss: 0.4463 - val_acc: 0.9077
Epoch 2549/5000
12s 151ms/step - loss: 0.1237 - acc: 0.9868 - val_loss: 0.4427 - val_acc: 0.9083
Epoch 2550/5000
12s 151ms/step - loss: 0.1247 - acc: 0.9865 - val_loss: 0.4486 - val_acc: 0.9060
Epoch 2551/5000
12s 152ms/step - loss: 0.1265 - acc: 0.9864 - val_loss: 0.4414 - val_acc: 0.9047
Epoch 2552/5000
12s 151ms/step - loss: 0.1275 - acc: 0.9859 - val_loss: 0.4652 - val_acc: 0.9003
Epoch 2553/5000
12s 151ms/step - loss: 0.1241 - acc: 0.9864 - val_loss: 0.4713 - val_acc: 0.8976
Epoch 2554/5000
12s 152ms/step - loss: 0.1258 - acc: 0.9862 - val_loss: 0.4549 - val_acc: 0.9048
Epoch 2555/5000
12s 151ms/step - loss: 0.1249 - acc: 0.9866 - val_loss: 0.4376 - val_acc: 0.9069
Epoch 2556/5000
12s 152ms/step - loss: 0.1251 - acc: 0.9866 - val_loss: 0.4519 - val_acc: 0.9062
Epoch 2557/5000
12s 151ms/step - loss: 0.1269 - acc: 0.9857 - val_loss: 0.4479 - val_acc: 0.9069
Epoch 2558/5000
12s 151ms/step - loss: 0.1240 - acc: 0.9870 - val_loss: 0.4629 - val_acc: 0.9023
Epoch 2559/5000
12s 151ms/step - loss: 0.1257 - acc: 0.9866 - val_loss: 0.4487 - val_acc: 0.9039
Epoch 2560/5000
12s 151ms/step - loss: 0.1272 - acc: 0.9859 - val_loss: 0.4574 - val_acc: 0.9029
Epoch 2561/5000
12s 152ms/step - loss: 0.1238 - acc: 0.9872 - val_loss: 0.4530 - val_acc: 0.9073
Epoch 2562/5000
12s 152ms/step - loss: 0.1226 - acc: 0.9872 - val_loss: 0.4589 - val_acc: 0.9048
Epoch 2563/5000
12s 151ms/step - loss: 0.1283 - acc: 0.9854 - val_loss: 0.4525 - val_acc: 0.9032
Epoch 2564/5000
12s 151ms/step - loss: 0.1286 - acc: 0.9851 - val_loss: 0.4488 - val_acc: 0.9063
Epoch 2565/5000
12s 150ms/step - loss: 0.1263 - acc: 0.9862 - val_loss: 0.4520 - val_acc: 0.9044
Epoch 2566/5000
12s 152ms/step - loss: 0.1280 - acc: 0.9854 - val_loss: 0.4561 - val_acc: 0.9025
Epoch 2567/5000
12s 151ms/step - loss: 0.1259 - acc: 0.9860 - val_loss: 0.4532 - val_acc: 0.9034
Epoch 2568/5000
12s 156ms/step - loss: 0.1249 - acc: 0.9864 - val_loss: 0.4449 - val_acc: 0.9072
Epoch 2569/5000
12s 152ms/step - loss: 0.1269 - acc: 0.9857 - val_loss: 0.4465 - val_acc: 0.9056
Epoch 2570/5000
12s 153ms/step - loss: 0.1282 - acc: 0.9853 - val_loss: 0.4445 - val_acc: 0.9074
Epoch 2571/5000
12s 153ms/step - loss: 0.1268 - acc: 0.9857 - val_loss: 0.4496 - val_acc: 0.9028
Epoch 2572/5000
12s 152ms/step - loss: 0.1255 - acc: 0.9860 - val_loss: 0.4600 - val_acc: 0.9038
Epoch 2573/5000
12s 153ms/step - loss: 0.1206 - acc: 0.9884 - val_loss: 0.4555 - val_acc: 0.9057
Epoch 2574/5000
12s 152ms/step - loss: 0.1242 - acc: 0.9867 - val_loss: 0.4483 - val_acc: 0.9071
Epoch 2575/5000
12s 153ms/step - loss: 0.1225 - acc: 0.9871 - val_loss: 0.4497 - val_acc: 0.9054
Epoch 2576/5000
12s 152ms/step - loss: 0.1233 - acc: 0.9876 - val_loss: 0.4645 - val_acc: 0.9039
Epoch 2577/5000
12s 153ms/step - loss: 0.1247 - acc: 0.9865 - val_loss: 0.4584 - val_acc: 0.9036
Epoch 2578/5000
12s 153ms/step - loss: 0.1248 - acc: 0.9864 - val_loss: 0.4666 - val_acc: 0.9045
Epoch 2579/5000
12s 153ms/step - loss: 0.1245 - acc: 0.9868 - val_loss: 0.4668 - val_acc: 0.9063
Epoch 2580/5000
12s 153ms/step - loss: 0.1253 - acc: 0.9862 - val_loss: 0.4609 - val_acc: 0.9023
Epoch 2581/5000
12s 153ms/step - loss: 0.1242 - acc: 0.9869 - val_loss: 0.4450 - val_acc: 0.9058
Epoch 2582/5000
12s 152ms/step - loss: 0.1253 - acc: 0.9863 - val_loss: 0.4391 - val_acc: 0.9068
Epoch 2583/5000
12s 153ms/step - loss: 0.1266 - acc: 0.9861 - val_loss: 0.4420 - val_acc: 0.9066
Epoch 2584/5000
12s 153ms/step - loss: 0.1255 - acc: 0.9865 - val_loss: 0.4480 - val_acc: 0.9056
Epoch 2585/5000
12s 152ms/step - loss: 0.1281 - acc: 0.9851 - val_loss: 0.4449 - val_acc: 0.9052
Epoch 2586/5000
12s 152ms/step - loss: 0.1247 - acc: 0.9868 - val_loss: 0.4536 - val_acc: 0.9050
Epoch 2587/5000
12s 152ms/step - loss: 0.1273 - acc: 0.9857 - val_loss: 0.4712 - val_acc: 0.9007
Epoch 2588/5000
12s 153ms/step - loss: 0.1292 - acc: 0.9852 - val_loss: 0.4495 - val_acc: 0.9059
Epoch 2589/5000
12s 153ms/step - loss: 0.1253 - acc: 0.9866 - val_loss: 0.4626 - val_acc: 0.9051
Epoch 2590/5000
12s 153ms/step - loss: 0.1248 - acc: 0.9867 - val_loss: 0.4609 - val_acc: 0.9021
Epoch 2591/5000
12s 152ms/step - loss: 0.1273 - acc: 0.9855 - val_loss: 0.4594 - val_acc: 0.9039
Epoch 2592/5000
12s 152ms/step - loss: 0.1257 - acc: 0.9857 - val_loss: 0.4519 - val_acc: 0.9023
Epoch 2593/5000
12s 152ms/step - loss: 0.1317 - acc: 0.9845 - val_loss: 0.4526 - val_acc: 0.9063
Epoch 2594/5000
12s 153ms/step - loss: 0.1255 - acc: 0.9864 - val_loss: 0.4529 - val_acc: 0.9066
Epoch 2595/5000
12s 153ms/step - loss: 0.1244 - acc: 0.9863 - val_loss: 0.4540 - val_acc: 0.9076
Epoch 2596/5000
12s 153ms/step - loss: 0.1268 - acc: 0.9859 - val_loss: 0.4632 - val_acc: 0.9022
Epoch 2597/5000
12s 153ms/step - loss: 0.1250 - acc: 0.9864 - val_loss: 0.4440 - val_acc: 0.9057
Epoch 2598/5000
12s 153ms/step - loss: 0.1246 - acc: 0.9870 - val_loss: 0.4489 - val_acc: 0.9035
Epoch 2599/5000
12s 153ms/step - loss: 0.1252 - acc: 0.9857 - val_loss: 0.4671 - val_acc: 0.9035
Epoch 2600/5000
12s 153ms/step - loss: 0.1253 - acc: 0.9866 - val_loss: 0.4532 - val_acc: 0.9077
Epoch 2601/5000
12s 153ms/step - loss: 0.1228 - acc: 0.9870 - val_loss: 0.4503 - val_acc: 0.9026
Epoch 2602/5000
12s 153ms/step - loss: 0.1225 - acc: 0.9873 - val_loss: 0.4490 - val_acc: 0.9027
Epoch 2603/5000
12s 152ms/step - loss: 0.1238 - acc: 0.9871 - val_loss: 0.4430 - val_acc: 0.9066
Epoch 2604/5000
12s 152ms/step - loss: 0.1279 - acc: 0.9856 - val_loss: 0.4576 - val_acc: 0.9054
Epoch 2605/5000
12s 152ms/step - loss: 0.1253 - acc: 0.9864 - val_loss: 0.4425 - val_acc: 0.9069
Epoch 2606/5000
12s 152ms/step - loss: 0.1269 - acc: 0.9859 - val_loss: 0.4542 - val_acc: 0.9024
Epoch 2607/5000
12s 152ms/step - loss: 0.1281 - acc: 0.9852 - val_loss: 0.4673 - val_acc: 0.9023
Epoch 2608/5000
12s 152ms/step - loss: 0.1269 - acc: 0.9864 - val_loss: 0.4638 - val_acc: 0.9025
Epoch 2609/5000
12s 152ms/step - loss: 0.1261 - acc: 0.9861 - val_loss: 0.4499 - val_acc: 0.9059
Epoch 2610/5000
12s 152ms/step - loss: 0.1240 - acc: 0.9871 - val_loss: 0.4502 - val_acc: 0.9070
Epoch 2611/5000
12s 151ms/step - loss: 0.1236 - acc: 0.9874 - val_loss: 0.4592 - val_acc: 0.9018
Epoch 2612/5000
12s 151ms/step - loss: 0.1233 - acc: 0.9874 - val_loss: 0.4603 - val_acc: 0.9032
Epoch 2613/5000
12s 151ms/step - loss: 0.1265 - acc: 0.9853 - val_loss: 0.4574 - val_acc: 0.9056
Epoch 2614/5000
12s 152ms/step - loss: 0.1229 - acc: 0.9871 - val_loss: 0.4514 - val_acc: 0.9052
Epoch 2615/5000
12s 152ms/step - loss: 0.1233 - acc: 0.9869 - val_loss: 0.4699 - val_acc: 0.9013
Epoch 2616/5000
12s 151ms/step - loss: 0.1248 - acc: 0.9863 - val_loss: 0.4715 - val_acc: 0.8995
Epoch 2617/5000
12s 151ms/step - loss: 0.1284 - acc: 0.9853 - val_loss: 0.4647 - val_acc: 0.9043
Epoch 2618/5000
12s 151ms/step - loss: 0.1267 - acc: 0.9857 - val_loss: 0.4656 - val_acc: 0.9005
Epoch 2619/5000
12s 152ms/step - loss: 0.1232 - acc: 0.9874 - val_loss: 0.4657 - val_acc: 0.9035
Epoch 2620/5000
12s 152ms/step - loss: 0.1274 - acc: 0.9859 - val_loss: 0.4522 - val_acc: 0.9051
Epoch 2621/5000
12s 151ms/step - loss: 0.1275 - acc: 0.9859 - val_loss: 0.4528 - val_acc: 0.9034
Epoch 2622/5000
12s 152ms/step - loss: 0.1258 - acc: 0.9863 - val_loss: 0.4600 - val_acc: 0.9036
Epoch 2623/5000
12s 152ms/step - loss: 0.1245 - acc: 0.9865 - val_loss: 0.4626 - val_acc: 0.9047
Epoch 2624/5000
12s 152ms/step - loss: 0.1241 - acc: 0.9866 - val_loss: 0.4644 - val_acc: 0.9043
Epoch 2625/5000
12s 152ms/step - loss: 0.1245 - acc: 0.9871 - val_loss: 0.4762 - val_acc: 0.9035
Epoch 2626/5000
12s 152ms/step - loss: 0.1263 - acc: 0.9859 - val_loss: 0.4579 - val_acc: 0.9033
Epoch 2627/5000
12s 151ms/step - loss: 0.1253 - acc: 0.9867 - val_loss: 0.4616 - val_acc: 0.9022
Epoch 2628/5000
12s 151ms/step - loss: 0.1268 - acc: 0.9858 - val_loss: 0.4721 - val_acc: 0.9026
Epoch 2629/5000
12s 151ms/step - loss: 0.1270 - acc: 0.9854 - val_loss: 0.4528 - val_acc: 0.9048
Epoch 2630/5000
12s 151ms/step - loss: 0.1258 - acc: 0.9863 - val_loss: 0.4496 - val_acc: 0.9056
Epoch 2631/5000
12s 152ms/step - loss: 0.1241 - acc: 0.9868 - val_loss: 0.4469 - val_acc: 0.9058
Epoch 2632/5000
12s 151ms/step - loss: 0.1261 - acc: 0.9865 - val_loss: 0.4923 - val_acc: 0.8972
Epoch 2633/5000
12s 152ms/step - loss: 0.1255 - acc: 0.9860 - val_loss: 0.4662 - val_acc: 0.9011
Epoch 2634/5000
12s 151ms/step - loss: 0.1230 - acc: 0.9873 - val_loss: 0.4461 - val_acc: 0.9055
Epoch 2635/5000
12s 151ms/step - loss: 0.1206 - acc: 0.9877 - val_loss: 0.4495 - val_acc: 0.9055
Epoch 2636/5000
12s 152ms/step - loss: 0.1234 - acc: 0.9874 - val_loss: 0.4671 - val_acc: 0.9053
Epoch 2637/5000
12s 152ms/step - loss: 0.1233 - acc: 0.9872 - val_loss: 0.4637 - val_acc: 0.9032
Epoch 2638/5000
12s 151ms/step - loss: 0.1221 - acc: 0.9874 - val_loss: 0.4634 - val_acc: 0.9042
Epoch 2639/5000
12s 151ms/step - loss: 0.1209 - acc: 0.9877 - val_loss: 0.4655 - val_acc: 0.9023
Epoch 2640/5000
12s 152ms/step - loss: 0.1258 - acc: 0.9864 - val_loss: 0.4556 - val_acc: 0.9065
Epoch 2641/5000
12s 152ms/step - loss: 0.1247 - acc: 0.9867 - val_loss: 0.4576 - val_acc: 0.9018
Epoch 2642/5000
12s 152ms/step - loss: 0.1274 - acc: 0.9855 - val_loss: 0.4584 - val_acc: 0.9051
Epoch 2643/5000
12s 152ms/step - loss: 0.1282 - acc: 0.9856 - val_loss: 0.4528 - val_acc: 0.9066
Epoch 2644/5000
12s 151ms/step - loss: 0.1270 - acc: 0.9858 - val_loss: 0.4617 - val_acc: 0.9015
Epoch 2645/5000
12s 152ms/step - loss: 0.1279 - acc: 0.9853 - val_loss: 0.4448 - val_acc: 0.9063
Epoch 2646/5000
12s 151ms/step - loss: 0.1256 - acc: 0.9865 - val_loss: 0.4449 - val_acc: 0.9055
Epoch 2647/5000
12s 152ms/step - loss: 0.1259 - acc: 0.9864 - val_loss: 0.4429 - val_acc: 0.9052
Epoch 2648/5000
12s 152ms/step - loss: 0.1244 - acc: 0.9869 - val_loss: 0.4474 - val_acc: 0.9038
Epoch 2649/5000
12s 151ms/step - loss: 0.1236 - acc: 0.9874 - val_loss: 0.4459 - val_acc: 0.9072
Epoch 2650/5000
12s 151ms/step - loss: 0.1246 - acc: 0.9872 - val_loss: 0.4469 - val_acc: 0.9039
Epoch 2651/5000
12s 151ms/step - loss: 0.1254 - acc: 0.9868 - val_loss: 0.4540 - val_acc: 0.9056
Epoch 2652/5000
12s 151ms/step - loss: 0.1261 - acc: 0.9866 - val_loss: 0.4616 - val_acc: 0.9003
Epoch 2653/5000
12s 151ms/step - loss: 0.1254 - acc: 0.9860 - val_loss: 0.4525 - val_acc: 0.9029
Epoch 2654/5000
12s 151ms/step - loss: 0.1226 - acc: 0.9874 - val_loss: 0.4589 - val_acc: 0.9032
Epoch 2655/5000
12s 151ms/step - loss: 0.1244 - acc: 0.9868 - val_loss: 0.4548 - val_acc: 0.9027
Epoch 2656/5000
12s 151ms/step - loss: 0.1252 - acc: 0.9871 - val_loss: 0.4438 - val_acc: 0.9057
Epoch 2657/5000
12s 151ms/step - loss: 0.1228 - acc: 0.9869 - val_loss: 0.4554 - val_acc: 0.9045
Epoch 2658/5000
12s 152ms/step - loss: 0.1280 - acc: 0.9857 - val_loss: 0.4481 - val_acc: 0.9066
Epoch 2659/5000
12s 152ms/step - loss: 0.1251 - acc: 0.9861 - val_loss: 0.4492 - val_acc: 0.9075
Epoch 2660/5000
12s 151ms/step - loss: 0.1222 - acc: 0.9873 - val_loss: 0.4501 - val_acc: 0.9045
...
Epoch 2985/5000
12s 152ms/step - loss: 0.1257 - acc: 0.9858 - val_loss: 0.4490 - val_acc: 0.9038
Epoch 2986/5000
12s 151ms/step - loss: 0.1252 - acc: 0.9863 - val_loss: 0.4498 - val_acc: 0.9043
Epoch 2987/5000
12s 151ms/step - loss: 0.1248 - acc: 0.9864 - val_loss: 0.4459 - val_acc: 0.9058
Epoch 2988/5000
12s 152ms/step - loss: 0.1226 - acc: 0.9868 - val_loss: 0.4559 - val_acc: 0.9032
Epoch 2989/5000
12s 151ms/step - loss: 0.1237 - acc: 0.9868 - val_loss: 0.4560 - val_acc: 0.9051
Epoch 2990/5000
12s 151ms/step - loss: 0.1242 - acc: 0.9870 - val_loss: 0.4648 - val_acc: 0.9033
Epoch 2991/5000
12s 152ms/step - loss: 0.1245 - acc: 0.9863 - val_loss: 0.4604 - val_acc: 0.9018
Epoch 2992/5000
12s 151ms/step - loss: 0.1240 - acc: 0.9865 - val_loss: 0.4575 - val_acc: 0.9036
Epoch 2993/5000
13s 157ms/step - loss: 0.1226 - acc: 0.9872 - val_loss: 0.4626 - val_acc: 0.9028
Epoch 2994/5000
12s 152ms/step - loss: 0.1255 - acc: 0.9860 - val_loss: 0.4777 - val_acc: 0.8998
Epoch 2995/5000
12s 151ms/step - loss: 0.1256 - acc: 0.9860 - val_loss: 0.4574 - val_acc: 0.9023
Epoch 2996/5000
12s 151ms/step - loss: 0.1232 - acc: 0.9866 - val_loss: 0.4663 - val_acc: 0.9024
Epoch 2997/5000
12s 151ms/step - loss: 0.1210 - acc: 0.9881 - val_loss: 0.4663 - val_acc: 0.9046
Epoch 2998/5000
12s 152ms/step - loss: 0.1201 - acc: 0.9876 - val_loss: 0.4492 - val_acc: 0.9065
Epoch 2999/5000
12s 152ms/step - loss: 0.1260 - acc: 0.9861 - val_loss: 0.4677 - val_acc: 0.9031
Epoch 3000/5000
12s 151ms/step - loss: 0.1256 - acc: 0.9861 - val_loss: 0.4517 - val_acc: 0.9044
Epoch 3001/5000
lr changed to 0.0009999999776482583
12s 151ms/step - loss: 0.1226 - acc: 0.9877 - val_loss: 0.4332 - val_acc: 0.9071
Epoch 3002/5000
12s 151ms/step - loss: 0.1123 - acc: 0.9911 - val_loss: 0.4282 - val_acc: 0.9088
Epoch 3003/5000
12s 152ms/step - loss: 0.1072 - acc: 0.9926 - val_loss: 0.4277 - val_acc: 0.9110
Epoch 3004/5000
12s 152ms/step - loss: 0.1051 - acc: 0.9938 - val_loss: 0.4253 - val_acc: 0.9108
Epoch 3005/5000
12s 152ms/step - loss: 0.1041 - acc: 0.9941 - val_loss: 0.4242 - val_acc: 0.9101
Epoch 3006/5000
12s 151ms/step - loss: 0.1021 - acc: 0.9945 - val_loss: 0.4259 - val_acc: 0.9098
Epoch 3007/5000
12s 151ms/step - loss: 0.1034 - acc: 0.9940 - val_loss: 0.4255 - val_acc: 0.9100
Epoch 3008/5000
12s 152ms/step - loss: 0.1018 - acc: 0.9949 - val_loss: 0.4252 - val_acc: 0.9100
Epoch 3009/5000
12s 152ms/step - loss: 0.1029 - acc: 0.9945 - val_loss: 0.4276 - val_acc: 0.9103
Epoch 3010/5000
12s 151ms/step - loss: 0.1018 - acc: 0.9947 - val_loss: 0.4275 - val_acc: 0.9102
Epoch 3011/5000
12s 152ms/step - loss: 0.1004 - acc: 0.9951 - val_loss: 0.4237 - val_acc: 0.9106
Epoch 3012/5000
12s 152ms/step - loss: 0.0996 - acc: 0.9954 - val_loss: 0.4213 - val_acc: 0.9120
Epoch 3013/5000
12s 151ms/step - loss: 0.0997 - acc: 0.9953 - val_loss: 0.4247 - val_acc: 0.9112
Epoch 3014/5000
12s 151ms/step - loss: 0.0998 - acc: 0.9956 - val_loss: 0.4249 - val_acc: 0.9111
Epoch 3015/5000
12s 152ms/step - loss: 0.0999 - acc: 0.9953 - val_loss: 0.4261 - val_acc: 0.9103
Epoch 3016/5000
12s 151ms/step - loss: 0.0984 - acc: 0.9958 - val_loss: 0.4285 - val_acc: 0.9102
Epoch 3017/5000
12s 151ms/step - loss: 0.0999 - acc: 0.9954 - val_loss: 0.4284 - val_acc: 0.9098
Epoch 3018/5000
12s 154ms/step - loss: 0.0997 - acc: 0.9952 - val_loss: 0.4290 - val_acc: 0.9105
Epoch 3019/5000
12s 152ms/step - loss: 0.0992 - acc: 0.9955 - val_loss: 0.4273 - val_acc: 0.9118
Epoch 3020/5000
12s 151ms/step - loss: 0.0988 - acc: 0.9953 - val_loss: 0.4270 - val_acc: 0.9110
Epoch 3021/5000
12s 152ms/step - loss: 0.0988 - acc: 0.9957 - val_loss: 0.4298 - val_acc: 0.9104
Epoch 3022/5000
12s 151ms/step - loss: 0.0984 - acc: 0.9957 - val_loss: 0.4317 - val_acc: 0.9103
Epoch 3023/5000
12s 151ms/step - loss: 0.0976 - acc: 0.9960 - val_loss: 0.4282 - val_acc: 0.9107
Epoch 3024/5000
12s 152ms/step - loss: 0.0984 - acc: 0.9959 - val_loss: 0.4283 - val_acc: 0.9111
Epoch 3025/5000
12s 152ms/step - loss: 0.0969 - acc: 0.9960 - val_loss: 0.4288 - val_acc: 0.9090

Overfitting is still severe; the network needs to be shrunk further.
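
One straightforward direction for the next attempt is to pass smaller block counts to the same residual_block function. As a hypothetical sketch (the block counts below are assumptions for illustration, not the configuration used in any later tuning record):

# Hypothetical smaller stack: 5 residual blocks instead of 9
# (block counts are illustrative assumptions, not tested settings)
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal',
             kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 1, 16, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 1, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 1, 64, downsample=False)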

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458, Date of Publication: 13 February 2020

https://ieeexplore.ieee.org/document/8998530
