
[HIT Version] Dynamic ReLU: Adaptively Parametric ReLU with Keras Code (Tuning Log 5)


This article introduces a Dynamic ReLU activation function proposed by a team from Harbin Institute of Technology: the Adaptively Parametric ReLU (APReLU). Originally developed for vibration-signal-based fault diagnosis, it lets each sample have its own set of ReLU parameters. The paper was submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020.

Continuing from the previous post:

[HIT Version] Dynamic ReLU: Adaptively Parametric ReLU with Keras Code (Tuning Log 4)

This post continues testing the Adaptively Parametric ReLU (APReLU) activation function on the CIFAR-10 dataset. The network contains 27 residual blocks in total, each with two 3×3 convolutional layers, and the three stages use 16, 32, and 64 convolution kernels respectively.

In the APReLU module, the number of neurons in the fully connected layers is kept equal to the number of channels of the input feature map. (This is also the setting used in the original paper. In the previous four tuning logs, the fully connected layers were instead set to 1/4 of the channel count, in the hope of avoiding overfitting.)
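In code, the change relative to the previous four tuning logs comes down to the width of the Dense layers inside the APReLU module. A minimal sketch, where `channels` is the channel count of the input feature map and the 1/4 variant reflects my reading of the earlier logs:

# Tuning logs 1-4 used a bottlenecked width:
#   scales = Dense(channels // 4, ...)(scales)
# This log restores the setting from the original paper:
scales = Dense(channels, activation='linear', kernel_initializer='he_normal',
               kernel_regularizer=l2(1e-4))(scales)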

Adaptively Parametric ReLU is a dynamic improvement on the Parametric ReLU activation function:

[Figure: Adaptively Parametric ReLU, a dynamic ReLU activation function]
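In formula form (a summary written to be consistent with the code below, not quoted verbatim from the paper), APReLU computes

\[
y = \max(x, 0) + \alpha \odot \min(x, 0),
\]

where the coefficients $\alpha \in (0,1)^C$ are not fixed trainable constants as in PReLU, but are recomputed for each input sample by a small embedded network: global average pooling of the positive and negative parts of $x$, two fully connected layers with batch normalization, and a sigmoid.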

The Keras code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply it by 0.1 every 200 epochs
def scheduler(epoch):
    if epoch % 200 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs,zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,channels))(scales)
    # apply the parametric relu: positive part plus scaled negative part
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero-padding to match the number of channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 16, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 8, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = BatchNormalization()(net)
net = aprelu(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images by up to 30 degrees
    rotation_range=30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=500, 
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score1 = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score1[0])
print('Train accuracy:', DRSN_train_score1[1])
DRSN_test_score1 = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score1[0])
print('Test accuracy:', DRSN_test_score1[1])
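Note that this script targets TensorFlow 1.x with standalone Keras 2.2.1, as stated in the docstring. If you try to reproduce it under TensorFlow 2.x / tf.keras (untested here, so treat this as a hedged sketch rather than a verified port), the main mechanical changes are the import paths (keras → tensorflow.keras), the optimizer argument (lr → learning_rate), and the training call, since fit_generator is deprecated:

# Hypothetical TF2-style training call (not part of the original script):
# model.fit() accepts generators directly, so fit_generator is no longer needed
model.fit(datagen.flow(x_train, y_train, batch_size=100),
          validation_data=(x_test, y_test), epochs=500,
          verbose=1, callbacks=[reduce_lr])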

The results are as follows (the first 254 epochs have scrolled out of the Spyder console, so they are not shown):

Epoch 255/500
62s 125ms/step - loss: 0.1883 - acc: 0.9839 - val_loss: 0.4781 - val_acc: 0.9102
Epoch 256/500
63s 125ms/step - loss: 0.1862 - acc: 0.9842 - val_loss: 0.4776 - val_acc: 0.9114
Epoch 257/500
62s 125ms/step - loss: 0.1860 - acc: 0.9830 - val_loss: 0.4627 - val_acc: 0.9150
Epoch 258/500
63s 125ms/step - loss: 0.1809 - acc: 0.9847 - val_loss: 0.4602 - val_acc: 0.9123
Epoch 259/500
62s 125ms/step - loss: 0.1820 - acc: 0.9836 - val_loss: 0.4704 - val_acc: 0.9113
Epoch 260/500
63s 125ms/step - loss: 0.1843 - acc: 0.9829 - val_loss: 0.4656 - val_acc: 0.9110
Epoch 261/500
62s 125ms/step - loss: 0.1777 - acc: 0.9855 - val_loss: 0.4682 - val_acc: 0.9113
Epoch 262/500
62s 125ms/step - loss: 0.1821 - acc: 0.9827 - val_loss: 0.4697 - val_acc: 0.9126
Epoch 263/500
63s 125ms/step - loss: 0.1773 - acc: 0.9839 - val_loss: 0.4607 - val_acc: 0.9108
Epoch 264/500
62s 125ms/step - loss: 0.1751 - acc: 0.9848 - val_loss: 0.4596 - val_acc: 0.9123
Epoch 265/500
62s 125ms/step - loss: 0.1753 - acc: 0.9840 - val_loss: 0.4695 - val_acc: 0.9090
Epoch 266/500
62s 125ms/step - loss: 0.1793 - acc: 0.9826 - val_loss: 0.4642 - val_acc: 0.9104
Epoch 267/500
63s 125ms/step - loss: 0.1745 - acc: 0.9842 - val_loss: 0.4540 - val_acc: 0.9134
Epoch 268/500
63s 125ms/step - loss: 0.1764 - acc: 0.9835 - val_loss: 0.4707 - val_acc: 0.9105
Epoch 269/500
63s 125ms/step - loss: 0.1780 - acc: 0.9822 - val_loss: 0.4477 - val_acc: 0.9134
Epoch 270/500
62s 125ms/step - loss: 0.1762 - acc: 0.9825 - val_loss: 0.4677 - val_acc: 0.9110
Epoch 271/500
62s 125ms/step - loss: 0.1735 - acc: 0.9835 - val_loss: 0.4532 - val_acc: 0.9133
Epoch 272/500
62s 125ms/step - loss: 0.1733 - acc: 0.9833 - val_loss: 0.4501 - val_acc: 0.9154
Epoch 273/500
62s 125ms/step - loss: 0.1684 - acc: 0.9847 - val_loss: 0.4520 - val_acc: 0.9119
Epoch 274/500
62s 125ms/step - loss: 0.1745 - acc: 0.9822 - val_loss: 0.4507 - val_acc: 0.9142
Epoch 275/500
63s 125ms/step - loss: 0.1726 - acc: 0.9826 - val_loss: 0.4537 - val_acc: 0.9118
Epoch 276/500
62s 125ms/step - loss: 0.1722 - acc: 0.9826 - val_loss: 0.4514 - val_acc: 0.9109
Epoch 277/500
62s 125ms/step - loss: 0.1762 - acc: 0.9808 - val_loss: 0.4654 - val_acc: 0.9096
Epoch 278/500
62s 125ms/step - loss: 0.1709 - acc: 0.9837 - val_loss: 0.4556 - val_acc: 0.9081
Epoch 279/500
62s 125ms/step - loss: 0.1685 - acc: 0.9836 - val_loss: 0.4474 - val_acc: 0.9151
Epoch 280/500
62s 125ms/step - loss: 0.1692 - acc: 0.9828 - val_loss: 0.4597 - val_acc: 0.9106
Epoch 281/500
62s 125ms/step - loss: 0.1722 - acc: 0.9815 - val_loss: 0.4582 - val_acc: 0.9070
Epoch 282/500
63s 125ms/step - loss: 0.1728 - acc: 0.9813 - val_loss: 0.4625 - val_acc: 0.9085
Epoch 283/500
63s 125ms/step - loss: 0.1695 - acc: 0.9818 - val_loss: 0.4460 - val_acc: 0.9123
Epoch 284/500
63s 125ms/step - loss: 0.1710 - acc: 0.9812 - val_loss: 0.4481 - val_acc: 0.9132
Epoch 285/500
62s 125ms/step - loss: 0.1720 - acc: 0.9819 - val_loss: 0.4575 - val_acc: 0.9079
Epoch 286/500
62s 125ms/step - loss: 0.1709 - acc: 0.9814 - val_loss: 0.4417 - val_acc: 0.9118
Epoch 287/500
62s 125ms/step - loss: 0.1678 - acc: 0.9821 - val_loss: 0.4432 - val_acc: 0.9143
Epoch 288/500
62s 125ms/step - loss: 0.1679 - acc: 0.9824 - val_loss: 0.4468 - val_acc: 0.9111
Epoch 289/500
63s 125ms/step - loss: 0.1690 - acc: 0.9818 - val_loss: 0.4449 - val_acc: 0.9140
Epoch 290/500
63s 125ms/step - loss: 0.1664 - acc: 0.9825 - val_loss: 0.4552 - val_acc: 0.9098
Epoch 291/500
63s 125ms/step - loss: 0.1688 - acc: 0.9815 - val_loss: 0.4412 - val_acc: 0.9128
Epoch 292/500
62s 125ms/step - loss: 0.1673 - acc: 0.9819 - val_loss: 0.4430 - val_acc: 0.9100
Epoch 293/500
63s 125ms/step - loss: 0.1666 - acc: 0.9833 - val_loss: 0.4490 - val_acc: 0.9121
Epoch 294/500
63s 125ms/step - loss: 0.1677 - acc: 0.9818 - val_loss: 0.4471 - val_acc: 0.9114
Epoch 295/500
62s 125ms/step - loss: 0.1635 - acc: 0.9830 - val_loss: 0.4577 - val_acc: 0.9094
Epoch 296/500
63s 125ms/step - loss: 0.1670 - acc: 0.9817 - val_loss: 0.4633 - val_acc: 0.9074
Epoch 297/500
62s 125ms/step - loss: 0.1660 - acc: 0.9830 - val_loss: 0.4606 - val_acc: 0.9074
Epoch 298/500
63s 125ms/step - loss: 0.1678 - acc: 0.9816 - val_loss: 0.4606 - val_acc: 0.9067
Epoch 299/500
62s 125ms/step - loss: 0.1629 - acc: 0.9827 - val_loss: 0.4622 - val_acc: 0.9075
Epoch 300/500
62s 125ms/step - loss: 0.1629 - acc: 0.9833 - val_loss: 0.4640 - val_acc: 0.9104
Epoch 301/500
62s 125ms/step - loss: 0.1685 - acc: 0.9810 - val_loss: 0.4658 - val_acc: 0.9083
Epoch 302/500
62s 125ms/step - loss: 0.1657 - acc: 0.9820 - val_loss: 0.4497 - val_acc: 0.9105
Epoch 303/500
63s 125ms/step - loss: 0.1657 - acc: 0.9816 - val_loss: 0.4565 - val_acc: 0.9122
Epoch 304/500
62s 125ms/step - loss: 0.1668 - acc: 0.9816 - val_loss: 0.4435 - val_acc: 0.9134
Epoch 305/500
62s 125ms/step - loss: 0.1679 - acc: 0.9802 - val_loss: 0.4566 - val_acc: 0.9094
Epoch 306/500
62s 125ms/step - loss: 0.1609 - acc: 0.9832 - val_loss: 0.4529 - val_acc: 0.9116
Epoch 307/500
63s 125ms/step - loss: 0.1666 - acc: 0.9814 - val_loss: 0.4518 - val_acc: 0.9121
Epoch 308/500
63s 125ms/step - loss: 0.1617 - acc: 0.9821 - val_loss: 0.4450 - val_acc: 0.9152
Epoch 309/500
63s 125ms/step - loss: 0.1664 - acc: 0.9806 - val_loss: 0.4430 - val_acc: 0.9131
Epoch 310/500
63s 125ms/step - loss: 0.1624 - acc: 0.9830 - val_loss: 0.4416 - val_acc: 0.9121
Epoch 311/500
62s 125ms/step - loss: 0.1619 - acc: 0.9828 - val_loss: 0.4499 - val_acc: 0.9090
Epoch 312/500
62s 124ms/step - loss: 0.1658 - acc: 0.9818 - val_loss: 0.4532 - val_acc: 0.9099
Epoch 313/500
62s 125ms/step - loss: 0.1653 - acc: 0.9815 - val_loss: 0.4498 - val_acc: 0.9070
Epoch 314/500
63s 125ms/step - loss: 0.1650 - acc: 0.9820 - val_loss: 0.4723 - val_acc: 0.9085
Epoch 315/500
63s 125ms/step - loss: 0.1646 - acc: 0.9820 - val_loss: 0.4576 - val_acc: 0.9089
Epoch 316/500
63s 126ms/step - loss: 0.1628 - acc: 0.9826 - val_loss: 0.4623 - val_acc: 0.9068
Epoch 317/500
63s 125ms/step - loss: 0.1660 - acc: 0.9810 - val_loss: 0.4444 - val_acc: 0.9074
Epoch 318/500
62s 125ms/step - loss: 0.1653 - acc: 0.9813 - val_loss: 0.4438 - val_acc: 0.9088
Epoch 319/500
62s 125ms/step - loss: 0.1640 - acc: 0.9819 - val_loss: 0.4679 - val_acc: 0.9079
Epoch 320/500
62s 125ms/step - loss: 0.1626 - acc: 0.9826 - val_loss: 0.4472 - val_acc: 0.9100
Epoch 321/500
63s 125ms/step - loss: 0.1630 - acc: 0.9821 - val_loss: 0.4482 - val_acc: 0.9071
Epoch 322/500
62s 125ms/step - loss: 0.1606 - acc: 0.9833 - val_loss: 0.4515 - val_acc: 0.9103
Epoch 323/500
63s 125ms/step - loss: 0.1636 - acc: 0.9821 - val_loss: 0.4472 - val_acc: 0.9119
Epoch 324/500
63s 125ms/step - loss: 0.1633 - acc: 0.9822 - val_loss: 0.4620 - val_acc: 0.9071
Epoch 325/500
63s 125ms/step - loss: 0.1627 - acc: 0.9826 - val_loss: 0.4571 - val_acc: 0.9107
Epoch 326/500
63s 125ms/step - loss: 0.1629 - acc: 0.9820 - val_loss: 0.4450 - val_acc: 0.9120
Epoch 327/500
62s 125ms/step - loss: 0.1643 - acc: 0.9813 - val_loss: 0.4529 - val_acc: 0.9112
Epoch 328/500
62s 125ms/step - loss: 0.1619 - acc: 0.9826 - val_loss: 0.4394 - val_acc: 0.9109
Epoch 329/500
62s 125ms/step - loss: 0.1616 - acc: 0.9831 - val_loss: 0.4396 - val_acc: 0.9117
Epoch 330/500
62s 125ms/step - loss: 0.1614 - acc: 0.9819 - val_loss: 0.4493 - val_acc: 0.9125
Epoch 331/500
63s 125ms/step - loss: 0.1619 - acc: 0.9824 - val_loss: 0.4362 - val_acc: 0.9089
Epoch 332/500
62s 125ms/step - loss: 0.1609 - acc: 0.9820 - val_loss: 0.4592 - val_acc: 0.9061
Epoch 333/500
62s 125ms/step - loss: 0.1621 - acc: 0.9821 - val_loss: 0.4408 - val_acc: 0.9089
Epoch 334/500
62s 125ms/step - loss: 0.1605 - acc: 0.9832 - val_loss: 0.4357 - val_acc: 0.9135
Epoch 335/500
62s 125ms/step - loss: 0.1645 - acc: 0.9812 - val_loss: 0.4413 - val_acc: 0.9137
Epoch 336/500
62s 125ms/step - loss: 0.1607 - acc: 0.9831 - val_loss: 0.4592 - val_acc: 0.9093
Epoch 337/500
62s 125ms/step - loss: 0.1667 - acc: 0.9812 - val_loss: 0.4590 - val_acc: 0.9085
Epoch 338/500
62s 125ms/step - loss: 0.1639 - acc: 0.9818 - val_loss: 0.4423 - val_acc: 0.9113
Epoch 339/500
62s 125ms/step - loss: 0.1622 - acc: 0.9820 - val_loss: 0.4565 - val_acc: 0.9094
Epoch 340/500
63s 125ms/step - loss: 0.1589 - acc: 0.9837 - val_loss: 0.4534 - val_acc: 0.9104
Epoch 341/500
63s 125ms/step - loss: 0.1636 - acc: 0.9817 - val_loss: 0.4643 - val_acc: 0.9055
Epoch 342/500
62s 125ms/step - loss: 0.1599 - acc: 0.9831 - val_loss: 0.4666 - val_acc: 0.9043
Epoch 343/500
62s 125ms/step - loss: 0.1629 - acc: 0.9812 - val_loss: 0.4635 - val_acc: 0.9065
Epoch 344/500
62s 125ms/step - loss: 0.1592 - acc: 0.9831 - val_loss: 0.4563 - val_acc: 0.9083
Epoch 345/500
62s 125ms/step - loss: 0.1634 - acc: 0.9818 - val_loss: 0.4451 - val_acc: 0.9096
Epoch 346/500
63s 125ms/step - loss: 0.1664 - acc: 0.9811 - val_loss: 0.4450 - val_acc: 0.9111
Epoch 347/500
63s 125ms/step - loss: 0.1624 - acc: 0.9814 - val_loss: 0.4458 - val_acc: 0.9133
Epoch 348/500
62s 125ms/step - loss: 0.1594 - acc: 0.9830 - val_loss: 0.4765 - val_acc: 0.9067
Epoch 349/500
62s 125ms/step - loss: 0.1595 - acc: 0.9832 - val_loss: 0.4469 - val_acc: 0.9118
Epoch 350/500
62s 125ms/step - loss: 0.1603 - acc: 0.9830 - val_loss: 0.4754 - val_acc: 0.9049
Epoch 351/500
63s 125ms/step - loss: 0.1615 - acc: 0.9828 - val_loss: 0.4567 - val_acc: 0.9098
Epoch 352/500
62s 125ms/step - loss: 0.1615 - acc: 0.9824 - val_loss: 0.4540 - val_acc: 0.9071
Epoch 353/500
62s 125ms/step - loss: 0.1628 - acc: 0.9822 - val_loss: 0.4546 - val_acc: 0.9080
Epoch 354/500
68s 136ms/step - loss: 0.1610 - acc: 0.9826 - val_loss: 0.4602 - val_acc: 0.9068
Epoch 355/500
71s 143ms/step - loss: 0.1659 - acc: 0.9813 - val_loss: 0.4482 - val_acc: 0.9095
Epoch 356/500
71s 143ms/step - loss: 0.1602 - acc: 0.9828 - val_loss: 0.4471 - val_acc: 0.9121
Epoch 357/500
71s 143ms/step - loss: 0.1578 - acc: 0.9845 - val_loss: 0.4429 - val_acc: 0.9083
Epoch 358/500
66s 131ms/step - loss: 0.1609 - acc: 0.9827 - val_loss: 0.4488 - val_acc: 0.9090
Epoch 359/500
62s 125ms/step - loss: 0.1558 - acc: 0.9845 - val_loss: 0.4614 - val_acc: 0.9065
Epoch 360/500
63s 125ms/step - loss: 0.1646 - acc: 0.9816 - val_loss: 0.4671 - val_acc: 0.9052
Epoch 361/500
62s 125ms/step - loss: 0.1621 - acc: 0.9822 - val_loss: 0.4514 - val_acc: 0.9090
Epoch 362/500
63s 125ms/step - loss: 0.1605 - acc: 0.9827 - val_loss: 0.4596 - val_acc: 0.9103
Epoch 363/500
62s 125ms/step - loss: 0.1579 - acc: 0.9836 - val_loss: 0.4621 - val_acc: 0.9051
Epoch 364/500
63s 125ms/step - loss: 0.1593 - acc: 0.9834 - val_loss: 0.4434 - val_acc: 0.9105
Epoch 365/500
63s 125ms/step - loss: 0.1586 - acc: 0.9832 - val_loss: 0.4541 - val_acc: 0.9126
Epoch 366/500
62s 125ms/step - loss: 0.1595 - acc: 0.9821 - val_loss: 0.4512 - val_acc: 0.9108
Epoch 367/500
62s 125ms/step - loss: 0.1584 - acc: 0.9831 - val_loss: 0.4637 - val_acc: 0.9079
Epoch 368/500
62s 125ms/step - loss: 0.1589 - acc: 0.9829 - val_loss: 0.4460 - val_acc: 0.9110
Epoch 369/500
63s 125ms/step - loss: 0.1586 - acc: 0.9839 - val_loss: 0.4686 - val_acc: 0.9063
Epoch 370/500
63s 125ms/step - loss: 0.1601 - acc: 0.9823 - val_loss: 0.4517 - val_acc: 0.9119
Epoch 371/500
62s 125ms/step - loss: 0.1547 - acc: 0.9843 - val_loss: 0.4656 - val_acc: 0.9085
Epoch 372/500
62s 125ms/step - loss: 0.1569 - acc: 0.9840 - val_loss: 0.4640 - val_acc: 0.9103
Epoch 373/500
62s 125ms/step - loss: 0.1640 - acc: 0.9814 - val_loss: 0.4515 - val_acc: 0.9086
Epoch 374/500
63s 125ms/step - loss: 0.1613 - acc: 0.9823 - val_loss: 0.4643 - val_acc: 0.9050
Epoch 375/500
62s 125ms/step - loss: 0.1625 - acc: 0.9823 - val_loss: 0.4410 - val_acc: 0.9146
Epoch 376/500
62s 125ms/step - loss: 0.1606 - acc: 0.9825 - val_loss: 0.4516 - val_acc: 0.9119
Epoch 377/500
63s 125ms/step - loss: 0.1573 - acc: 0.9841 - val_loss: 0.4450 - val_acc: 0.9114
Epoch 378/500
63s 125ms/step - loss: 0.1640 - acc: 0.9804 - val_loss: 0.4494 - val_acc: 0.9094
Epoch 379/500
63s 125ms/step - loss: 0.1643 - acc: 0.9816 - val_loss: 0.4491 - val_acc: 0.9101
Epoch 380/500
62s 125ms/step - loss: 0.1578 - acc: 0.9833 - val_loss: 0.4539 - val_acc: 0.9109
Epoch 381/500
63s 125ms/step - loss: 0.1577 - acc: 0.9833 - val_loss: 0.4436 - val_acc: 0.9121
Epoch 382/500
62s 125ms/step - loss: 0.1597 - acc: 0.9827 - val_loss: 0.4577 - val_acc: 0.9090
Epoch 383/500
62s 125ms/step - loss: 0.1635 - acc: 0.9820 - val_loss: 0.4659 - val_acc: 0.9019
Epoch 384/500
63s 125ms/step - loss: 0.1600 - acc: 0.9829 - val_loss: 0.4539 - val_acc: 0.9101
Epoch 385/500
63s 125ms/step - loss: 0.1581 - acc: 0.9838 - val_loss: 0.4469 - val_acc: 0.9128
Epoch 386/500
63s 125ms/step - loss: 0.1569 - acc: 0.9835 - val_loss: 0.4710 - val_acc: 0.9094
Epoch 387/500
63s 125ms/step - loss: 0.1622 - acc: 0.9816 - val_loss: 0.4414 - val_acc: 0.9130
Epoch 388/500
62s 125ms/step - loss: 0.1572 - acc: 0.9838 - val_loss: 0.4461 - val_acc: 0.9093
Epoch 389/500
63s 125ms/step - loss: 0.1581 - acc: 0.9837 - val_loss: 0.4594 - val_acc: 0.9081
Epoch 390/500
63s 125ms/step - loss: 0.1582 - acc: 0.9835 - val_loss: 0.4500 - val_acc: 0.9139
Epoch 391/500
62s 125ms/step - loss: 0.1584 - acc: 0.9836 - val_loss: 0.4566 - val_acc: 0.9076
Epoch 392/500
62s 125ms/step - loss: 0.1599 - acc: 0.9827 - val_loss: 0.4594 - val_acc: 0.9099
Epoch 393/500
62s 125ms/step - loss: 0.1618 - acc: 0.9822 - val_loss: 0.4599 - val_acc: 0.9075
Epoch 394/500
62s 125ms/step - loss: 0.1573 - acc: 0.9837 - val_loss: 0.4698 - val_acc: 0.9071
Epoch 395/500
63s 125ms/step - loss: 0.1599 - acc: 0.9830 - val_loss: 0.4630 - val_acc: 0.9105
Epoch 396/500
63s 125ms/step - loss: 0.1586 - acc: 0.9832 - val_loss: 0.4705 - val_acc: 0.9099
Epoch 397/500
62s 125ms/step - loss: 0.1591 - acc: 0.9834 - val_loss: 0.4925 - val_acc: 0.9037
Epoch 398/500
62s 125ms/step - loss: 0.1575 - acc: 0.9833 - val_loss: 0.4476 - val_acc: 0.9126
Epoch 399/500
62s 125ms/step - loss: 0.1571 - acc: 0.9844 - val_loss: 0.4561 - val_acc: 0.9098
Epoch 400/500
63s 125ms/step - loss: 0.1592 - acc: 0.9832 - val_loss: 0.4602 - val_acc: 0.9069
Epoch 401/500
lr changed to 0.0009999999776482583
63s 125ms/step - loss: 0.1424 - acc: 0.9892 - val_loss: 0.4326 - val_acc: 0.9167
Epoch 402/500
62s 125ms/step - loss: 0.1313 - acc: 0.9935 - val_loss: 0.4261 - val_acc: 0.9191
Epoch 403/500
62s 125ms/step - loss: 0.1280 - acc: 0.9949 - val_loss: 0.4215 - val_acc: 0.9205
Epoch 404/500
63s 125ms/step - loss: 0.1250 - acc: 0.9958 - val_loss: 0.4211 - val_acc: 0.9215
Epoch 405/500
63s 125ms/step - loss: 0.1241 - acc: 0.9960 - val_loss: 0.4207 - val_acc: 0.9197
Epoch 406/500
63s 125ms/step - loss: 0.1230 - acc: 0.9962 - val_loss: 0.4201 - val_acc: 0.9221
Epoch 407/500
62s 125ms/step - loss: 0.1228 - acc: 0.9962 - val_loss: 0.4209 - val_acc: 0.9227
Epoch 408/500
62s 125ms/step - loss: 0.1206 - acc: 0.9969 - val_loss: 0.4220 - val_acc: 0.9218
Epoch 409/500
62s 125ms/step - loss: 0.1208 - acc: 0.9967 - val_loss: 0.4209 - val_acc: 0.9233
Epoch 410/500
63s 125ms/step - loss: 0.1197 - acc: 0.9970 - val_loss: 0.4204 - val_acc: 0.9225
Epoch 411/500
62s 125ms/step - loss: 0.1196 - acc: 0.9971 - val_loss: 0.4201 - val_acc: 0.9239
Epoch 412/500
63s 125ms/step - loss: 0.1185 - acc: 0.9973 - val_loss: 0.4205 - val_acc: 0.9232
Epoch 413/500
63s 125ms/step - loss: 0.1177 - acc: 0.9973 - val_loss: 0.4199 - val_acc: 0.9232
Epoch 414/500
62s 125ms/step - loss: 0.1176 - acc: 0.9974 - val_loss: 0.4226 - val_acc: 0.9239
Epoch 415/500
63s 125ms/step - loss: 0.1171 - acc: 0.9975 - val_loss: 0.4222 - val_acc: 0.9236
Epoch 416/500
62s 125ms/step - loss: 0.1165 - acc: 0.9978 - val_loss: 0.4228 - val_acc: 0.9249
Epoch 417/500
62s 125ms/step - loss: 0.1156 - acc: 0.9978 - val_loss: 0.4213 - val_acc: 0.9249
Epoch 418/500
63s 125ms/step - loss: 0.1152 - acc: 0.9981 - val_loss: 0.4210 - val_acc: 0.9241
Epoch 419/500
62s 125ms/step - loss: 0.1149 - acc: 0.9981 - val_loss: 0.4229 - val_acc: 0.9238
Epoch 420/500
63s 125ms/step - loss: 0.1142 - acc: 0.9979 - val_loss: 0.4223 - val_acc: 0.9256
Epoch 421/500
63s 125ms/step - loss: 0.1149 - acc: 0.9977 - val_loss: 0.4237 - val_acc: 0.9249
Epoch 422/500
63s 125ms/step - loss: 0.1130 - acc: 0.9983 - val_loss: 0.4246 - val_acc: 0.9235
Epoch 423/500
62s 125ms/step - loss: 0.1137 - acc: 0.9978 - val_loss: 0.4248 - val_acc: 0.9249
Epoch 424/500
63s 125ms/step - loss: 0.1126 - acc: 0.9984 - val_loss: 0.4270 - val_acc: 0.9228
Epoch 425/500
63s 125ms/step - loss: 0.1127 - acc: 0.9982 - val_loss: 0.4265 - val_acc: 0.9239
Epoch 426/500
63s 125ms/step - loss: 0.1117 - acc: 0.9986 - val_loss: 0.4282 - val_acc: 0.9251
Epoch 427/500
63s 125ms/step - loss: 0.1120 - acc: 0.9983 - val_loss: 0.4266 - val_acc: 0.9240
Epoch 428/500
63s 125ms/step - loss: 0.1115 - acc: 0.9985 - val_loss: 0.4266 - val_acc: 0.9255
Epoch 429/500
63s 125ms/step - loss: 0.1119 - acc: 0.9982 - val_loss: 0.4273 - val_acc: 0.9265
Epoch 430/500
62s 125ms/step - loss: 0.1109 - acc: 0.9987 - val_loss: 0.4266 - val_acc: 0.9263
Epoch 431/500
62s 125ms/step - loss: 0.1105 - acc: 0.9985 - val_loss: 0.4255 - val_acc: 0.9257
Epoch 432/500
62s 125ms/step - loss: 0.1104 - acc: 0.9986 - val_loss: 0.4244 - val_acc: 0.9242
Epoch 433/500
62s 125ms/step - loss: 0.1099 - acc: 0.9985 - val_loss: 0.4246 - val_acc: 0.9262
Epoch 434/500
62s 125ms/step - loss: 0.1098 - acc: 0.9987 - val_loss: 0.4267 - val_acc: 0.9247
Epoch 435/500
63s 125ms/step - loss: 0.1097 - acc: 0.9984 - val_loss: 0.4302 - val_acc: 0.9245
Epoch 436/500
63s 125ms/step - loss: 0.1095 - acc: 0.9987 - val_loss: 0.4312 - val_acc: 0.9252
Epoch 437/500
62s 125ms/step - loss: 0.1097 - acc: 0.9984 - val_loss: 0.4281 - val_acc: 0.9238
Epoch 438/500
62s 125ms/step - loss: 0.1091 - acc: 0.9985 - val_loss: 0.4271 - val_acc: 0.9253
Epoch 439/500
62s 125ms/step - loss: 0.1086 - acc: 0.9987 - val_loss: 0.4268 - val_acc: 0.9258
Epoch 440/500
62s 125ms/step - loss: 0.1077 - acc: 0.9988 - val_loss: 0.4301 - val_acc: 0.9265
Epoch 441/500
62s 125ms/step - loss: 0.1076 - acc: 0.9989 - val_loss: 0.4288 - val_acc: 0.9253
Epoch 442/500
62s 125ms/step - loss: 0.1083 - acc: 0.9987 - val_loss: 0.4308 - val_acc: 0.9247
Epoch 443/500
62s 125ms/step - loss: 0.1073 - acc: 0.9986 - val_loss: 0.4315 - val_acc: 0.9249
Epoch 444/500
62s 125ms/step - loss: 0.1072 - acc: 0.9987 - val_loss: 0.4343 - val_acc: 0.9258
Epoch 445/500
62s 125ms/step - loss: 0.1067 - acc: 0.9987 - val_loss: 0.4325 - val_acc: 0.9249
Epoch 446/500
63s 125ms/step - loss: 0.1065 - acc: 0.9989 - val_loss: 0.4333 - val_acc: 0.9248
Epoch 447/500
62s 125ms/step - loss: 0.1061 - acc: 0.9988 - val_loss: 0.4342 - val_acc: 0.9245
Epoch 448/500
62s 125ms/step - loss: 0.1056 - acc: 0.9988 - val_loss: 0.4359 - val_acc: 0.9247
Epoch 449/500
62s 125ms/step - loss: 0.1058 - acc: 0.9988 - val_loss: 0.4357 - val_acc: 0.9241
Epoch 450/500
62s 125ms/step - loss: 0.1051 - acc: 0.9991 - val_loss: 0.4366 - val_acc: 0.9251
Epoch 451/500
63s 125ms/step - loss: 0.1054 - acc: 0.9991 - val_loss: 0.4377 - val_acc: 0.9241
Epoch 452/500
62s 125ms/step - loss: 0.1051 - acc: 0.9989 - val_loss: 0.4354 - val_acc: 0.9246
Epoch 453/500
63s 125ms/step - loss: 0.1055 - acc: 0.9987 - val_loss: 0.4350 - val_acc: 0.9239
Epoch 454/500
63s 125ms/step - loss: 0.1045 - acc: 0.9990 - val_loss: 0.4346 - val_acc: 0.9239
Epoch 455/500
62s 125ms/step - loss: 0.1047 - acc: 0.9987 - val_loss: 0.4340 - val_acc: 0.9243
Epoch 456/500
62s 125ms/step - loss: 0.1043 - acc: 0.9988 - val_loss: 0.4346 - val_acc: 0.9238
Epoch 457/500
62s 125ms/step - loss: 0.1037 - acc: 0.9990 - val_loss: 0.4334 - val_acc: 0.9249
Epoch 458/500
63s 125ms/step - loss: 0.1039 - acc: 0.9989 - val_loss: 0.4337 - val_acc: 0.9239
Epoch 459/500
63s 125ms/step - loss: 0.1040 - acc: 0.9987 - val_loss: 0.4344 - val_acc: 0.9233
Epoch 460/500
63s 125ms/step - loss: 0.1033 - acc: 0.9991 - val_loss: 0.4353 - val_acc: 0.9240
Epoch 461/500
63s 125ms/step - loss: 0.1033 - acc: 0.9987 - val_loss: 0.4383 - val_acc: 0.9236
Epoch 462/500
62s 125ms/step - loss: 0.1034 - acc: 0.9987 - val_loss: 0.4362 - val_acc: 0.9246
Epoch 463/500
63s 125ms/step - loss: 0.1030 - acc: 0.9988 - val_loss: 0.4339 - val_acc: 0.9237
Epoch 464/500
63s 125ms/step - loss: 0.1024 - acc: 0.9990 - val_loss: 0.4329 - val_acc: 0.9249
Epoch 465/500
62s 125ms/step - loss: 0.1018 - acc: 0.9992 - val_loss: 0.4323 - val_acc: 0.9241
Epoch 466/500
62s 125ms/step - loss: 0.1017 - acc: 0.9991 - val_loss: 0.4331 - val_acc: 0.9243
Epoch 467/500
63s 125ms/step - loss: 0.1018 - acc: 0.9989 - val_loss: 0.4331 - val_acc: 0.9245
Epoch 468/500
63s 125ms/step - loss: 0.1012 - acc: 0.9992 - val_loss: 0.4335 - val_acc: 0.9254
Epoch 469/500
62s 125ms/step - loss: 0.1011 - acc: 0.9990 - val_loss: 0.4332 - val_acc: 0.9247
Epoch 470/500
63s 125ms/step - loss: 0.1006 - acc: 0.9993 - val_loss: 0.4344 - val_acc: 0.9250
Epoch 471/500
62s 125ms/step - loss: 0.1009 - acc: 0.9989 - val_loss: 0.4377 - val_acc: 0.9251
Epoch 472/500
62s 125ms/step - loss: 0.1006 - acc: 0.9991 - val_loss: 0.4345 - val_acc: 0.9243
Epoch 473/500
63s 125ms/step - loss: 0.1005 - acc: 0.9992 - val_loss: 0.4328 - val_acc: 0.9245
Epoch 474/500
63s 125ms/step - loss: 0.1002 - acc: 0.9991 - val_loss: 0.4365 - val_acc: 0.9250
Epoch 475/500
63s 125ms/step - loss: 0.1005 - acc: 0.9989 - val_loss: 0.4350 - val_acc: 0.9263
Epoch 476/500
63s 125ms/step - loss: 0.0995 - acc: 0.9992 - val_loss: 0.4334 - val_acc: 0.9255
Epoch 477/500
63s 125ms/step - loss: 0.0997 - acc: 0.9989 - val_loss: 0.4335 - val_acc: 0.9251
Epoch 478/500
62s 125ms/step - loss: 0.0993 - acc: 0.9992 - val_loss: 0.4348 - val_acc: 0.9250
Epoch 479/500
62s 125ms/step - loss: 0.0992 - acc: 0.9990 - val_loss: 0.4356 - val_acc: 0.9258
Epoch 480/500
63s 125ms/step - loss: 0.0985 - acc: 0.9993 - val_loss: 0.4347 - val_acc: 0.9260
Epoch 481/500
63s 125ms/step - loss: 0.0985 - acc: 0.9992 - val_loss: 0.4349 - val_acc: 0.9247
Epoch 482/500
63s 125ms/step - loss: 0.0994 - acc: 0.9987 - val_loss: 0.4366 - val_acc: 0.9237
Epoch 483/500
63s 125ms/step - loss: 0.0982 - acc: 0.9993 - val_loss: 0.4361 - val_acc: 0.9258
Epoch 484/500
63s 125ms/step - loss: 0.0981 - acc: 0.9991 - val_loss: 0.4387 - val_acc: 0.9250
Epoch 485/500
63s 125ms/step - loss: 0.0983 - acc: 0.9989 - val_loss: 0.4367 - val_acc: 0.9248
Epoch 486/500
63s 125ms/step - loss: 0.0975 - acc: 0.9993 - val_loss: 0.4364 - val_acc: 0.9255
Epoch 487/500
63s 125ms/step - loss: 0.0979 - acc: 0.9990 - val_loss: 0.4356 - val_acc: 0.9246
Epoch 488/500
63s 125ms/step - loss: 0.0972 - acc: 0.9993 - val_loss: 0.4332 - val_acc: 0.9257
Epoch 489/500
62s 125ms/step - loss: 0.0970 - acc: 0.9991 - val_loss: 0.4337 - val_acc: 0.9255
Epoch 490/500
62s 125ms/step - loss: 0.0968 - acc: 0.9994 - val_loss: 0.4312 - val_acc: 0.9250
Epoch 491/500
62s 125ms/step - loss: 0.0964 - acc: 0.9994 - val_loss: 0.4325 - val_acc: 0.9251
Epoch 492/500
63s 125ms/step - loss: 0.0967 - acc: 0.9993 - val_loss: 0.4354 - val_acc: 0.9246
Epoch 493/500
63s 125ms/step - loss: 0.0960 - acc: 0.9993 - val_loss: 0.4337 - val_acc: 0.9250
Epoch 494/500
63s 125ms/step - loss: 0.0963 - acc: 0.9991 - val_loss: 0.4350 - val_acc: 0.9255
Epoch 495/500
63s 125ms/step - loss: 0.0961 - acc: 0.9991 - val_loss: 0.4354 - val_acc: 0.9255
Epoch 496/500
62s 125ms/step - loss: 0.0961 - acc: 0.9990 - val_loss: 0.4339 - val_acc: 0.9256
Epoch 497/500
63s 125ms/step - loss: 0.0951 - acc: 0.9994 - val_loss: 0.4338 - val_acc: 0.9243
Epoch 498/500
62s 125ms/step - loss: 0.0953 - acc: 0.9992 - val_loss: 0.4326 - val_acc: 0.9259
Epoch 499/500
63s 125ms/step - loss: 0.0949 - acc: 0.9993 - val_loss: 0.4353 - val_acc: 0.9255
Epoch 500/500
63s 125ms/step - loss: 0.0952 - acc: 0.9992 - val_loss: 0.4356 - val_acc: 0.9252
Train loss: 0.09905459056794644
Train accuracy: 0.9974600024223328
Test loss: 0.4356186859309673
Test accuracy: 0.9252000027894973

The test accuracy reached 92.52%, which is not bad. In fact, the training loss still showed a downward trend at the end of training.
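Since the training loss was still decreasing when the 500 epochs ran out, one natural follow-up (purely a sketch, not something run in this log) would be to train longer and add one more learning-rate drop, for example:

# Hypothetical extension: train for 600 epochs with an extra lr drop at epoch 550
def scheduler(epoch):
    if epoch in (200, 400, 550):
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

and pass epochs=600 to the training call.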

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458, Date of Publication: 13 February 2020

https://ieeexplore.ieee.org/document/8998530
