
[HIT] Dynamic ReLU: Adaptively Parametric ReLU and Keras Code (Tuning Notes 7)

用户7368967 · Modified 2020-05-27 18:10:47 · Column: 深度学习知识

This article introduces a Dynamic ReLU activation function proposed by a team at Harbin Institute of Technology: the Adaptively Parametric ReLU (APReLU). Originally developed for fault diagnosis on one-dimensional vibration signals, it lets every sample learn its own ReLU parameters. The paper was submitted to IEEE Transactions on Industrial Electronics on May 3, 2019, accepted on January 24, 2020, and published on the IEEE website on February 13, 2020.

Continued from the previous post:

[HIT] Dynamic ReLU: Adaptively Parametric ReLU and Keras Code (Tuning Notes 6)

In this post, at the risk of overfitting, the number of convolution kernels is increased to 32, 64, and 128, and we continue testing the Adaptively Parametric ReLU (APReLU) activation function on the CIFAR-10 image dataset.

The basic principle of the APReLU activation function is shown in the figure below:

Adaptively Parametric ReLU: a Dynamic ReLU activation function (figure caption)
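Summarizing the figure: APReLU computes f(x) = max(x, 0) + α · min(x, 0), where the per-sample, per-channel slopes α come from a small subnetwork (global average pooling of the positive and negative parts, two dense layers, and a sigmoid). A minimal NumPy sketch of the forward pass, with random stand-in weights and batch normalization omitted for brevity:

```python
import numpy as np

def aprelu_forward(x, w1, w2):
    """APReLU forward pass: f(x) = max(x, 0) + scales * min(x, 0).

    x      : (N, H, W, C) feature map
    w1, w2 : weights of the two dense layers in the scale subnetwork
             (random stand-ins here, not trained values)
    Batch normalization is omitted for brevity.
    """
    pos = np.maximum(x, 0.0)   # positive part (plain ReLU output)
    neg = np.minimum(x, 0.0)   # negative part
    # Global average pooling of both parts -> per-sample statistics (N, 2C)
    stats = np.concatenate([neg.mean(axis=(1, 2)), pos.mean(axis=(1, 2))], axis=1)
    hidden = np.maximum(stats @ w1, 0.0)            # dense -> ReLU
    scales = 1.0 / (1.0 + np.exp(-(hidden @ w2)))   # dense -> sigmoid, in (0, 1)
    # Per-sample, per-channel slope applied to the negative part only
    return pos + scales[:, None, None, :] * neg

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4, 4, 8))
w1 = 0.1 * rng.standard_normal((16, 8))   # 2C -> C
w2 = 0.1 * rng.standard_normal((8, 8))    # C -> C
y = aprelu_forward(x, w1, w2)
# Positive activations pass through unchanged; negatives are shrunk toward zero
assert np.allclose(y[x > 0], x[x > 0])
assert np.all((y[x < 0] >= x[x < 0]) & (y[x < 0] <= 0.0))
```

Because the sigmoid keeps each slope in (0, 1), APReLU always lies between ReLU (slope 0) and the identity (slope 1) on the negative half-axis, but where it lands is decided per sample.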

The Keras code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458,
Date of Publication: 13 February 2020

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate, multiplying it by 0.1 every 300 epochs
def scheduler(epoch):
    if epoch % 300 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs,zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,channels))(scales)
    # apply a parametric relu
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero_padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(32, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = residual_block(net, 1, 128, downsample=True)
net = residual_block(net, 8, 128, downsample=False)
net = BatchNormalization()(net)
net = Activation('relu')(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range (degrees, 0 to 30)
    rotation_range=30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=1000, 
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score[0])
print('Train accuracy:', DRSN_train_score[1])
DRSN_test_score = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score[0])
print('Test accuracy:', DRSN_test_score[1])
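The scheduler above implements a step decay: starting from 0.1, the learning rate is multiplied by 0.1 at epoch indices 300, 600, and 900 (0-based, so the first drop shows up as "Epoch 301" in the log below). The same schedule as a standalone sketch, independent of Keras; the helper name `lr_at` is mine:

```python
def lr_at(epoch, base_lr=0.1, drop_every=300, factor=0.1):
    """Learning rate produced by the step-decay schedule in scheduler()."""
    return base_lr * factor ** (epoch // drop_every)

# First decay kicks in at epoch index 300 ("Epoch 301" in the Keras log)
assert abs(lr_at(299) - 0.1) < 1e-12
assert abs(lr_at(300) - 0.01) < 1e-12
assert abs(lr_at(900) - 1e-4) < 1e-12
```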

First, here is a copy of the experimental results from the Spyder console:

Epoch 270/1000
91s 182ms/step - loss: 0.5576 - acc: 0.9245 - val_loss: 0.6619 - val_acc: 0.8960
Epoch 271/1000
91s 182ms/step - loss: 0.5605 - acc: 0.9250 - val_loss: 0.6675 - val_acc: 0.8908
Epoch 272/1000
91s 182ms/step - loss: 0.5578 - acc: 0.9244 - val_loss: 0.6578 - val_acc: 0.8951
Epoch 273/1000
91s 182ms/step - loss: 0.5625 - acc: 0.9232 - val_loss: 0.6663 - val_acc: 0.8907
Epoch 274/1000
91s 182ms/step - loss: 0.5598 - acc: 0.9246 - val_loss: 0.6435 - val_acc: 0.9059
Epoch 275/1000
91s 182ms/step - loss: 0.5567 - acc: 0.9265 - val_loss: 0.6589 - val_acc: 0.8949
Epoch 276/1000
91s 182ms/step - loss: 0.5616 - acc: 0.9235 - val_loss: 0.6439 - val_acc: 0.9002
Epoch 277/1000
91s 182ms/step - loss: 0.5568 - acc: 0.9258 - val_loss: 0.6731 - val_acc: 0.8913
Epoch 278/1000
91s 182ms/step - loss: 0.5582 - acc: 0.9254 - val_loss: 0.6437 - val_acc: 0.8995
Epoch 279/1000
91s 182ms/step - loss: 0.5530 - acc: 0.9270 - val_loss: 0.6416 - val_acc: 0.9002
Epoch 280/1000
91s 182ms/step - loss: 0.5603 - acc: 0.9245 - val_loss: 0.6566 - val_acc: 0.8960
Epoch 281/1000
91s 182ms/step - loss: 0.5613 - acc: 0.9241 - val_loss: 0.6432 - val_acc: 0.9003
Epoch 282/1000
91s 182ms/step - loss: 0.5568 - acc: 0.9250 - val_loss: 0.6573 - val_acc: 0.8950
Epoch 283/1000
91s 182ms/step - loss: 0.5580 - acc: 0.9253 - val_loss: 0.6518 - val_acc: 0.8961
Epoch 284/1000
91s 182ms/step - loss: 0.5495 - acc: 0.9276 - val_loss: 0.6736 - val_acc: 0.8918
Epoch 285/1000
91s 182ms/step - loss: 0.5611 - acc: 0.9238 - val_loss: 0.6538 - val_acc: 0.8962
Epoch 286/1000
91s 182ms/step - loss: 0.5590 - acc: 0.9250 - val_loss: 0.6563 - val_acc: 0.8965
Epoch 287/1000
91s 182ms/step - loss: 0.5581 - acc: 0.9245 - val_loss: 0.6482 - val_acc: 0.9035
Epoch 288/1000
91s 182ms/step - loss: 0.5607 - acc: 0.9233 - val_loss: 0.6516 - val_acc: 0.8984
Epoch 289/1000
91s 182ms/step - loss: 0.5608 - acc: 0.9252 - val_loss: 0.6562 - val_acc: 0.8984
Epoch 290/1000
91s 182ms/step - loss: 0.5599 - acc: 0.9240 - val_loss: 0.6941 - val_acc: 0.8847
Epoch 291/1000
91s 182ms/step - loss: 0.5600 - acc: 0.9244 - val_loss: 0.6695 - val_acc: 0.8902
Epoch 292/1000
91s 182ms/step - loss: 0.5628 - acc: 0.9232 - val_loss: 0.6580 - val_acc: 0.8979
Epoch 293/1000
91s 182ms/step - loss: 0.5602 - acc: 0.9242 - val_loss: 0.6726 - val_acc: 0.8913
Epoch 294/1000
91s 182ms/step - loss: 0.5582 - acc: 0.9249 - val_loss: 0.6917 - val_acc: 0.8901
Epoch 295/1000
91s 182ms/step - loss: 0.5559 - acc: 0.9265 - val_loss: 0.6805 - val_acc: 0.8896
Epoch 296/1000
91s 182ms/step - loss: 0.5570 - acc: 0.9265 - val_loss: 0.6315 - val_acc: 0.9039
Epoch 297/1000
91s 182ms/step - loss: 0.5572 - acc: 0.9244 - val_loss: 0.6647 - val_acc: 0.8918
Epoch 298/1000
91s 182ms/step - loss: 0.5555 - acc: 0.9259 - val_loss: 0.6540 - val_acc: 0.8960
Epoch 299/1000
91s 182ms/step - loss: 0.5575 - acc: 0.9266 - val_loss: 0.6648 - val_acc: 0.8941
Epoch 300/1000
91s 182ms/step - loss: 0.5517 - acc: 0.9277 - val_loss: 0.6555 - val_acc: 0.8975
Epoch 301/1000
lr changed to 0.010000000149011612
91s 182ms/step - loss: 0.4683 - acc: 0.9572 - val_loss: 0.5677 - val_acc: 0.9248
Epoch 302/1000
91s 182ms/step - loss: 0.4174 - acc: 0.9735 - val_loss: 0.5622 - val_acc: 0.9256
Epoch 303/1000
91s 182ms/step - loss: 0.3968 - acc: 0.9785 - val_loss: 0.5500 - val_acc: 0.9291
Epoch 304/1000
91s 182ms/step - loss: 0.3806 - acc: 0.9814 - val_loss: 0.5520 - val_acc: 0.9283
Epoch 305/1000
91s 181ms/step - loss: 0.3687 - acc: 0.9832 - val_loss: 0.5442 - val_acc: 0.9306
Epoch 306/1000
91s 181ms/step - loss: 0.3555 - acc: 0.9864 - val_loss: 0.5454 - val_acc: 0.9284
Epoch 307/1000
91s 182ms/step - loss: 0.3485 - acc: 0.9863 - val_loss: 0.5409 - val_acc: 0.9286
Epoch 308/1000
91s 181ms/step - loss: 0.3379 - acc: 0.9885 - val_loss: 0.5383 - val_acc: 0.9305
Epoch 309/1000
91s 181ms/step - loss: 0.3272 - acc: 0.9904 - val_loss: 0.5344 - val_acc: 0.9309
Epoch 310/1000
90s 181ms/step - loss: 0.3213 - acc: 0.9900 - val_loss: 0.5333 - val_acc: 0.9298
Epoch 311/1000
90s 181ms/step - loss: 0.3143 - acc: 0.9909 - val_loss: 0.5365 - val_acc: 0.9283
Epoch 312/1000
90s 181ms/step - loss: 0.3092 - acc: 0.9910 - val_loss: 0.5287 - val_acc: 0.9311
Epoch 313/1000
90s 181ms/step - loss: 0.3006 - acc: 0.9919 - val_loss: 0.5324 - val_acc: 0.9283
Epoch 314/1000
90s 181ms/step - loss: 0.2945 - acc: 0.9916 - val_loss: 0.5286 - val_acc: 0.9300
Epoch 315/1000
90s 181ms/step - loss: 0.2886 - acc: 0.9923 - val_loss: 0.5181 - val_acc: 0.9323
Epoch 316/1000
91s 181ms/step - loss: 0.2823 - acc: 0.9932 - val_loss: 0.5212 - val_acc: 0.9286
Epoch 317/1000
90s 181ms/step - loss: 0.2778 - acc: 0.9930 - val_loss: 0.5182 - val_acc: 0.9296
Epoch 318/1000
91s 181ms/step - loss: 0.2720 - acc: 0.9936 - val_loss: 0.5122 - val_acc: 0.9287
Epoch 319/1000
91s 181ms/step - loss: 0.2662 - acc: 0.9940 - val_loss: 0.5083 - val_acc: 0.9277
Epoch 320/1000
91s 181ms/step - loss: 0.2597 - acc: 0.9944 - val_loss: 0.5018 - val_acc: 0.9315
Epoch 321/1000
91s 181ms/step - loss: 0.2560 - acc: 0.9944 - val_loss: 0.5086 - val_acc: 0.9296
Epoch 322/1000
90s 181ms/step - loss: 0.2526 - acc: 0.9939 - val_loss: 0.5059 - val_acc: 0.9274
Epoch 323/1000
90s 181ms/step - loss: 0.2466 - acc: 0.9945 - val_loss: 0.4991 - val_acc: 0.9302
Epoch 324/1000
90s 181ms/step - loss: 0.2431 - acc: 0.9945 - val_loss: 0.5006 - val_acc: 0.9273
Epoch 325/1000
90s 181ms/step - loss: 0.2384 - acc: 0.9947 - val_loss: 0.4914 - val_acc: 0.9296
Epoch 326/1000
91s 181ms/step - loss: 0.2334 - acc: 0.9948 - val_loss: 0.4938 - val_acc: 0.9291
Epoch 327/1000
91s 181ms/step - loss: 0.2301 - acc: 0.9949 - val_loss: 0.4869 - val_acc: 0.9303
Epoch 328/1000
90s 181ms/step - loss: 0.2253 - acc: 0.9952 - val_loss: 0.4850 - val_acc: 0.9293
Epoch 329/1000
91s 181ms/step - loss: 0.2219 - acc: 0.9953 - val_loss: 0.4858 - val_acc: 0.9272
Epoch 330/1000
90s 181ms/step - loss: 0.2170 - acc: 0.9959 - val_loss: 0.4834 - val_acc: 0.9277
Epoch 331/1000
90s 181ms/step - loss: 0.2140 - acc: 0.9953 - val_loss: 0.4814 - val_acc: 0.9276
Epoch 332/1000
90s 181ms/step - loss: 0.2118 - acc: 0.9951 - val_loss: 0.4767 - val_acc: 0.9273
Epoch 333/1000
90s 181ms/step - loss: 0.2077 - acc: 0.9953 - val_loss: 0.4709 - val_acc: 0.9303
Epoch 334/1000
91s 181ms/step - loss: 0.2042 - acc: 0.9952 - val_loss: 0.4808 - val_acc: 0.9257
Epoch 335/1000
90s 181ms/step - loss: 0.2015 - acc: 0.9951 - val_loss: 0.4691 - val_acc: 0.9287
Epoch 336/1000
90s 181ms/step - loss: 0.1988 - acc: 0.9952 - val_loss: 0.4659 - val_acc: 0.9273
Epoch 337/1000
90s 181ms/step - loss: 0.1930 - acc: 0.9961 - val_loss: 0.4667 - val_acc: 0.9293
Epoch 338/1000
90s 181ms/step - loss: 0.1901 - acc: 0.9961 - val_loss: 0.4559 - val_acc: 0.9299
Epoch 339/1000
90s 181ms/step - loss: 0.1872 - acc: 0.9962 - val_loss: 0.4676 - val_acc: 0.9269
Epoch 340/1000
90s 181ms/step - loss: 0.1890 - acc: 0.9940 - val_loss: 0.4556 - val_acc: 0.9291
Epoch 341/1000
90s 181ms/step - loss: 0.1832 - acc: 0.9954 - val_loss: 0.4552 - val_acc: 0.9268
Epoch 342/1000
90s 181ms/step - loss: 0.1798 - acc: 0.9954 - val_loss: 0.4556 - val_acc: 0.9294
Epoch 343/1000
90s 181ms/step - loss: 0.1782 - acc: 0.9950 - val_loss: 0.4498 - val_acc: 0.9255
Epoch 344/1000
91s 181ms/step - loss: 0.1775 - acc: 0.9943 - val_loss: 0.4522 - val_acc: 0.9263
Epoch 345/1000
90s 181ms/step - loss: 0.1747 - acc: 0.9950 - val_loss: 0.4376 - val_acc: 0.9258
Epoch 346/1000
90s 181ms/step - loss: 0.1702 - acc: 0.9955 - val_loss: 0.4464 - val_acc: 0.9263
Epoch 347/1000
90s 181ms/step - loss: 0.1693 - acc: 0.9949 - val_loss: 0.4515 - val_acc: 0.9269
Epoch 348/1000
90s 181ms/step - loss: 0.1654 - acc: 0.9951 - val_loss: 0.4452 - val_acc: 0.9249
Epoch 349/1000
90s 181ms/step - loss: 0.1649 - acc: 0.9948 - val_loss: 0.4461 - val_acc: 0.9249
Epoch 350/1000
90s 181ms/step - loss: 0.1632 - acc: 0.9944 - val_loss: 0.4301 - val_acc: 0.9291
Epoch 351/1000
91s 181ms/step - loss: 0.1616 - acc: 0.9941 - val_loss: 0.4411 - val_acc: 0.9237
Epoch 352/1000
90s 181ms/step - loss: 0.1594 - acc: 0.9948 - val_loss: 0.4301 - val_acc: 0.9308
Epoch 353/1000
90s 181ms/step - loss: 0.1593 - acc: 0.9937 - val_loss: 0.4230 - val_acc: 0.9265
Epoch 354/1000
90s 181ms/step - loss: 0.1565 - acc: 0.9942 - val_loss: 0.4243 - val_acc: 0.9272
Epoch 355/1000
90s 181ms/step - loss: 0.1532 - acc: 0.9946 - val_loss: 0.4290 - val_acc: 0.9258
Epoch 356/1000
90s 181ms/step - loss: 0.1525 - acc: 0.9945 - val_loss: 0.4171 - val_acc: 0.9294
Epoch 357/1000
90s 181ms/step - loss: 0.1505 - acc: 0.9943 - val_loss: 0.4205 - val_acc: 0.9273
Epoch 358/1000
90s 181ms/step - loss: 0.1481 - acc: 0.9945 - val_loss: 0.4295 - val_acc: 0.9227
Epoch 359/1000
90s 181ms/step - loss: 0.1487 - acc: 0.9938 - val_loss: 0.4185 - val_acc: 0.9248
Epoch 360/1000
91s 181ms/step - loss: 0.1452 - acc: 0.9946 - val_loss: 0.4244 - val_acc: 0.9256
Epoch 361/1000
90s 181ms/step - loss: 0.1481 - acc: 0.9925 - val_loss: 0.4267 - val_acc: 0.9220
Epoch 362/1000
91s 181ms/step - loss: 0.1468 - acc: 0.9929 - val_loss: 0.4009 - val_acc: 0.9265
Epoch 363/1000
90s 181ms/step - loss: 0.1433 - acc: 0.9941 - val_loss: 0.4098 - val_acc: 0.9259
Epoch 364/1000
91s 181ms/step - loss: 0.1441 - acc: 0.9928 - val_loss: 0.4189 - val_acc: 0.9234
Epoch 365/1000
91s 181ms/step - loss: 0.1426 - acc: 0.9934 - val_loss: 0.4099 - val_acc: 0.9251
Epoch 366/1000
90s 181ms/step - loss: 0.1383 - acc: 0.9941 - val_loss: 0.4007 - val_acc: 0.9256
Epoch 367/1000
91s 181ms/step - loss: 0.1395 - acc: 0.9933 - val_loss: 0.3938 - val_acc: 0.9269
Epoch 368/1000
90s 181ms/step - loss: 0.1379 - acc: 0.9934 - val_loss: 0.4024 - val_acc: 0.9253
Epoch 369/1000
90s 181ms/step - loss: 0.1359 - acc: 0.9935 - val_loss: 0.4021 - val_acc: 0.9265
Epoch 370/1000
90s 181ms/step - loss: 0.1370 - acc: 0.9928 - val_loss: 0.3925 - val_acc: 0.9270
Epoch 371/1000
90s 181ms/step - loss: 0.1373 - acc: 0.9924 - val_loss: 0.3932 - val_acc: 0.9259
Epoch 372/1000
90s 181ms/step - loss: 0.1349 - acc: 0.9926 - val_loss: 0.4055 - val_acc: 0.9254
Epoch 373/1000
90s 181ms/step - loss: 0.1342 - acc: 0.9927 - val_loss: 0.3934 - val_acc: 0.9289
Epoch 374/1000
90s 181ms/step - loss: 0.1352 - acc: 0.9919 - val_loss: 0.4131 - val_acc: 0.9225
Epoch 375/1000
91s 181ms/step - loss: 0.1351 - acc: 0.9917 - val_loss: 0.3916 - val_acc: 0.9249
Epoch 376/1000
90s 181ms/step - loss: 0.1317 - acc: 0.9929 - val_loss: 0.4016 - val_acc: 0.9237
Epoch 377/1000
90s 181ms/step - loss: 0.1316 - acc: 0.9930 - val_loss: 0.3906 - val_acc: 0.9259
Epoch 378/1000
90s 181ms/step - loss: 0.1307 - acc: 0.9925 - val_loss: 0.3954 - val_acc: 0.9248
Epoch 379/1000
90s 181ms/step - loss: 0.1328 - acc: 0.9914 - val_loss: 0.3997 - val_acc: 0.9221
Epoch 380/1000
90s 181ms/step - loss: 0.1345 - acc: 0.9902 - val_loss: 0.3934 - val_acc: 0.9260
Epoch 381/1000
90s 181ms/step - loss: 0.1319 - acc: 0.9915 - val_loss: 0.3973 - val_acc: 0.9232
Epoch 382/1000
90s 181ms/step - loss: 0.1307 - acc: 0.9920 - val_loss: 0.4105 - val_acc: 0.9220
Epoch 383/1000
90s 181ms/step - loss: 0.1281 - acc: 0.9924 - val_loss: 0.3980 - val_acc: 0.9242
Epoch 384/1000
90s 181ms/step - loss: 0.1305 - acc: 0.9911 - val_loss: 0.4200 - val_acc: 0.9194
Epoch 385/1000
90s 181ms/step - loss: 0.1311 - acc: 0.9910 - val_loss: 0.4101 - val_acc: 0.9184
Epoch 386/1000
91s 181ms/step - loss: 0.1291 - acc: 0.9913 - val_loss: 0.4074 - val_acc: 0.9225
Epoch 387/1000
90s 181ms/step - loss: 0.1316 - acc: 0.9902 - val_loss: 0.4087 - val_acc: 0.9180
Epoch 388/1000
90s 181ms/step - loss: 0.1306 - acc: 0.9906 - val_loss: 0.4021 - val_acc: 0.9192
Epoch 389/1000
90s 181ms/step - loss: 0.1295 - acc: 0.9910 - val_loss: 0.3877 - val_acc: 0.9250
Epoch 390/1000
90s 181ms/step - loss: 0.1285 - acc: 0.9913 - val_loss: 0.3914 - val_acc: 0.9208
Epoch 391/1000
90s 181ms/step - loss: 0.1284 - acc: 0.9911 - val_loss: 0.3887 - val_acc: 0.9221
Epoch 392/1000
90s 181ms/step - loss: 0.1289 - acc: 0.9911 - val_loss: 0.3992 - val_acc: 0.9262
Epoch 393/1000
90s 181ms/step - loss: 0.1265 - acc: 0.9919 - val_loss: 0.4006 - val_acc: 0.9213
Epoch 394/1000
90s 181ms/step - loss: 0.1261 - acc: 0.9911 - val_loss: 0.3943 - val_acc: 0.9238
Epoch 395/1000
90s 181ms/step - loss: 0.1277 - acc: 0.9908 - val_loss: 0.3963 - val_acc: 0.9236
Epoch 396/1000
90s 181ms/step - loss: 0.1286 - acc: 0.9902 - val_loss: 0.4147 - val_acc: 0.9194
Epoch 397/1000
90s 181ms/step - loss: 0.1309 - acc: 0.9894 - val_loss: 0.3996 - val_acc: 0.9192
Epoch 398/1000
91s 181ms/step - loss: 0.1268 - acc: 0.9912 - val_loss: 0.3952 - val_acc: 0.9225
Epoch 399/1000
90s 181ms/step - loss: 0.1255 - acc: 0.9911 - val_loss: 0.4084 - val_acc: 0.9204
Epoch 400/1000
91s 181ms/step - loss: 0.1268 - acc: 0.9902 - val_loss: 0.3954 - val_acc: 0.9209
Epoch 401/1000
90s 181ms/step - loss: 0.1263 - acc: 0.9902 - val_loss: 0.4022 - val_acc: 0.9224
Epoch 402/1000
90s 181ms/step - loss: 0.1270 - acc: 0.9904 - val_loss: 0.3891 - val_acc: 0.9246
Epoch 403/1000
90s 181ms/step - loss: 0.1272 - acc: 0.9899 - val_loss: 0.4038 - val_acc: 0.9202
Epoch 404/1000
91s 181ms/step - loss: 0.1307 - acc: 0.9885 - val_loss: 0.4022 - val_acc: 0.9205
Epoch 405/1000
91s 181ms/step - loss: 0.1298 - acc: 0.9891 - val_loss: 0.3900 - val_acc: 0.9213
Epoch 406/1000
90s 181ms/step - loss: 0.1277 - acc: 0.9897 - val_loss: 0.3946 - val_acc: 0.9209
Epoch 407/1000
90s 181ms/step - loss: 0.1257 - acc: 0.9905 - val_loss: 0.3962 - val_acc: 0.9216
Epoch 408/1000
91s 181ms/step - loss: 0.1262 - acc: 0.9906 - val_loss: 0.4070 - val_acc: 0.9205
Epoch 409/1000
90s 181ms/step - loss: 0.1273 - acc: 0.9899 - val_loss: 0.3869 - val_acc: 0.9249
Epoch 410/1000
90s 181ms/step - loss: 0.1268 - acc: 0.9902 - val_loss: 0.4044 - val_acc: 0.9201
Epoch 411/1000
90s 181ms/step - loss: 0.1264 - acc: 0.9900 - val_loss: 0.4039 - val_acc: 0.9214
Epoch 412/1000
91s 181ms/step - loss: 0.1278 - acc: 0.9896 - val_loss: 0.4072 - val_acc: 0.9187
Epoch 413/1000
90s 181ms/step - loss: 0.1267 - acc: 0.9900 - val_loss: 0.4132 - val_acc: 0.9174
Epoch 414/1000
90s 181ms/step - loss: 0.1294 - acc: 0.9890 - val_loss: 0.3933 - val_acc: 0.9214
Epoch 415/1000
90s 181ms/step - loss: 0.1236 - acc: 0.9911 - val_loss: 0.4097 - val_acc: 0.9205
Epoch 416/1000
90s 181ms/step - loss: 0.1279 - acc: 0.9896 - val_loss: 0.3939 - val_acc: 0.9206
Epoch 417/1000
90s 181ms/step - loss: 0.1243 - acc: 0.9907 - val_loss: 0.4011 - val_acc: 0.9213
Epoch 418/1000
90s 181ms/step - loss: 0.1255 - acc: 0.9904 - val_loss: 0.4279 - val_acc: 0.9141
Epoch 419/1000
91s 181ms/step - loss: 0.1267 - acc: 0.9905 - val_loss: 0.4297 - val_acc: 0.9130
Epoch 420/1000
90s 181ms/step - loss: 0.1245 - acc: 0.9907 - val_loss: 0.4141 - val_acc: 0.9166
Epoch 421/1000
90s 181ms/step - loss: 0.1270 - acc: 0.9897 - val_loss: 0.3903 - val_acc: 0.9203
Epoch 422/1000
90s 181ms/step - loss: 0.1213 - acc: 0.9916 - val_loss: 0.4057 - val_acc: 0.9199
Epoch 423/1000
91s 181ms/step - loss: 0.1213 - acc: 0.9915 - val_loss: 0.3929 - val_acc: 0.9192
Epoch 424/1000
90s 181ms/step - loss: 0.1215 - acc: 0.9916 - val_loss: 0.3834 - val_acc: 0.9251
Epoch 425/1000
90s 181ms/step - loss: 0.1224 - acc: 0.9905 - val_loss: 0.4071 - val_acc: 0.9215
Epoch 426/1000
91s 181ms/step - loss: 0.1280 - acc: 0.9891 - val_loss: 0.4023 - val_acc: 0.9208
Epoch 427/1000
91s 181ms/step - loss: 0.1274 - acc: 0.9893 - val_loss: 0.3839 - val_acc: 0.9223
Epoch 428/1000
90s 181ms/step - loss: 0.1244 - acc: 0.9904 - val_loss: 0.3948 - val_acc: 0.9215
Epoch 429/1000
90s 181ms/step - loss: 0.1247 - acc: 0.9899 - val_loss: 0.4135 - val_acc: 0.9181
Epoch 430/1000
91s 181ms/step - loss: 0.1218 - acc: 0.9915 - val_loss: 0.3810 - val_acc: 0.9256
Epoch 431/1000
90s 181ms/step - loss: 0.1230 - acc: 0.9905 - val_loss: 0.3961 - val_acc: 0.9203
Epoch 432/1000
91s 182ms/step - loss: 0.1262 - acc: 0.9894 - val_loss: 0.3939 - val_acc: 0.9213
Epoch 433/1000
91s 182ms/step - loss: 0.1273 - acc: 0.9889 - val_loss: 0.4070 - val_acc: 0.9139
Epoch 434/1000
91s 182ms/step - loss: 0.1228 - acc: 0.9911 - val_loss: 0.3896 - val_acc: 0.9214
Epoch 435/1000
91s 182ms/step - loss: 0.1252 - acc: 0.9900 - val_loss: 0.3858 - val_acc: 0.9217
Epoch 436/1000
91s 182ms/step - loss: 0.1246 - acc: 0.9905 - val_loss: 0.3926 - val_acc: 0.9214
Epoch 437/1000
91s 182ms/step - loss: 0.1254 - acc: 0.9897 - val_loss: 0.3927 - val_acc: 0.9247
Epoch 438/1000
91s 182ms/step - loss: 0.1238 - acc: 0.9903 - val_loss: 0.4091 - val_acc: 0.9155
Epoch 439/1000
91s 182ms/step - loss: 0.1259 - acc: 0.9895 - val_loss: 0.4237 - val_acc: 0.9116
Epoch 440/1000
91s 182ms/step - loss: 0.1263 - acc: 0.9896 - val_loss: 0.4008 - val_acc: 0.9178
Epoch 441/1000
91s 182ms/step - loss: 0.1268 - acc: 0.9892 - val_loss: 0.4129 - val_acc: 0.9141
Epoch 442/1000
91s 182ms/step - loss: 0.1261 - acc: 0.9902 - val_loss: 0.3831 - val_acc: 0.9238
Epoch 443/1000
91s 182ms/step - loss: 0.1234 - acc: 0.9905 - val_loss: 0.4066 - val_acc: 0.9175
Epoch 444/1000
91s 182ms/step - loss: 0.1258 - acc: 0.9903 - val_loss: 0.4081 - val_acc: 0.9177
Epoch 445/1000
91s 182ms/step - loss: 0.1279 - acc: 0.9889 - val_loss: 0.3980 - val_acc: 0.9208
Epoch 446/1000
91s 181ms/step - loss: 0.1257 - acc: 0.9896 - val_loss: 0.3887 - val_acc: 0.9220
Epoch 447/1000
91s 182ms/step - loss: 0.1240 - acc: 0.9905 - val_loss: 0.4044 - val_acc: 0.9180
Epoch 448/1000
91s 182ms/step - loss: 0.1270 - acc: 0.9895 - val_loss: 0.4061 - val_acc: 0.9189
Epoch 449/1000
91s 182ms/step - loss: 0.1229 - acc: 0.9911 - val_loss: 0.3971 - val_acc: 0.9220
Epoch 450/1000
91s 182ms/step - loss: 0.1217 - acc: 0.9918 - val_loss: 0.4036 - val_acc: 0.9227
Epoch 451/1000
91s 182ms/step - loss: 0.1240 - acc: 0.9906 - val_loss: 0.4011 - val_acc: 0.9216
Epoch 452/1000
91s 182ms/step - loss: 0.1239 - acc: 0.9901 - val_loss: 0.4079 - val_acc: 0.9173
Epoch 453/1000
91s 182ms/step - loss: 0.1224 - acc: 0.9906 - val_loss: 0.3917 - val_acc: 0.9240
Epoch 454/1000
91s 182ms/step - loss: 0.1265 - acc: 0.9891 - val_loss: 0.3877 - val_acc: 0.9235
Epoch 455/1000
91s 182ms/step - loss: 0.1233 - acc: 0.9910 - val_loss: 0.4031 - val_acc: 0.9177
Epoch 456/1000
91s 182ms/step - loss: 0.1239 - acc: 0.9904 - val_loss: 0.4203 - val_acc: 0.9185
Epoch 457/1000
91s 182ms/step - loss: 0.1240 - acc: 0.9905 - val_loss: 0.3918 - val_acc: 0.9247
Epoch 458/1000
91s 182ms/step - loss: 0.1247 - acc: 0.9898 - val_loss: 0.4155 - val_acc: 0.9176
Epoch 459/1000
91s 182ms/step - loss: 0.1239 - acc: 0.9900 - val_loss: 0.3980 - val_acc: 0.9207
Epoch 460/1000
91s 182ms/step - loss: 0.1296 - acc: 0.9881 - val_loss: 0.3954 - val_acc: 0.9190
Epoch 461/1000
91s 182ms/step - loss: 0.1232 - acc: 0.9908 - val_loss: 0.4039 - val_acc: 0.9223
Epoch 462/1000
91s 182ms/step - loss: 0.1283 - acc: 0.9888 - val_loss: 0.4285 - val_acc: 0.9136
Epoch 463/1000
91s 182ms/step - loss: 0.1264 - acc: 0.9899 - val_loss: 0.4025 - val_acc: 0.9191
Epoch 464/1000
91s 181ms/step - loss: 0.1236 - acc: 0.9909 - val_loss: 0.3952 - val_acc: 0.9205
Epoch 465/1000
91s 182ms/step - loss: 0.1204 - acc: 0.9921 - val_loss: 0.4008 - val_acc: 0.9207
Epoch 466/1000
90s 181ms/step - loss: 0.1233 - acc: 0.9905 - val_loss: 0.4098 - val_acc: 0.9158
Epoch 467/1000
91s 182ms/step - loss: 0.1207 - acc: 0.9916 - val_loss: 0.4012 - val_acc: 0.9160
Epoch 468/1000
91s 182ms/step - loss: 0.1231 - acc: 0.9910 - val_loss: 0.3880 - val_acc: 0.9248
Epoch 469/1000
91s 182ms/step - loss: 0.1241 - acc: 0.9900 - val_loss: 0.4136 - val_acc: 0.9175
Epoch 470/1000
91s 182ms/step - loss: 0.1255 - acc: 0.9894 - val_loss: 0.4084 - val_acc: 0.9202
Epoch 471/1000
91s 182ms/step - loss: 0.1253 - acc: 0.9902 - val_loss: 0.3892 - val_acc: 0.9225
Epoch 472/1000
91s 182ms/step - loss: 0.1269 - acc: 0.9891 - val_loss: 0.4101 - val_acc: 0.9201
Epoch 473/1000
91s 182ms/step - loss: 0.1226 - acc: 0.9913 - val_loss: 0.4143 - val_acc: 0.9167
Epoch 474/1000
91s 182ms/step - loss: 0.1230 - acc: 0.9911 - val_loss: 0.4019 - val_acc: 0.9184
Epoch 475/1000
91s 182ms/step - loss: 0.1242 - acc: 0.9902 - val_loss: 0.4229 - val_acc: 0.9181
Epoch 476/1000
91s 182ms/step - loss: 0.1251 - acc: 0.9905 - val_loss: 0.3879 - val_acc: 0.9241
Epoch 477/1000
91s 182ms/step - loss: 0.1243 - acc: 0.9899 - val_loss: 0.4191 - val_acc: 0.9172
Epoch 478/1000
91s 182ms/step - loss: 0.1240 - acc: 0.9907 - val_loss: 0.3942 - val_acc: 0.9230
Epoch 479/1000
91s 182ms/step - loss: 0.1230 - acc: 0.9909 - val_loss: 0.3843 - val_acc: 0.9274
Epoch 480/1000
91s 182ms/step - loss: 0.1207 - acc: 0.9918 - val_loss: 0.4098 - val_acc: 0.9196
Epoch 481/1000
91s 182ms/step - loss: 0.1244 - acc: 0.9905 - val_loss: 0.4048 - val_acc: 0.9172
Epoch 482/1000
91s 182ms/step - loss: 0.1250 - acc: 0.9902 - val_loss: 0.4160 - val_acc: 0.9203
Epoch 483/1000
91s 182ms/step - loss: 0.1226 - acc: 0.9908 - val_loss: 0.4054 - val_acc: 0.9196
Epoch 484/1000
91s 182ms/step - loss: 0.1206 - acc: 0.9917 - val_loss: 0.4020 - val_acc: 0.9218
Epoch 485/1000
91s 182ms/step - loss: 0.1277 - acc: 0.9889 - val_loss: 0.3926 - val_acc: 0.9208
Epoch 486/1000
91s 182ms/step - loss: 0.1244 - acc: 0.9901 - val_loss: 0.3976 - val_acc: 0.9179
Epoch 487/1000
91s 182ms/step - loss: 0.1216 - acc: 0.9915 - val_loss: 0.4025 - val_acc: 0.9215
Epoch 488/1000
91s 182ms/step - loss: 0.1231 - acc: 0.9908 - val_loss: 0.4037 - val_acc: 0.9223
Epoch 489/1000
91s 182ms/step - loss: 0.1259 - acc: 0.9901 - val_loss: 0.4109 - val_acc: 0.9187
Epoch 490/1000
91s 182ms/step - loss: 0.1247 - acc: 0.9908 - val_loss: 0.4085 - val_acc: 0.9206
Epoch 491/1000
91s 182ms/step - loss: 0.1214 - acc: 0.9916 - val_loss: 0.4054 - val_acc: 0.9242
Epoch 492/1000
91s 181ms/step - loss: 0.1253 - acc: 0.9898 - val_loss: 0.4153 - val_acc: 0.9198
Epoch 493/1000
91s 181ms/step - loss: 0.1203 - acc: 0.9919 - val_loss: 0.3971 - val_acc: 0.9212
Epoch 494/1000
91s 181ms/step - loss: 0.1243 - acc: 0.9903 - val_loss: 0.4066 - val_acc: 0.9195
Epoch 495/1000
91s 181ms/step - loss: 0.1256 - acc: 0.9904 - val_loss: 0.4061 - val_acc: 0.9199
Epoch 496/1000
91s 181ms/step - loss: 0.1266 - acc: 0.9890 - val_loss: 0.3927 - val_acc: 0.9221
Epoch 497/1000
90s 181ms/step - loss: 0.1256 - acc: 0.9897 - val_loss: 0.4104 - val_acc: 0.9210
Epoch 498/1000
90s 181ms/step - loss: 0.1218 - acc: 0.9909 - val_loss: 0.4074 - val_acc: 0.9176
Epoch 499/1000
90s 181ms/step - loss: 0.1225 - acc: 0.9907 - val_loss: 0.3931 - val_acc: 0.9219
Epoch 500/1000
90s 181ms/step - loss: 0.1238 - acc: 0.9908 - val_loss: 0.3940 - val_acc: 0.9262
Epoch 501/1000
90s 181ms/step - loss: 0.1217 - acc: 0.9912 - val_loss: 0.4017 - val_acc: 0.9238
Epoch 502/1000
91s 181ms/step - loss: 0.1239 - acc: 0.9906 - val_loss: 0.4000 - val_acc: 0.9217
Epoch 503/1000
91s 181ms/step - loss: 0.1219 - acc: 0.9915 - val_loss: 0.4070 - val_acc: 0.9199
Epoch 504/1000
91s 182ms/step - loss: 0.1237 - acc: 0.9907 - val_loss: 0.4045 - val_acc: 0.9205
Epoch 505/1000
91s 182ms/step - loss: 0.1291 - acc: 0.9884 - val_loss: 0.3828 - val_acc: 0.9203
Epoch 506/1000
91s 182ms/step - loss: 0.1250 - acc: 0.9899 - val_loss: 0.4053 - val_acc: 0.9232
Epoch 507/1000
91s 182ms/step - loss: 0.1248 - acc: 0.9907 - val_loss: 0.4098 - val_acc: 0.9204
Epoch 508/1000
91s 182ms/step - loss: 0.1212 - acc: 0.9920 - val_loss: 0.3999 - val_acc: 0.9222
Epoch 509/1000
91s 182ms/step - loss: 0.1223 - acc: 0.9918 - val_loss: 0.4083 - val_acc: 0.9183
Epoch 510/1000
91s 182ms/step - loss: 0.1250 - acc: 0.9900 - val_loss: 0.3959 - val_acc: 0.9209
Epoch 511/1000
91s 182ms/step - loss: 0.1190 - acc: 0.9919 - val_loss: 0.4029 - val_acc: 0.9237
Epoch 512/1000
91s 182ms/step - loss: 0.1191 - acc: 0.9924 - val_loss: 0.4040 - val_acc: 0.9221
Epoch 513/1000
91s 182ms/step - loss: 0.1229 - acc: 0.9906 - val_loss: 0.3949 - val_acc: 0.9251
Epoch 514/1000
91s 182ms/step - loss: 0.1263 - acc: 0.9895 - val_loss: 0.4191 - val_acc: 0.9186
Epoch 515/1000
91s 182ms/step - loss: 0.1240 - acc: 0.9904 - val_loss: 0.3939 - val_acc: 0.9208
Epoch 516/1000
91s 181ms/step - loss: 0.1240 - acc: 0.9906 - val_loss: 0.3991 - val_acc: 0.9181
Epoch 517/1000
91s 181ms/step - loss: 0.1209 - acc: 0.9915 - val_loss: 0.3953 - val_acc: 0.9216
Epoch 518/1000
91s 182ms/step - loss: 0.1215 - acc: 0.9910 - val_loss: 0.4056 - val_acc: 0.9219
Epoch 519/1000
91s 182ms/step - loss: 0.1232 - acc: 0.9905 - val_loss: 0.4092 - val_acc: 0.9187
Epoch 520/1000
91s 182ms/step - loss: 0.1252 - acc: 0.9899 - val_loss: 0.4108 - val_acc: 0.9190
Epoch 521/1000
91s 182ms/step - loss: 0.1215 - acc: 0.9912 - val_loss: 0.4031 - val_acc: 0.9191
Epoch 522/1000
91s 182ms/step - loss: 0.1236 - acc: 0.9903 - val_loss: 0.3995 - val_acc: 0.9201
Epoch 523/1000
91s 182ms/step - loss: 0.1226 - acc: 0.9916 - val_loss: 0.3823 - val_acc: 0.9264
Epoch 524/1000
91s 182ms/step - loss: 0.1229 - acc: 0.9913 - val_loss: 0.3882 - val_acc: 0.9237

I had intended to check this morning how the run was going, only to find that Spyder had, for some unknown reason, quit on its own.

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, DOI: 10.1109/TIE.2020.2972458, Date of Publication: 13 February 2020

https://ieeexplore.ieee.org/document/8998530

This article is a repost. In case of infringement, please contact cloudcommunity@tencent.com for removal.