
Deep Residual Network + Adaptively Parametric ReLU Activation Function (Tuning Log 4)

用户6915903
Modified 2020-04-23 10:22:22
Published in the column: Deep Neural Networks

Continued from the previous post:

Deep Residual Network + Adaptively Parametric ReLU Activation Function (Tuning Log 3)

https://blog.csdn.net/dangqing1988/article/details/105601313

This post again applies the adaptively parametric ReLU (APReLU) activation function in a deep residual network and continues to test it on the CIFAR-10 image dataset. The difference from the previous post is the structure inside the residual block: the original pair of 3×3 convolutional layers is replaced with a 1×1 → 3×3 → 1×1 bottleneck, so the network becomes deeper while the number of parameters shrinks.
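As a sanity check on the parameter savings, here is a quick back-of-the-envelope count of the convolution weights in one block (an illustrative sketch that ignores biases, batch normalization, and the shortcut; `params_basic` and `params_bottleneck` are hypothetical helper names, not part of the training script):

```python
# Weight count for one residual block with C input and C output channels.

def params_basic(c):
    # two 3x3 convolutions: C -> C -> C
    return 3 * 3 * c * c + 3 * 3 * c * c

def params_bottleneck(c):
    # 1x1 (C -> C/4), 3x3 (C/4 -> C/4), 1x1 (C/4 -> C),
    # matching the residual_block() in the script below
    mid = c // 4
    return 1 * 1 * c * mid + 3 * 3 * mid * mid + 1 * 1 * mid * c

for c in (16, 32, 64):
    print(c, params_basic(c), params_bottleneck(c))
```

Even though the bottleneck has three convolutions instead of two, shrinking the middle channels to C/4 makes it roughly 17× cheaper in weights per block.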

APReLU is an improved version of the Parametric ReLU:

[Figure: the adaptively parametric ReLU (APReLU) activation function]
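The idea can be sketched outside Keras as well. Below is a minimal NumPy forward pass of APReLU under simplifying assumptions (batch normalization omitted; `w1` and `w2` are random stand-ins for the learned dense-layer weights):

```python
import numpy as np

# Minimal APReLU forward pass for one feature map of shape (H, W, C):
# y = max(x, 0) + alpha * min(x, 0), where the per-channel slope alpha
# is predicted by a small FC net from global-average-pooled statistics.

def aprelu_forward(x, w1, w2):
    pos = np.maximum(x, 0.0)                       # positive part
    neg = np.minimum(x, 0.0)                       # negative part
    gap = np.concatenate([neg.mean(axis=(0, 1)),
                          pos.mean(axis=(0, 1))])  # (2C,) channel statistics
    h = np.maximum(gap @ w1, 0.0)                  # hidden layer with ReLU
    alpha = 1.0 / (1.0 + np.exp(-(h @ w2)))        # sigmoid -> slopes in (0, 1)
    return pos + alpha * neg                       # PReLU with adaptive slopes

rng = np.random.default_rng(0)
c = 8
x = rng.normal(size=(4, 4, c))
w1 = rng.normal(size=(2 * c, c // 4))              # 2C -> C/4, as in the script
w2 = rng.normal(size=(c // 4, c))                  # C/4 -> C
y = aprelu_forward(x, w1, w2)
print(y.shape)
```

Positive activations pass through unchanged; negative ones are scaled by a slope between 0 and 1 that depends on the input's own statistics, which is what makes the ReLU "adaptively parametric".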

The full Keras code is as follows:

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Apr 14 04:17:45 2020
Implemented using TensorFlow 1.0.1 and Keras 2.2.1

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht,
Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, 
IEEE Transactions on Industrial Electronics, 2020,  DOI: 10.1109/TIE.2020.2972458 

@author: Minghang Zhao
"""

from __future__ import print_function
import keras
import numpy as np
from keras.datasets import cifar10
from keras.layers import Dense, Conv2D, BatchNormalization, Activation, Minimum
from keras.layers import AveragePooling2D, Input, GlobalAveragePooling2D, Concatenate, Reshape
from keras.regularizers import l2
from keras import backend as K
from keras.models import Model
from keras import optimizers
from keras.preprocessing.image import ImageDataGenerator
from keras.callbacks import LearningRateScheduler
K.set_learning_phase(1)

# The data, split between train and test sets
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

# Normalize to [0, 1] and subtract the training-set mean
x_train = x_train.astype('float32') / 255.
x_test = x_test.astype('float32') / 255.
x_test = x_test-np.mean(x_train)
x_train = x_train-np.mean(x_train)
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

# Schedule the learning rate: multiply by 0.1 every 200 epochs
def scheduler(epoch):
    if epoch % 200 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.1)
        print("lr changed to {}".format(lr * 0.1))
    return K.get_value(model.optimizer.lr)

# An adaptively parametric rectifier linear unit (APReLU)
def aprelu(inputs):
    # get the number of channels
    channels = inputs.get_shape().as_list()[-1]
    # get a zero feature map
    zeros_input = keras.layers.subtract([inputs, inputs])
    # get a feature map with only positive features
    pos_input = Activation('relu')(inputs)
    # get a feature map with only negative features
    neg_input = Minimum()([inputs,zeros_input])
    # define a network to obtain the scaling coefficients
    scales_p = GlobalAveragePooling2D()(pos_input)
    scales_n = GlobalAveragePooling2D()(neg_input)
    scales = Concatenate()([scales_n, scales_p])
    scales = Dense(channels//4, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('relu')(scales)
    scales = Dense(channels, activation='linear', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(scales)
    scales = BatchNormalization()(scales)
    scales = Activation('sigmoid')(scales)
    scales = Reshape((1,1,channels))(scales)
    # apply the parametric relu
    neg_part = keras.layers.multiply([scales, neg_input])
    return keras.layers.add([pos_input, neg_part])

# Residual Block
def residual_block(incoming, nb_blocks, out_channels, downsample=False,
                   downsample_strides=2):
    
    residual = incoming
    in_channels = incoming.get_shape().as_list()[-1]
    
    for i in range(nb_blocks):
        
        identity = residual
        
        if not downsample:
            downsample_strides = 1
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels//4, 1, strides=(downsample_strides, downsample_strides), 
                          padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels//4, 3, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        residual = BatchNormalization()(residual)
        residual = aprelu(residual)
        residual = Conv2D(out_channels, 1, padding='same', kernel_initializer='he_normal', 
                          kernel_regularizer=l2(1e-4))(residual)
        
        # Downsampling
        if downsample_strides > 1:
            identity = AveragePooling2D(pool_size=(1,1), strides=(2,2))(identity)
            
        # Zero_padding to match channels
        if in_channels != out_channels:
            zeros_identity = keras.layers.subtract([identity, identity])
            identity = keras.layers.concatenate([identity, zeros_identity])
            in_channels = out_channels
        
        residual = keras.layers.add([residual, identity])
    
    return residual


# define and train a model
inputs = Input(shape=(32, 32, 3))
net = Conv2D(16, 3, padding='same', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(inputs)
net = residual_block(net, 9, 16, downsample=False)
net = residual_block(net, 1, 32, downsample=True)
net = residual_block(net, 8, 32, downsample=False)
net = residual_block(net, 1, 64, downsample=True)
net = residual_block(net, 8, 64, downsample=False)
net = BatchNormalization()(net)
net = aprelu(net)
net = GlobalAveragePooling2D()(net)
outputs = Dense(10, activation='softmax', kernel_initializer='he_normal', kernel_regularizer=l2(1e-4))(net)
model = Model(inputs=inputs, outputs=outputs)
sgd = optimizers.SGD(lr=0.1, decay=0., momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])

# data augmentation
datagen = ImageDataGenerator(
    # randomly rotate images in the range (deg 0 to 30)
    rotation_range=30,
    # randomly flip images
    horizontal_flip=True,
    # randomly shift images horizontally
    width_shift_range=0.125,
    # randomly shift images vertically
    height_shift_range=0.125)

reduce_lr = LearningRateScheduler(scheduler)
# fit the model on the batches generated by datagen.flow().
model.fit_generator(datagen.flow(x_train, y_train, batch_size=100),
                    validation_data=(x_test, y_test), epochs=500, 
                    verbose=1, callbacks=[reduce_lr], workers=4)

# get results
K.set_learning_phase(0)
DRSN_train_score1 = model.evaluate(x_train, y_train, batch_size=100, verbose=0)
print('Train loss:', DRSN_train_score1[0])
print('Train accuracy:', DRSN_train_score1[1])
DRSN_test_score1 = model.evaluate(x_test, y_test, batch_size=100, verbose=0)
print('Test loss:', DRSN_test_score1[0])
print('Test accuracy:', DRSN_test_score1[1])
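For reference, the step schedule that `scheduler()` implements amounts to the following closed form (a sketch assuming Keras passes 0-indexed epoch numbers to the callback; `lr_at` is a hypothetical helper, not part of the training script):

```python
# Step schedule: start at 0.1, multiply by 0.1 at epochs 200 and 400
# (0-indexed), i.e. at displayed epochs 201 and 401 in the log below.

def lr_at(epoch, base_lr=0.1):
    return base_lr * (0.1 ** (epoch // 200))

print(lr_at(0), lr_at(199), lr_at(200), lr_at(400))
```

This matches the training log, where the rate drops to roughly 0.001 at displayed epoch 401.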

The training log is as follows:

Epoch 301/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2765 - acc: 0.9388 - val_loss: 0.4537 - val_acc: 0.8881
Epoch 302/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2759 - acc: 0.9391 - val_loss: 0.4594 - val_acc: 0.8895
Epoch 303/500
500/500 [==============================] - 76s 151ms/step - loss: 0.2822 - acc: 0.9362 - val_loss: 0.4455 - val_acc: 0.8922
Epoch 304/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2811 - acc: 0.9361 - val_loss: 0.4593 - val_acc: 0.8870
Epoch 305/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2761 - acc: 0.9382 - val_loss: 0.4599 - val_acc: 0.8872
Epoch 306/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2753 - acc: 0.9392 - val_loss: 0.4532 - val_acc: 0.8913
Epoch 307/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2776 - acc: 0.9393 - val_loss: 0.4373 - val_acc: 0.8916
Epoch 308/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2750 - acc: 0.9388 - val_loss: 0.4406 - val_acc: 0.8915
Epoch 309/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2778 - acc: 0.9380 - val_loss: 0.4662 - val_acc: 0.8832
Epoch 310/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2790 - acc: 0.9384 - val_loss: 0.4385 - val_acc: 0.8960
Epoch 311/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2772 - acc: 0.9388 - val_loss: 0.4503 - val_acc: 0.8899
Epoch 312/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2776 - acc: 0.9388 - val_loss: 0.4423 - val_acc: 0.8938
Epoch 313/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2786 - acc: 0.9379 - val_loss: 0.4404 - val_acc: 0.8951
Epoch 314/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2767 - acc: 0.9388 - val_loss: 0.4483 - val_acc: 0.8899
Epoch 315/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2741 - acc: 0.9412 - val_loss: 0.4484 - val_acc: 0.8885
Epoch 316/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2796 - acc: 0.9371 - val_loss: 0.4526 - val_acc: 0.8883
Epoch 317/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2751 - acc: 0.9394 - val_loss: 0.4552 - val_acc: 0.8874
Epoch 318/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2775 - acc: 0.9387 - val_loss: 0.4464 - val_acc: 0.8905
Epoch 319/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2762 - acc: 0.9388 - val_loss: 0.4523 - val_acc: 0.8889
Epoch 320/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2757 - acc: 0.9383 - val_loss: 0.4490 - val_acc: 0.8901
Epoch 321/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2732 - acc: 0.9385 - val_loss: 0.4538 - val_acc: 0.8853
Epoch 322/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2812 - acc: 0.9377 - val_loss: 0.4450 - val_acc: 0.8909
Epoch 323/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2740 - acc: 0.9388 - val_loss: 0.4530 - val_acc: 0.8868
Epoch 324/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2730 - acc: 0.9391 - val_loss: 0.4544 - val_acc: 0.8882
Epoch 325/500
500/500 [==============================] - 77s 153ms/step - loss: 0.2786 - acc: 0.9385 - val_loss: 0.4564 - val_acc: 0.8881
Epoch 326/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2793 - acc: 0.9385 - val_loss: 0.4503 - val_acc: 0.8900
Epoch 327/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2764 - acc: 0.9384 - val_loss: 0.4602 - val_acc: 0.8867
Epoch 328/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2771 - acc: 0.9386 - val_loss: 0.4446 - val_acc: 0.8888
Epoch 329/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2764 - acc: 0.9375 - val_loss: 0.4495 - val_acc: 0.8892
Epoch 330/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2773 - acc: 0.9389 - val_loss: 0.4532 - val_acc: 0.8876
Epoch 331/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2751 - acc: 0.9399 - val_loss: 0.4550 - val_acc: 0.8890
Epoch 332/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2720 - acc: 0.9395 - val_loss: 0.4577 - val_acc: 0.8870
Epoch 333/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2713 - acc: 0.9412 - val_loss: 0.4565 - val_acc: 0.8884
Epoch 334/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2731 - acc: 0.9399 - val_loss: 0.4496 - val_acc: 0.8904
Epoch 335/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2695 - acc: 0.9412 - val_loss: 0.4491 - val_acc: 0.8877
Epoch 336/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2715 - acc: 0.9403 - val_loss: 0.4476 - val_acc: 0.8909
Epoch 337/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2777 - acc: 0.9365 - val_loss: 0.4533 - val_acc: 0.8889
Epoch 338/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2727 - acc: 0.9411 - val_loss: 0.4648 - val_acc: 0.8854
Epoch 339/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2712 - acc: 0.9411 - val_loss: 0.4701 - val_acc: 0.8873
Epoch 340/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2736 - acc: 0.9398 - val_loss: 0.4632 - val_acc: 0.8874
Epoch 341/500
500/500 [==============================] - 77s 153ms/step - loss: 0.2749 - acc: 0.9389 - val_loss: 0.4607 - val_acc: 0.8841
Epoch 342/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2697 - acc: 0.9409 - val_loss: 0.4659 - val_acc: 0.8851
Epoch 343/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2761 - acc: 0.9391 - val_loss: 0.4545 - val_acc: 0.8854
Epoch 344/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2709 - acc: 0.9410 - val_loss: 0.4563 - val_acc: 0.8860
Epoch 345/500
500/500 [==============================] - 77s 153ms/step - loss: 0.2746 - acc: 0.9391 - val_loss: 0.4578 - val_acc: 0.8874
Epoch 346/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2726 - acc: 0.9406 - val_loss: 0.4714 - val_acc: 0.8847
Epoch 347/500
500/500 [==============================] - 77s 153ms/step - loss: 0.2713 - acc: 0.9406 - val_loss: 0.4648 - val_acc: 0.8848
Epoch 348/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2745 - acc: 0.9401 - val_loss: 0.4541 - val_acc: 0.8875
Epoch 349/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2688 - acc: 0.9421 - val_loss: 0.4635 - val_acc: 0.8840
Epoch 350/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2736 - acc: 0.9412 - val_loss: 0.4625 - val_acc: 0.8850
Epoch 351/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2721 - acc: 0.9406 - val_loss: 0.4726 - val_acc: 0.8818
Epoch 352/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2756 - acc: 0.9399 - val_loss: 0.4567 - val_acc: 0.8870
Epoch 353/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2715 - acc: 0.9408 - val_loss: 0.4589 - val_acc: 0.8879
Epoch 354/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2714 - acc: 0.9402 - val_loss: 0.4720 - val_acc: 0.8838
Epoch 355/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2727 - acc: 0.9398 - val_loss: 0.4646 - val_acc: 0.8861
Epoch 356/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2726 - acc: 0.9416 - val_loss: 0.4490 - val_acc: 0.8886
Epoch 357/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2715 - acc: 0.9413 - val_loss: 0.4559 - val_acc: 0.8879
Epoch 358/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2711 - acc: 0.9414 - val_loss: 0.4723 - val_acc: 0.8867
Epoch 359/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2719 - acc: 0.9407 - val_loss: 0.4639 - val_acc: 0.8857
Epoch 360/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2745 - acc: 0.9398 - val_loss: 0.4669 - val_acc: 0.8851
Epoch 361/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2690 - acc: 0.9413 - val_loss: 0.4633 - val_acc: 0.8860
Epoch 362/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2701 - acc: 0.9415 - val_loss: 0.4719 - val_acc: 0.8860
Epoch 363/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2712 - acc: 0.9421 - val_loss: 0.4661 - val_acc: 0.8850
Epoch 364/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2747 - acc: 0.9393 - val_loss: 0.4545 - val_acc: 0.8875
Epoch 365/500
500/500 [==============================] - 77s 153ms/step - loss: 0.2734 - acc: 0.9407 - val_loss: 0.4742 - val_acc: 0.8820
Epoch 366/500
500/500 [==============================] - 77s 154ms/step - loss: 0.2745 - acc: 0.9391 - val_loss: 0.4537 - val_acc: 0.8912
Epoch 367/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2669 - acc: 0.9422 - val_loss: 0.4615 - val_acc: 0.8867
Epoch 368/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2719 - acc: 0.9407 - val_loss: 0.4636 - val_acc: 0.8891
Epoch 369/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2706 - acc: 0.9408 - val_loss: 0.4668 - val_acc: 0.8848
Epoch 370/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2714 - acc: 0.9404 - val_loss: 0.4527 - val_acc: 0.8901
Epoch 371/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2696 - acc: 0.9426 - val_loss: 0.4626 - val_acc: 0.8844
Epoch 372/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2662 - acc: 0.9430 - val_loss: 0.4587 - val_acc: 0.8889
Epoch 373/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2729 - acc: 0.9410 - val_loss: 0.4603 - val_acc: 0.8879
Epoch 374/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2692 - acc: 0.9422 - val_loss: 0.4587 - val_acc: 0.8905
Epoch 375/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2719 - acc: 0.9419 - val_loss: 0.4760 - val_acc: 0.8864
Epoch 376/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2727 - acc: 0.9401 - val_loss: 0.4500 - val_acc: 0.8895
Epoch 377/500
500/500 [==============================] - 76s 151ms/step - loss: 0.2681 - acc: 0.9432 - val_loss: 0.4561 - val_acc: 0.8927
Epoch 378/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2763 - acc: 0.9396 - val_loss: 0.4599 - val_acc: 0.8863
Epoch 379/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2682 - acc: 0.9413 - val_loss: 0.4728 - val_acc: 0.8849
Epoch 380/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2694 - acc: 0.9426 - val_loss: 0.4717 - val_acc: 0.8832
Epoch 381/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2710 - acc: 0.9400 - val_loss: 0.4568 - val_acc: 0.8858
Epoch 382/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2734 - acc: 0.9393 - val_loss: 0.4745 - val_acc: 0.8831
Epoch 383/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2681 - acc: 0.9428 - val_loss: 0.4760 - val_acc: 0.8845
Epoch 384/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2720 - acc: 0.9414 - val_loss: 0.4651 - val_acc: 0.8879
Epoch 385/500
500/500 [==============================] - 76s 151ms/step - loss: 0.2715 - acc: 0.9412 - val_loss: 0.4527 - val_acc: 0.8924
Epoch 386/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2662 - acc: 0.9441 - val_loss: 0.4607 - val_acc: 0.8876
Epoch 387/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2649 - acc: 0.9429 - val_loss: 0.4731 - val_acc: 0.8838
Epoch 388/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2720 - acc: 0.9407 - val_loss: 0.4683 - val_acc: 0.8842
Epoch 389/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2707 - acc: 0.9404 - val_loss: 0.4674 - val_acc: 0.8850
Epoch 390/500
500/500 [==============================] - 76s 153ms/step - loss: 0.2687 - acc: 0.9416 - val_loss: 0.4766 - val_acc: 0.8810
Epoch 391/500
500/500 [==============================] - 76s 152ms/step - loss: 0.2669 - acc: 0.9440 - val_loss: 0.4728 - val_acc: 0.8834
Epoch 392/500
500/500 [==============================] - 77s 153ms/step - loss: 0.2683 - acc: 0.9422 - val_loss: 0.4572 - val_acc: 0.8880
Epoch 393/500
500/500 [==============================] - 77s 154ms/step - loss: 0.2631 - acc: 0.9449 - val_loss: 0.4691 - val_acc: 0.8858
Epoch 394/500
500/500 [==============================] - 77s 154ms/step - loss: 0.2681 - acc: 0.9419 - val_loss: 0.4747 - val_acc: 0.8875
Epoch 395/500
500/500 [==============================] - 77s 154ms/step - loss: 0.2700 - acc: 0.9419 - val_loss: 0.4650 - val_acc: 0.8889
Epoch 396/500
500/500 [==============================] - 77s 153ms/step - loss: 0.2702 - acc: 0.9419 - val_loss: 0.4520 - val_acc: 0.8901
Epoch 397/500
500/500 [==============================] - 77s 154ms/step - loss: 0.2640 - acc: 0.9439 - val_loss: 0.4607 - val_acc: 0.8857
Epoch 398/500
500/500 [==============================] - 77s 154ms/step - loss: 0.2683 - acc: 0.9425 - val_loss: 0.4654 - val_acc: 0.8894
Epoch 399/500
500/500 [==============================] - 77s 154ms/step - loss: 0.2709 - acc: 0.9419 - val_loss: 0.4727 - val_acc: 0.8853
Epoch 400/500
500/500 [==============================] - 77s 153ms/step - loss: 0.2673 - acc: 0.9429 - val_loss: 0.4670 - val_acc: 0.8873
Epoch 401/500
lr changed to 0.0009999999776482583
500/500 [==============================] - 77s 154ms/step - loss: 0.2343 - acc: 0.9556 - val_loss: 0.4340 - val_acc: 0.8968
Epoch 402/500
500/500 [==============================] - 77s 154ms/step - loss: 0.2155 - acc: 0.9635 - val_loss: 0.4307 - val_acc: 0.9001
Epoch 403/500
500/500 [==============================] - 77s 154ms/step - loss: 0.2098 - acc: 0.9645 - val_loss: 0.4287 - val_acc: 0.8996
Epoch 404/500
500/500 [==============================] - 77s 153ms/step - loss: 0.2014 - acc: 0.9686 - val_loss: 0.4280 - val_acc: 0.9001
Epoch 405/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1992 - acc: 0.9681 - val_loss: 0.4285 - val_acc: 0.9006
Epoch 406/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1960 - acc: 0.9695 - val_loss: 0.4308 - val_acc: 0.9000
Epoch 407/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1946 - acc: 0.9697 - val_loss: 0.4326 - val_acc: 0.9011
Epoch 408/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1956 - acc: 0.9703 - val_loss: 0.4329 - val_acc: 0.9021
Epoch 409/500
500/500 [==============================] - 76s 153ms/step - loss: 0.1925 - acc: 0.9713 - val_loss: 0.4312 - val_acc: 0.9020
Epoch 410/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1875 - acc: 0.9720 - val_loss: 0.4347 - val_acc: 0.9021
Epoch 411/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1895 - acc: 0.9718 - val_loss: 0.4368 - val_acc: 0.9000
Epoch 412/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1856 - acc: 0.9722 - val_loss: 0.4390 - val_acc: 0.9012
Epoch 413/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1857 - acc: 0.9721 - val_loss: 0.4396 - val_acc: 0.9007
Epoch 414/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1842 - acc: 0.9730 - val_loss: 0.4406 - val_acc: 0.9002
Epoch 415/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1840 - acc: 0.9734 - val_loss: 0.4426 - val_acc: 0.9003
Epoch 416/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1822 - acc: 0.9738 - val_loss: 0.4447 - val_acc: 0.9009
Epoch 417/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1828 - acc: 0.9732 - val_loss: 0.4433 - val_acc: 0.8994
Epoch 418/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1826 - acc: 0.9735 - val_loss: 0.4407 - val_acc: 0.9006
Epoch 419/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1798 - acc: 0.9737 - val_loss: 0.4432 - val_acc: 0.9009
Epoch 420/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1800 - acc: 0.9738 - val_loss: 0.4415 - val_acc: 0.9016
Epoch 421/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1785 - acc: 0.9743 - val_loss: 0.4447 - val_acc: 0.9012
Epoch 422/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1792 - acc: 0.9738 - val_loss: 0.4467 - val_acc: 0.9008
Epoch 423/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1763 - acc: 0.9759 - val_loss: 0.4459 - val_acc: 0.9013
Epoch 424/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1795 - acc: 0.9735 - val_loss: 0.4501 - val_acc: 0.8997
Epoch 425/500
500/500 [==============================] - 76s 153ms/step - loss: 0.1767 - acc: 0.9744 - val_loss: 0.4469 - val_acc: 0.9004
Epoch 426/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1766 - acc: 0.9748 - val_loss: 0.4494 - val_acc: 0.9007
Epoch 427/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1762 - acc: 0.9748 - val_loss: 0.4534 - val_acc: 0.9001
Epoch 428/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1760 - acc: 0.9751 - val_loss: 0.4516 - val_acc: 0.9014
Epoch 429/500
500/500 [==============================] - 77s 155ms/step - loss: 0.1752 - acc: 0.9747 - val_loss: 0.4515 - val_acc: 0.8996
Epoch 430/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1764 - acc: 0.9747 - val_loss: 0.4529 - val_acc: 0.9010
Epoch 431/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1732 - acc: 0.9765 - val_loss: 0.4541 - val_acc: 0.8994
Epoch 432/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1720 - acc: 0.9764 - val_loss: 0.4530 - val_acc: 0.9000
Epoch 433/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1735 - acc: 0.9756 - val_loss: 0.4527 - val_acc: 0.9007
Epoch 434/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1723 - acc: 0.9755 - val_loss: 0.4558 - val_acc: 0.9000
Epoch 435/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1731 - acc: 0.9759 - val_loss: 0.4549 - val_acc: 0.9013
Epoch 436/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1703 - acc: 0.9764 - val_loss: 0.4560 - val_acc: 0.9017
Epoch 437/500
500/500 [==============================] - 77s 155ms/step - loss: 0.1714 - acc: 0.9754 - val_loss: 0.4557 - val_acc: 0.9014
Epoch 438/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1691 - acc: 0.9765 - val_loss: 0.4596 - val_acc: 0.8988
Epoch 439/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1700 - acc: 0.9761 - val_loss: 0.4613 - val_acc: 0.9006
Epoch 440/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1718 - acc: 0.9754 - val_loss: 0.4611 - val_acc: 0.9001
Epoch 441/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1704 - acc: 0.9758 - val_loss: 0.4616 - val_acc: 0.9017
Epoch 442/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1663 - acc: 0.9781 - val_loss: 0.4638 - val_acc: 0.8990
Epoch 443/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1697 - acc: 0.9759 - val_loss: 0.4635 - val_acc: 0.9007
Epoch 444/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1673 - acc: 0.9775 - val_loss: 0.4664 - val_acc: 0.8994
Epoch 445/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1649 - acc: 0.9779 - val_loss: 0.4651 - val_acc: 0.8991
Epoch 446/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1692 - acc: 0.9760 - val_loss: 0.4659 - val_acc: 0.8992
Epoch 447/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1678 - acc: 0.9764 - val_loss: 0.4637 - val_acc: 0.8997
Epoch 448/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1644 - acc: 0.9774 - val_loss: 0.4659 - val_acc: 0.8996
Epoch 449/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1634 - acc: 0.9783 - val_loss: 0.4628 - val_acc: 0.9002
Epoch 450/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1662 - acc: 0.9774 - val_loss: 0.4642 - val_acc: 0.9024
Epoch 451/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1649 - acc: 0.9767 - val_loss: 0.4647 - val_acc: 0.9020
Epoch 452/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1645 - acc: 0.9776 - val_loss: 0.4674 - val_acc: 0.8994
Epoch 453/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1646 - acc: 0.9772 - val_loss: 0.4650 - val_acc: 0.8999
Epoch 454/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1639 - acc: 0.9786 - val_loss: 0.4683 - val_acc: 0.8973
Epoch 455/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1626 - acc: 0.9778 - val_loss: 0.4665 - val_acc: 0.8997
Epoch 456/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1634 - acc: 0.9779 - val_loss: 0.4647 - val_acc: 0.8993
Epoch 457/500
500/500 [==============================] - 76s 153ms/step - loss: 0.1623 - acc: 0.9785 - val_loss: 0.4645 - val_acc: 0.8996
Epoch 458/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1616 - acc: 0.9780 - val_loss: 0.4654 - val_acc: 0.9007
Epoch 459/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1617 - acc: 0.9777 - val_loss: 0.4664 - val_acc: 0.8987
Epoch 460/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1623 - acc: 0.9777 - val_loss: 0.4652 - val_acc: 0.8989
Epoch 461/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1595 - acc: 0.9789 - val_loss: 0.4637 - val_acc: 0.8992
Epoch 462/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1609 - acc: 0.9789 - val_loss: 0.4675 - val_acc: 0.8967
Epoch 463/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1615 - acc: 0.9779 - val_loss: 0.4731 - val_acc: 0.8981
Epoch 464/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1612 - acc: 0.9778 - val_loss: 0.4656 - val_acc: 0.9017
Epoch 465/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1571 - acc: 0.9793 - val_loss: 0.4738 - val_acc: 0.9003
Epoch 466/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1606 - acc: 0.9773 - val_loss: 0.4741 - val_acc: 0.8996
Epoch 467/500
500/500 [==============================] - 76s 153ms/step - loss: 0.1591 - acc: 0.9794 - val_loss: 0.4749 - val_acc: 0.8988
Epoch 468/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1594 - acc: 0.9780 - val_loss: 0.4723 - val_acc: 0.8969
Epoch 469/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1591 - acc: 0.9786 - val_loss: 0.4748 - val_acc: 0.8981
Epoch 470/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1560 - acc: 0.9795 - val_loss: 0.4730 - val_acc: 0.8972
Epoch 471/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1574 - acc: 0.9791 - val_loss: 0.4760 - val_acc: 0.8975
Epoch 472/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1577 - acc: 0.9786 - val_loss: 0.4757 - val_acc: 0.8974
Epoch 473/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1543 - acc: 0.9799 - val_loss: 0.4787 - val_acc: 0.8955
Epoch 474/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1552 - acc: 0.9800 - val_loss: 0.4751 - val_acc: 0.8966
Epoch 475/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1579 - acc: 0.9778 - val_loss: 0.4761 - val_acc: 0.8954
Epoch 476/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1566 - acc: 0.9795 - val_loss: 0.4738 - val_acc: 0.8973
Epoch 477/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1552 - acc: 0.9795 - val_loss: 0.4787 - val_acc: 0.8966
Epoch 478/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1569 - acc: 0.9789 - val_loss: 0.4724 - val_acc: 0.8986
Epoch 479/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1544 - acc: 0.9796 - val_loss: 0.4722 - val_acc: 0.8991
Epoch 480/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1566 - acc: 0.9790 - val_loss: 0.4749 - val_acc: 0.8977
Epoch 481/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1539 - acc: 0.9797 - val_loss: 0.4756 - val_acc: 0.8982
Epoch 482/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1543 - acc: 0.9793 - val_loss: 0.4783 - val_acc: 0.8978
Epoch 483/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1546 - acc: 0.9793 - val_loss: 0.4776 - val_acc: 0.8973
Epoch 484/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1549 - acc: 0.9787 - val_loss: 0.4755 - val_acc: 0.8977
Epoch 485/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1534 - acc: 0.9786 - val_loss: 0.4774 - val_acc: 0.8976
Epoch 486/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1528 - acc: 0.9795 - val_loss: 0.4746 - val_acc: 0.8997
Epoch 487/500
500/500 [==============================] - 77s 154ms/step - loss: 0.1522 - acc: 0.9798 - val_loss: 0.4762 - val_acc: 0.8996
Epoch 488/500
500/500 [==============================] - 77s 153ms/step - loss: 0.1538 - acc: 0.9790 - val_loss: 0.4771 - val_acc: 0.8986
Epoch 489/500
277/500 [===============>..............] - ETA: 33s - loss: 0.1521 - acc: 0.9798 Traceback (most recent call last):

  File "C:\Users\hitwh\.spyder-py3\temp.py", line 153, in <module>
    verbose=1, callbacks=[reduce_lr], workers=4)

  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\legacy\interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)

  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\engine\training.py", line 1415, in fit_generator
    initial_epoch=initial_epoch)

  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\engine\training_generator.py", line 213, in fit_generator
    class_weight=class_weight)

  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\engine\training.py", line 1215, in train_on_batch
    outputs = self.train_function(ins)

  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\backend\tensorflow_backend.py", line 2666, in __call__
    return self._call(inputs)

  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\keras\backend\tensorflow_backend.py", line 2636, in _call
    fetched = self._callable_fn(*array_vals)

  File "C:\Users\hitwh\Anaconda3\envs\Initial\lib\site-packages\tensorflow\python\client\session.py", line 1382, in __call__
    run_metadata_ptr)

KeyboardInterrupt

This run was interrupted deliberately; even if all 500 epochs had finished, the result would likely not have matched the previous post (Tuning Log 3). Also, around epoch 122 the computer went to sleep, wasting more than ten thousand seconds.

Minghang Zhao, Shisheng Zhong, Xuyun Fu, Baoping Tang, Shaojiang Dong, Michael Pecht, Deep Residual Networks with Adaptively Parametric Rectifier Linear Units for Fault Diagnosis, IEEE Transactions on Industrial Electronics, 2020, DOI: 10.1109/TIE.2020.2972458

https://ieeexplore.ieee.org/document/8998530

————————————————

Copyright notice: this is an original article by CSDN blogger "dangqing1988", licensed under CC 4.0 BY-SA. Please include a link to the original source and this notice when reposting.

Original link: https://blog.csdn.net/dangqing1988/article/details/105610584

This article is a repost. If there is any infringement, please contact cloudcommunity@tencent.com for removal.
