
[Implementing a Convolutional Neural Network in Python] Activation Layer Implementation

Code source: https://github.com/eriklindernoren/ML-From-Scratch

Implementation of the Conv2D convolution layer (with stride and padding): https://cloud.tencent.com/developer/article/1686529

Implementation of activation functions (sigmoid, softmax, tanh, relu, leakyrelu, elu, selu, softplus): https://cloud.tencent.com/developer/article/1686496

Loss function definitions (mean squared error, cross-entropy loss): https://cloud.tencent.com/developer/article/1686498

Optimizer implementations (SGD, Nesterov, Adagrad, Adadelta, RMSprop, Adam): https://cloud.tencent.com/developer/article/1686499

Backpropagation through the convolution layer: https://cloud.tencent.com/developer/article/1686503

Fully connected layer implementation: https://cloud.tencent.com/developer/article/1686504

Batch normalization layer implementation: https://cloud.tencent.com/developer/article/1686506

Pooling layer implementation: https://cloud.tencent.com/developer/article/1686507

Padding2D implementation: https://cloud.tencent.com/developer/article/1686509

Flatten layer implementation: https://cloud.tencent.com/developer/article/1686511

Upsampling layer UpSampling2D implementation: https://cloud.tencent.com/developer/article/1686515

Dropout layer implementation: https://cloud.tencent.com/developer/article/1686520

The forward and backward computations for each activation function were already defined in an earlier post; here we only need to wrap them in a layer class.

activation_functions = {
    'relu': ReLU,
    'sigmoid': Sigmoid,
    'selu': SELU,
    'elu': ELU,
    'softmax': Softmax,
    'leaky_relu': LeakyReLU,
    'tanh': TanH,
    'softplus': SoftPlus
}

class Activation(Layer):
    """A layer that applies an activation operation to the input.

    Parameters:
    -----------
    name: string
        The name of the activation function that will be used.
    """

    def __init__(self, name):
        self.activation_name = name
        # Look up the activation class by name and instantiate it.
        self.activation_func = activation_functions[name]()
        self.trainable = True

    def layer_name(self):
        return "Activation (%s)" % (self.activation_func.__class__.__name__)

    def forward_pass(self, X, training=True):
        # Cache the input; it is needed to evaluate the derivative in the backward pass.
        self.layer_input = X
        return self.activation_func(X)

    def backward_pass(self, accum_grad):
        # Chain rule: scale the accumulated gradient by the activation's
        # derivative evaluated at the cached input.
        return accum_grad * self.activation_func.gradient(self.layer_input)

    def output_shape(self):
        # Activations are element-wise, so the output shape equals the input shape.
        return self.input_shape
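To see the layer in action, here is a minimal runnable sketch. The `Layer` base class and the `ReLU` activation below are simplified stand-ins for the repo's versions (assumptions for illustration only); the `Activation` wrapper follows the same pattern as the class above.

```python
import numpy as np

class Layer:
    """Simplified stand-in for the repo's Layer base class."""
    pass

class ReLU:
    """Simplified ReLU: callable forward, plus its derivative."""
    def __call__(self, x):
        return np.where(x >= 0, x, 0)
    def gradient(self, x):
        return np.where(x >= 0, 1, 0)

activation_functions = {'relu': ReLU}

class Activation(Layer):
    def __init__(self, name):
        self.activation_func = activation_functions[name]()
    def forward_pass(self, X, training=True):
        self.layer_input = X          # cache input for the backward pass
        return self.activation_func(X)
    def backward_pass(self, accum_grad):
        # element-wise chain rule
        return accum_grad * self.activation_func.gradient(self.layer_input)

layer = Activation('relu')
X = np.array([[-2.0, 3.0], [0.5, -1.0]])
out = layer.forward_pass(X)                   # negatives are zeroed
grad = layer.backward_pass(np.ones_like(X))   # gradient flows only where X >= 0
print(out)   # [[0.  3. ] [0.5 0. ]]
print(grad)  # [[0. 1.] [1. 0.]]
```

Because the activation is element-wise, the backward pass is just the incoming gradient multiplied element-wise by the activation's derivative at the cached input; no weights are updated.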