
Using PReLU in TensorFlow

Stack Overflow user
Asked 2022-09-19 12:26:16
1 answer · 66 views · 0 followers · 0 votes

I am building a reinforcement learning model. I am trying to use PReLU in my 2D conv model in TensorFlow. Below is the code for the actor model.

Code:

from tensorflow.keras.layers import Conv2D, Input, MaxPool1D, concatenate, Lambda, Dense, Flatten
import tensorflow as tf

# activation = tf.keras.layers.LeakyReLU(alpha=0.5)
activation =   tf.keras.layers.PReLU(alpha_initializer=tf.initializers.constant(0.25))

def ActorNetwork(input_shape_A,input_shape_B, n_actions):
    input_layer_A = Input(shape=input_shape_A[1:], name="input_layer_A")
    input_layer_B = Input(shape=input_shape_B[1:], name="input_layer_B")

    Rescale = Lambda(lambda x: tf.divide(tf.subtract(x, tf.reduce_max(x)), tf.subtract(tf.reduce_max(x), tf.reduce_min(x))))(input_layer_A)

    Conv1 = Conv2D(32, 3, activation= activation, padding='same', name="Conv1")(Rescale)
    Conv2 = Conv2D(32, 3, activation=activation, padding='same', name="Conv2")(Conv1)
    Conv_pool_1 = Conv2D(32, 2, strides=2, activation='relu', padding='same', name="Conv_pool_1")(Conv2)

    Batchnorm_1 = tf.keras.layers.BatchNormalization(name='Batchnorm_1')(Conv_pool_1)
  
    Conv3 = Conv2D(32, 3, activation= activation, padding='same', name="Conv3")(Batchnorm_1)
    Conv4 = Conv2D(32, 3, activation=activation, padding='same', name="Conv4")(Conv3)
    Conv_pool_2 = Conv2D(32, 2, strides=2, activation='relu', padding='same', name="Conv_pool_2")(Conv4)

    Batchnorm_2 = tf.keras.layers.BatchNormalization(name='Batchnorm_2')(Conv_pool_2)
  

    Conv5 = Conv2D(64, 3, activation= activation, padding='same', name="Conv5")(Batchnorm_2)
    Conv6 = Conv2D(64, 3, activation=activation, padding='same', name="Conv6")(Conv5)
    Conv_pool_3 = Conv2D(64, 2, strides=2, activation='relu', padding='same', name="Conv_pool_3")(Conv6)

    Batchnorm_3 = tf.keras.layers.BatchNormalization(name='Batchnorm_3')(Conv_pool_3)
  

    Conv7 = Conv2D(64, 3, activation= activation, padding='same', name="Conv7")(Batchnorm_3)
    Conv8 = Conv2D(64, 3, activation=activation, padding='same', name="Conv8")(Conv7)
    Conv_pool_4 = Conv2D(64, 2, strides=2, activation='relu', padding='same', name="Conv_pool_4")(Conv8)

    Batchnorm_4 = tf.keras.layers.BatchNormalization(name='Batchnorm_4')(Conv_pool_4)
  
    Conv9 = Conv2D(128, 3, activation= activation, padding='same', name="Conv9")(Batchnorm_4)
    Conv10 = Conv2D(128, 3, activation=activation, padding='same', name="Conv10")(Conv9)
    Conv_pool_5 = Conv2D(128, 2, strides=2, activation='relu', padding='same', name="Conv_pool_5")(Conv10)

    Batchnorm_5 = tf.keras.layers.BatchNormalization(name='Batchnorm_5')(Conv_pool_5)
  
    Conv11 = Conv2D(128, 3, activation= activation, padding='same', name="Conv11")(Batchnorm_5)
    Conv12 = Conv2D(128, 3, activation=activation, padding='same', name="Conv12")(Conv11)
    Conv_pool_6 = Conv2D(128, 2, strides=2, activation='relu', padding='same', name="Conv_pool_6")(Conv12)

    Batchnorm_6 = tf.keras.layers.BatchNormalization(name='Batchnorm_6')(Conv_pool_6)
  

    Conv_pool_7 = Conv2D(128, 1, strides=1, activation='relu', padding='same', name="Conv_pool_7")(Batchnorm_6)
    Conv_pool_8 = Conv2D(64, 1, strides=1, activation='relu', padding='same', name="Conv_pool_8")(Conv_pool_7)
    Conv_pool_9 = Conv2D(32, 1, strides=1, activation='relu', padding='same', name="Conv_pool_9")(Conv_pool_8)


    flatten = Flatten()(Conv_pool_9)
    

    Concat_2 = tf.keras.layers.concatenate([flatten, input_layer_B], axis=-1,name='Concat_2')


    fc1 = Dense(8194, activation='relu', name="fc1")(Concat_2)
    fc2 = Dense(4096, activation='relu', name="fc2")(fc1)
    fc3 = Dense(n_actions, activation='softmax', name="fc3")(fc2)

    return tf.keras.models.Model(inputs=[input_layer_A,input_layer_B], outputs = fc3, name="actor_model")


model=ActorNetwork((1,1000,4000,1),(1,2),3)
model.compile()
model.summary()
print(model([tf.random.uniform((1,1000,4000,1)),tf.random.uniform((1,2))]))
tf.keras.utils.plot_model(model, show_shapes=True)

It works fine with LeakyReLU, but when I use PReLU it throws a dimension-related error that I don't understand.

Error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-17-0a596da4bc68> in <module>
    131 
    132 
--> 133 model=ActorNetwork((1,1000,4000,1),(1,2),3)
    134 model.compile()
    135 model.summary()

2 frames
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/ops.py in _create_c_op(graph, node_def, inputs, control_inputs, op_def)
   2011   except errors.InvalidArgumentError as e:
   2012     # Convert to ValueError for backwards compatibility.
-> 2013     raise ValueError(e.message)
   2014 
   2015   return c_op

ValueError: Exception encountered when calling layer "p_re_lu_10" (type PReLU).

Dimensions must be equal, but are 1000 and 500 for '{{node Conv3/p_re_lu_10/mul}} = Mul[T=DT_FLOAT](Conv3/p_re_lu_10/Neg, Conv3/p_re_lu_10/Relu_1)' with input shapes: [1000,4000,32], [?,500,2000,32].

Call arguments received:
  • inputs=tf.Tensor(shape=(None, 500, 2000, 32), dtype=float32)

What am I doing wrong here?


1 Answer

Stack Overflow user

Accepted answer

Answered 2022-09-19 13:02:52

The PReLU activation function maintains a learnable parameter alpha with the same shape as the input of the function. You can read more in the documentation. Because alpha is created with the shape of the first input the layer sees, sharing one PReLU instance across convolutions whose outputs have different spatial sizes produces the dimension mismatch shown above.
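To illustrate (a minimal sketch, not part of the original answer): once a PReLU layer is built, its alpha weight takes the full non-batch shape of the input, spatial dimensions included, so it cannot be reused on a differently sized input.

```python
import tensorflow as tf

# Build a PReLU layer on a small 8x8x32 input; alpha is created with
# the same non-batch shape as that first input.
prelu = tf.keras.layers.PReLU(alpha_initializer=tf.initializers.constant(0.25))
_ = prelu(tf.zeros((1, 8, 8, 32)))
print(prelu.alpha.shape)  # (8, 8, 32) -- tied to the spatial size too
```

Calling this same `prelu` instance on, say, a `(1, 4, 4, 32)` tensor would then fail with a shape mismatch, which is exactly what happens in the model above after each stride-2 convolution halves the spatial dimensions.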

You need to define a new PReLU layer each time you want to use that activation.

For example:

Conv1 = Conv2D(32, 3, activation=None, padding='same', name="Conv1")(Rescale)
Conv1_p_relu = tf.keras.layers.PReLU(alpha_initializer=tf.initializers.constant(0.25))(Conv1)
Conv2 = Conv2D(32, 3, activation=None, padding='same', name="Conv2")(Conv1_p_relu)
Conv2_p_relu = tf.keras.layers.PReLU(alpha_initializer=tf.initializers.constant(0.25))(Conv2)
Conv_pool_1 = Conv2D(32, 2, strides=2, activation='relu', padding='same', name="Conv_pool_1")(Conv2_p_relu)
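As a side note (a sketch, not from the accepted answer): Keras's PReLU also accepts a `shared_axes` argument. Sharing alpha across the spatial axes gives one alpha per channel, so the layer's weights no longer depend on the input's height and width, and the parameter count stays small:

```python
import tensorflow as tf
from tensorflow.keras.layers import Conv2D, Input, PReLU

# shared_axes=[1, 2] ties alpha across height and width, leaving one
# learnable alpha per channel. A fresh layer is still needed per use.
inp = Input(shape=(None, None, 1))
x = Conv2D(32, 3, padding='same')(inp)
x = PReLU(alpha_initializer=tf.initializers.constant(0.25),
          shared_axes=[1, 2])(x)  # alpha shape: (1, 1, 32)
model = tf.keras.Model(inp, x)
print(model.layers[-1].alpha.shape)  # (1, 1, 32)
```

With per-channel sharing, the model above would also avoid allocating a separate alpha for every pixel of the 1000x4000 input.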
1 vote
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/73773318
