I am trying to use the ReLU activation function with a PyTorch LSTM, but I get the error "'Tensor' object is not callable". Any guidance or help? Can I use a different activation function in the forward pass? I am using the same activation function for the hidden layer and in the forward pass.
class LSTM(nn.Module):
    def __init__(self, input_size=1, hidden_layer_size=20, output_size=1):
        super().__init__()
        self.hidden_layer_size = hidden_layer_size
        self.lstm = nn.LSTM(input_size, hidden_layer_size)
        self.relu = nn.functional.relu(torch.FloatTensor(hidden_layer_size), torch.FloatTensor(output_size))
        self.hidden_cell = (torch.zeros(1, 1, self.hidden_layer_size),
                            torch.zeros(1, 1, self.hidden_layer_size))

    def forward(self, input_seq):
        lstm_out, self.hidden_cell = self.lstm(input_seq.view(len(input_seq), 1, -1), self.hidden_cell)
        predictions = self.relu(lstm_out.view(len(input_seq), -1))
        return predictions[-1]
model = LSTM()
loss_function = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.0001)

epochs = 150
for i in range(epochs):
    for seq, labels in train_inout_seq:
        optimizer.zero_grad()
        model.hidden_cell = (torch.zeros(1, 1, model.hidden_layer_size),
                             torch.zeros(1, 1, model.hidden_layer_size))
        y_pred = model(seq)
        single_loss = loss_function(y_pred, labels)
        single_loss.backward()
        optimizer.step()
    if i%25 == 1:
        print(f'epoch: {i:3} loss: {single_loss.item():10.8f}')
print(f'epoch: {i:3} loss: {single_loss.item():10.10f}')

After that, I get the following error:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-108-5fcfb471ed9a> in <module>
6 model.hidden_cell = (torch.zeros(1, 1, model.hidden_layer_size),
7 torch.zeros(1, 1, model.hidden_layer_size))
----> 8 y_pred = model(seq)
9
10 single_loss = loss_function(y_pred, labels)
/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
548 result = self._slow_forward(*input, **kwargs)
549 else:
--> 550 result = self.forward(*input, **kwargs)
551 for hook in self._forward_hooks.values():
552 hook_result = hook(self, input, result)
<ipython-input-105-221892d3f487> in forward(self, input_seq)
12 def forward(self, input_seq):
13 lstm_out, self.hidden_cell = self.lstm(input_seq.view(len(input_seq), 1, -1), self.hidden_cell)
---> 14 predictions = self.relu(lstm_out.view(len(input_seq), -1))
15 return predictions[-1]
TypeError: 'Tensor' object is not callable

Posted on 2021-03-26 16:06:58
The line

self.relu = nn.functional.relu(torch.FloatTensor(hidden_layer_size), torch.FloatTensor(output_size))

does not define a ReLU function for later use. Instead, it applies the ReLU function to an arbitrary tensor (namely torch.FloatTensor(hidden_layer_size)) right away and stores the resulting tensor! So self.relu is not a function but a Tensor, hence the error.
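The distinction is easy to see in isolation (the tensor here is arbitrary, just for demonstration):

```python
import torch
import torch.nn.functional as F

x = torch.randn(4)
result = F.relu(x)   # applies ReLU immediately and returns a Tensor
print(type(result))  # <class 'torch.Tensor'>

# A Tensor has no __call__, so "calling" it reproduces the error:
try:
    result(x)
except TypeError as e:
    print(e)  # 'Tensor' object is not callable
```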
One remedy is to replace that line with:

self.relu = nn.ReLU()

This gives you an nn.ReLU instance, which you can use as a ReLU function, exactly as your forward method does by calling it.
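For illustration, here is a sketch of the model from the question with nn.ReLU() substituted for the broken line; the dummy 5-step input at the bottom is only there to show the forward pass succeeding:

```python
import torch
import torch.nn as nn

class LSTM(nn.Module):
    def __init__(self, input_size=1, hidden_layer_size=20, output_size=1):
        super().__init__()
        self.hidden_layer_size = hidden_layer_size
        self.lstm = nn.LSTM(input_size, hidden_layer_size)
        self.relu = nn.ReLU()  # a module instance, callable in forward()
        self.hidden_cell = (torch.zeros(1, 1, hidden_layer_size),
                            torch.zeros(1, 1, hidden_layer_size))

    def forward(self, input_seq):
        lstm_out, self.hidden_cell = self.lstm(
            input_seq.view(len(input_seq), 1, -1), self.hidden_cell)
        predictions = self.relu(lstm_out.view(len(input_seq), -1))
        return predictions[-1]

model = LSTM()
out = model(torch.randn(5))  # dummy sequence of length 5
print(out.shape)             # torch.Size([20])
```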
Another solution is not to define self.relu at all and instead use the following line in forward:

predictions = nn.functional.relu(lstm_out.view(len(input_seq), -1))

This is the difference between the modular and the functional approach.
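Since nn.functional.relu (usually imported as F.relu) is stateless, it can be applied directly to any tensor with no setup in __init__; a minimal standalone illustration:

```python
import torch
import torch.nn.functional as F

# Inside forward() this would be:
#   predictions = F.relu(lstm_out.view(len(input_seq), -1))
# The call itself simply clamps negatives to zero:
x = torch.tensor([-1.0, 0.0, 2.0])
print(F.relu(x))  # tensor([0., 0., 2.])
```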
For some reasons to prefer one over the other, see: https://discuss.pytorch.org/t/whats-the-difference-between-nn-relu-vs-f-relu/27599
https://stackoverflow.com/questions/66812345