Commit 025feb6d authored by gaotingquan, committed by Tingquan Gao

fix: fix -1 in dims

Parent 7a0ed6f0
@@ -147,7 +147,8 @@ class Attention(nn.Layer):
                  ]).transpose([2, 0, 3, 1, 4])
         else:
             x_ = x.transpose([0, 2, 1]).reshape([B, C, H, W])
-            x_ = self.sr(self.pool(x_)).reshape([B, C, -1]).transpose(
+            x_ = self.sr(self.pool(x_))
+            x_ = x_.reshape([B, C, x_.shape[2] * x_.shape[3]]).transpose(
                 [0, 2, 1])
             x_ = self.norm(x_)
             x_ = self.act(x_)
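The change replaces the -1 placeholder in reshape with an explicit product of the tensor's own spatial dims, so the flattened token length stays a concrete value instead of being inferred (which, presumably, helps shape inference in scenarios such as static-graph export). Below is a minimal sketch of the before/after behaviour; it assumes PaddlePaddle, and the pool/sr layers, sizes, and names are stand-ins for illustration, not the repository's actual configuration.

    # Minimal sketch, not repository code: shows the -1 reshape vs. the
    # explicit-shape reshape used after this commit. All sizes are assumptions.
    import paddle
    import paddle.nn as nn

    B, C, H, W = 2, 64, 14, 14           # assumed batch/channel/spatial sizes
    pool = nn.AdaptiveAvgPool2D(7)       # stand-in for self.pool
    sr = nn.Conv2D(C, C, kernel_size=1)  # stand-in for self.sr

    x_ = paddle.randn([B, C, H, W])
    x_ = sr(pool(x_))                    # shape [B, C, 7, 7]

    # Before the fix: -1 asks reshape to infer the flattened spatial length.
    tokens_implicit = x_.reshape([B, C, -1]).transpose([0, 2, 1])

    # After the fix: the spatial length is computed from the tensor's own
    # shape, keeping the dimension concrete rather than inferred.
    tokens_explicit = x_.reshape([B, C, x_.shape[2] * x_.shape[3]]).transpose(
        [0, 2, 1])

    print(tokens_implicit.shape, tokens_explicit.shape)  # both [2, 49, 64]

In dynamic mode both variants yield the same [B, spatial, C] token layout; the explicit form only changes how the flattened dimension is obtained.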