Unverified · Commit 30915137 · authored by Guanghua Yu · committed by GitHub

fix distill error (#6906)

Parent 65dd2346
@@ -262,7 +262,7 @@ class FGDFeatureLoss(nn.Layer):
         zeros_init = parameter_init("constant", 0.0)
         if student_channels != teacher_channels:
-            self.align = nn.Conv2d(
+            self.align = nn.Conv2D(
                 student_channels,
                 teacher_channels,
                 kernel_size=1,
...
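For context, a minimal sketch of what the corrected channel-alignment layer does after this fix. Paddle's layer class is `paddle.nn.Conv2D` (capital D); `nn.Conv2d` does not exist in `paddle.nn`, which is what this commit repairs. The channel counts, the `stride`/`padding` arguments, and the standalone usage below are illustrative assumptions, not copied from the truncated diff or the full FGDFeatureLoss source.

```python
import paddle
import paddle.nn as nn

# Hypothetical channel widths for illustration only.
student_channels, teacher_channels = 256, 512

# The fixed line: Paddle uses nn.Conv2D, not nn.Conv2d.
# stride/padding are assumed defaults; the diff is truncated after kernel_size.
if student_channels != teacher_channels:
    align = nn.Conv2D(
        student_channels,
        teacher_channels,
        kernel_size=1,
        stride=1,
        padding=0)
else:
    align = None

# Project a student feature map to the teacher's channel width before
# computing the FGD feature loss.
feat_s = paddle.randn([2, student_channels, 32, 32])
feat_s_aligned = align(feat_s) if align is not None else feat_s
print(feat_s_aligned.shape)  # [2, 512, 32, 32]
```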