Unverified commit a1373714, authored by: W WangXi, committed by: GitHub

NPU use squared_l2_norm in GradientClipByGlobalNorm (#34836)

Parent 12bf046b
```diff
@@ -40,7 +40,7 @@ def _squared_l2_norm(x):
     This OP returns the squared L2 norm of a tensor.
     """
-    if core.is_compiled_with_npu() or core.is_compiled_with_xpu():
+    if core.is_compiled_with_xpu():
         square = layers.square(x)
         sum_square = layers.reduce_sum(square)
         return sum_square
```
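For context, the branch removed above is a fallback that computes the squared L2 norm as an elementwise square followed by a sum; after this change, NPU builds skip the fallback and use the fused `squared_l2_norm` op like other devices. A minimal NumPy sketch (illustrative only, not PaddlePaddle code) of what both paths compute:

```python
import numpy as np

def squared_l2_norm_fallback(x):
    # Mirrors the removed fallback: layers.square then layers.reduce_sum,
    # which equals the squared L2 norm ||x||_2 ** 2.
    square = np.square(x)
    return np.sum(square)

x = np.array([3.0, 4.0])
print(squared_l2_norm_fallback(x))  # 25.0, i.e. 3^2 + 4^2
```

The fused op avoids materializing the intermediate `square` tensor, which is why GradientClipByGlobalNorm prefers it where the device supports it.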