Unverified commit 51529712, authored by Ghost Screaming, committed by GitHub

Fix bug of hybrid_parallel_optimizer: AMP uses scaler.minimize(), (#53773)

but minimize() cannot handle a parameter_list given as param-group dicts.
Parent a822a084
@@ -432,7 +432,9 @@ class HybridParallelOptimizer:
         # minimize does not support parameters in the form of param_group,
         # so no need use _obtain_optimizer_parameters_list
         parameter_list = (
-            parameters if parameters else self._inner_opt._parameter_list
+            parameters
+            if parameters
+            else _obtain_optimizer_parameters_list(self._inner_opt)
         )
         # Here sharding should use global parameter list
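The fix swaps direct access to `self._inner_opt._parameter_list` for `_obtain_optimizer_parameters_list`, because an optimizer's parameter list may be either a flat list of parameters or a list of param-group dicts (each with a `"params"` key), and downstream code such as `scaler.minimize()` expects a flat list. Paddle's actual helper is not shown in this diff; the sketch below is a hypothetical, simplified stand-in (the name `flatten_parameters_list` is invented for illustration) showing the flattening behavior such a helper would need:

```python
def flatten_parameters_list(parameter_list):
    """Hypothetical sketch: normalize an optimizer parameter list.

    Accepts either a flat list of parameters or a list of param-group
    dicts like [{"params": [...], "learning_rate": ...}, ...] and
    returns a single flat list of parameters in order.
    """
    if parameter_list and isinstance(parameter_list[0], dict):
        flat = []
        for group in parameter_list:
            # Each param-group dict carries its parameters under "params".
            flat.extend(group["params"])
        return flat
    # Already flat: return a shallow copy.
    return list(parameter_list)
```

With a helper like this, both optimizer configurations reduce to the same flat list, so code that iterates over parameters (e.g. for gradient scaling or sharding) does not need to special-case param groups.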