Unverified commit 66ac2594, authored by ShenLiang, committed by GitHub

fix bug of sync buffer (#51895)

Parent 72c711bb
@@ -166,8 +166,9 @@ def sync_params_buffers(
         # is_distributed param not need to sync when in mp mode
         if isinstance(param, (ParamBase, core.eager.Tensor)):
-            if is_model_parallel and param.is_distributed:
-                continue
+            if is_model_parallel:
+                if hasattr(param, "is_distributed") and param.is_distributed:
+                    continue
 
         # NOTE(shenliang03): Support situations that do not require synchronization parameters,
         # such as moe's expert parameters
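The old code read `param.is_distributed` whenever `is_model_parallel` was true; for an object that does not expose that attribute (the commit title suggests synced buffers triggered this), the access would raise `AttributeError`. The patch adds a `hasattr` guard so such objects fall through to the normal sync path instead. Below is a minimal, self-contained sketch of the guarded check; `FakeParam` and `collect_params_to_sync` are hypothetical stand-ins for illustration, not PaddlePaddle APIs.

```python
class FakeParam:
    """Stand-in for a parameter/buffer tensor; may or may not carry is_distributed."""

    def __init__(self, name, is_distributed=None):
        self.name = name
        if is_distributed is not None:
            self.is_distributed = is_distributed


def collect_params_to_sync(params, is_model_parallel):
    """Return the parameters that still need broadcasting.

    Mirrors the patched logic: under model parallelism, skip a parameter
    only when it actually exposes is_distributed and that flag is True.
    Objects without the attribute no longer raise AttributeError.
    """
    to_sync = []
    for param in params:
        if is_model_parallel:
            if hasattr(param, "is_distributed") and param.is_distributed:
                continue  # tensor-parallel shard: each rank keeps its own copy
        to_sync.append(param)
    return to_sync


params = [
    FakeParam("embedding.w", is_distributed=True),   # sharded -> skipped
    FakeParam("norm.bias", is_distributed=False),    # replicated -> synced
    FakeParam("buffer.mean"),                        # no attribute: pre-fix code crashed here
]
print([p.name for p in collect_params_to_sync(params, is_model_parallel=True)])
# ['norm.bias', 'buffer.mean']
```

With the pre-fix condition `is_model_parallel and param.is_distributed`, the third entry would raise `AttributeError` under model parallelism; the nested `hasattr` check makes the skip logic opt-in rather than assumed.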