Unverified · Commit a049dff7, authored by GaoWei8, committed by GitHub

Modify the default setting of softmax cudnn (#28672)

Parent fafadbab
@@ -1198,7 +1198,7 @@ def chunk_eval(input,
 @deprecated(since="2.0.0", update_to="paddle.nn.functional.softmax")
-def softmax(input, use_cudnn=False, name=None, axis=-1):
+def softmax(input, use_cudnn=True, name=None, axis=-1):
     r"""
     This operator implements the softmax layer. The calculation process is as follows:
......
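This hunk flips the `use_cudnn` default of the deprecated `paddle.fluid.layers.softmax` from `False` to `True`. A minimal sketch of what the new default means for a caller (the variable names and static-graph setup are illustrative, not part of the commit), assuming a Paddle build with CUDA and cuDNN:

```python
import paddle
import paddle.fluid as fluid

paddle.enable_static()

# With this commit, omitting use_cudnn is equivalent to use_cudnn=True,
# so the cuDNN softmax kernel is selected on CUDA devices by default.
x = fluid.data(name='x', shape=[None, 10], dtype='float32')
out = fluid.layers.softmax(x)                        # same as use_cudnn=True
out_ref = fluid.layers.softmax(x, use_cudnn=False)   # opt out explicitly
```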
@@ -843,7 +843,7 @@ def softmax(x, axis=-1, dtype=None, name=None):
     if (dtype is not None) and (not isinstance(dtype, core.VarDesc.VarType)):
         dtype = convert_np_dtype_to_dtype_(dtype)
-    use_cudnn = True if axis is -1 else False
+    use_cudnn = True
     if in_dygraph_mode():
         outs_cast = x if dtype is None \
......
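The second hunk makes `paddle.nn.functional.softmax` pick the cuDNN path unconditionally instead of only when `axis` is -1. A minimal dygraph sketch of the caller-visible behavior (shapes and tensor names are illustrative assumptions, not from the commit), again assuming a CUDA/cuDNN build:

```python
import paddle
import paddle.nn.functional as F

x = paddle.rand([2, 3, 4], dtype='float32')

# Before this commit only axis=-1 routed to cuDNN; now both calls do
# when running on a CUDA device, with no API change for the caller.
y_last = F.softmax(x, axis=-1)
y_mid = F.softmax(x, axis=1)

print(y_last.shape, y_mid.shape)  # [2, 3, 4] [2, 3, 4]
```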