Unverified commit dfe6d8fa authored by zyfncg, committed by GitHub

Fix inference performance problem caused by selecting cudnn kernel of softmax (#47338)

* fix inference performance problem caused by selecting cudnn kernel for softmax

* restore use_cudnn in the OpMaker of softmax
Parent f7616d71
......@@ -83,6 +83,11 @@ class SoftmaxOpMaker : public framework::OpProtoAndCheckerMaker {
"Defaults to \"NHWC\". Specify the data format of the output data, "
"the input will be transformed automatically. ")
.SetDefault("AnyLayout");
AddAttr<bool>(
"use_cudnn",
"(bool, default false) Only used in cudnn kernel, need install cudnn")
.SetDefault(false)
.AsExtra();
AddComment(R"DOC(
Softmax Operator.
......
......@@ -686,7 +686,7 @@
- op : softmax
backward : softmax_grad
extra :
attrs : [bool use_cudnn = false, bool use_mkldnn = false, str mkldnn_data_type = "float32", bool is_test = false]
attrs : [bool use_mkldnn = false, str mkldnn_data_type = "float32", bool is_test = false]
- op : softplus
backward : softplus_grad
......
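For context, the diff registers `use_cudnn` (default `false`) directly in `SoftmaxOpMaker` as an extra attribute and removes it from the softmax entry in `op_compat.yaml`, which appears to restore attribute-driven control over whether the cuDNN softmax kernel is selected during inference. The sketch below is illustrative only: it builds a static-graph softmax op and inspects the `use_cudnn` attribute, assuming a Paddle build in which the attribute is registered as shown in the hunk above.

```python
# Illustrative sketch: inspect the "use_cudnn" attribute on a static-graph
# softmax op. Assumes a Paddle build where SoftmaxOpMaker registers the
# attribute with a default of False, as in the diff above.
import paddle

paddle.enable_static()

main_prog = paddle.static.Program()
with paddle.static.program_guard(main_prog):
    x = paddle.static.data(name="x", shape=[8, 10], dtype="float32")
    y = paddle.nn.functional.softmax(x)

# Find the softmax op in the program and read the attribute if present.
for op in main_prog.global_block().ops:
    if op.type == "softmax":
        if op.has_attr("use_cudnn"):
            print("use_cudnn =", op.attr("use_cudnn"))  # expected default: False
        else:
            print("use_cudnn is not registered on this build")
```

Keeping the default in the OpMaker presumably means saved inference programs carry a concrete value for the attribute instead of relying on the runtime extra-attribute lookup, which matches the commit's description of the performance fix.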