Unverified commit e9ca7600, authored by F feng_shuai, committed by GitHub

feat: add support for vit_attention_op on GPU (#48515)

Parent 5de01e8a
...
@@ -216,6 +216,7 @@ GpuPassStrategy::GpuPassStrategy() : PassStrategy({}) {
        "conv_eltwiseadd_bn_fuse_pass",                   //
        "embedding_eltwise_layernorm_fuse_pass",          //
        "multihead_matmul_fuse_pass_v2",                  //
+       "vit_attention_fuse_pass",                        //
        "fused_multi_transformer_encoder_pass",           //
        "fused_multi_transformer_decoder_pass",           //
        "fused_multi_transformer_encoder_fuse_qkv_pass",  //
...
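With this change, vit_attention_fuse_pass is part of GpuPassStrategy, so it runs automatically whenever GPU inference with IR optimization is enabled and can fuse ViT-style attention subgraphs into the vit_attention op. Below is a minimal sketch of how the pass would be exercised through the Paddle Inference C++ API; the model and params file names are hypothetical placeholders, not part of this commit.

```cpp
#include "paddle_inference_api.h"

int main() {
  paddle_infer::Config config;
  // Hypothetical paths to an exported ViT inference model.
  config.SetModel("vit_model.pdmodel", "vit_model.pdiparams");
  // Enabling GPU selects GpuPassStrategy, which now includes vit_attention_fuse_pass.
  config.EnableUseGpu(256 /* initial memory pool size in MB */, 0 /* device id */);
  config.SwitchIrOptim(true);  // keep IR optimization on so the fuse passes run

  // To opt out of the new fusion, the pass can be removed explicitly:
  // config.pass_builder()->DeletePass("vit_attention_fuse_pass");

  auto predictor = paddle_infer::CreatePredictor(config);
  return predictor != nullptr ? 0 : 1;
}
```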