[X86][CUDA] add attention_padding_mask op, x86 kernel, cuda kernel and unit tests (#2437)
* [X86] add attention_padding_mask op, x86 kernel and unit test. test=develop
* [CUDA] add attention_padding_mask cuda kernel and unit test. test=develop
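For context, a minimal sketch of what a CUDA attention-padding-mask kernel can look like, assuming the op replaces attention scores at padded source positions with a large negative value so a following softmax assigns them near-zero weight. The kernel name and the `pad_id` / `mask_value` parameters below are illustrative only and do not reflect the actual Paddle-Lite op attributes or kernel signature.

```cuda
// Hypothetical sketch, not the Paddle-Lite implementation: one thread per
// score element; positions whose source token equals pad_id get mask_value.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void attention_padding_mask_sketch(const float* attn,      // [rows, seq_len] attention scores
                                              const int64_t* src_ids, // [rows, seq_len] source token ids
                                              float* out,             // masked scores, same shape
                                              int total,              // rows * seq_len
                                              int64_t pad_id, float mask_value) {
  int idx = blockIdx.x * blockDim.x + threadIdx.x;
  if (idx >= total) return;
  // Scores attending to padding tokens are overwritten with a large negative
  // value; all other scores pass through unchanged.
  out[idx] = (src_ids[idx] == pad_id) ? mask_value : attn[idx];
}

int main() {
  const int rows = 2, seq_len = 4, total = rows * seq_len;
  const int64_t pad_id = 0;          // illustrative: 0 marks padding here
  const float mask_value = -9e4f;    // illustrative large negative fill value

  float h_attn[total] = {0.1f, 0.2f, 0.3f, 0.4f, 0.5f, 0.6f, 0.7f, 0.8f};
  int64_t h_ids[total] = {5, 7, 0, 0, 9, 0, 0, 0};
  float h_out[total];

  float *d_attn, *d_out;
  int64_t* d_ids;
  cudaMalloc(&d_attn, sizeof(h_attn));
  cudaMalloc(&d_ids, sizeof(h_ids));
  cudaMalloc(&d_out, sizeof(h_out));
  cudaMemcpy(d_attn, h_attn, sizeof(h_attn), cudaMemcpyHostToDevice);
  cudaMemcpy(d_ids, h_ids, sizeof(h_ids), cudaMemcpyHostToDevice);

  const int threads = 128;
  const int blocks = (total + threads - 1) / threads;
  attention_padding_mask_sketch<<<blocks, threads>>>(d_attn, d_ids, d_out, total, pad_id, mask_value);
  cudaMemcpy(h_out, d_out, sizeof(h_out), cudaMemcpyDeviceToHost);

  for (int i = 0; i < total; ++i) printf("%g ", h_out[i]);
  printf("\n");

  cudaFree(d_attn);
  cudaFree(d_ids);
  cudaFree(d_out);
  return 0;
}
```

The x86 kernel would compute the same element-wise masking on the host; the unit tests compare kernel output against such a reference.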