Unverified commit caa0f377, authored by ShenLiang, committed by GitHub

fix codestyle (#56066)

Parent 6131aebc
@@ -687,7 +686,6 @@
   kernel :
     func : flash_attn
     data_type : q
-  intermediate : softmax_lse, seed_offset
   backward : flash_attn_grad
 - op : flash_attn_unpadded
@@ -712,7 +711,6 @@
   kernel :
     func : flash_attn_v1
     data_type : q
-  intermediate : softmax_lse, seed_offset
   backward : flash_attn_v1_grad
 - op : flash_attn_v1_unpadded
......
@@ -92,7 +92,7 @@ def flash_attention(
     """
     if in_dynamic_mode():
         if g_use_flash_attn_v1:
-            (result_attention, result_softmax,) = _C_ops.flash_attn_v1(
+            (result_attention, result_softmax, _, _) = _C_ops.flash_attn_v1(
                 query,
                 key,
                 value,
@@ -101,8 +101,9 @@ def flash_attention(
                 return_softmax,
                 not training,
             )
         else:
-            (result_attention, result_softmax,) = _C_ops.flash_attn(
+            (result_attention, result_softmax, _, _) = _C_ops.flash_attn(
                 query,
                 key,
                 value,
......
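
For context, here is a minimal usage sketch of the public `flash_attention` wrapper whose internals are touched above; it is not part of the diff. It assumes the 2.5-era signature `flash_attention(query, key, value, dropout=0.0, causal=False, return_softmax=False, ...)` from `paddle.nn.functional.flash_attention`, a CUDA build of Paddle with FlashAttention support, and float16 inputs shaped `[batch, seq_len, num_heads, head_dim]`; names and defaults may differ in other versions. The change itself only affects how many outputs are unpacked from `_C_ops.flash_attn` / `_C_ops.flash_attn_v1` after the `intermediate` markers for `softmax_lse` and `seed_offset` were dropped from the op definitions; the values returned to callers are unchanged.

```python
# Hedged sketch, not part of the commit: exercising the public wrapper whose
# internal _C_ops unpacking changes above. Assumes a CUDA build of Paddle with
# FlashAttention support and the 2.5-era API; parameter names are taken from
# the surrounding file and may differ in other versions.
import paddle
from paddle.nn.functional.flash_attention import flash_attention

# FlashAttention expects [batch, seq_len, num_heads, head_dim] in fp16/bf16.
q = paddle.randn([2, 128, 8, 64]).astype("float16")
k = paddle.randn([2, 128, 8, 64]).astype("float16")
v = paddle.randn([2, 128, 8, 64]).astype("float16")

# The wrapper still returns (output, softmax); softmax is only meaningful when
# return_softmax=True. The extra `_, _` outputs now unpacked from
# _C_ops.flash_attn (softmax_lse, seed_offset) are discarded inside the wrapper.
out, softmax = flash_attention(q, k, v, dropout=0.0, causal=True, return_softmax=False)
print(out.shape)  # [2, 128, 8, 64]
```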