Unverified commit caa0f377, authored by ShenLiang, committed by GitHub

fix codestyle (#56066)

Parent 6131aebc
@@ -687,7 +687,6 @@
   kernel :
     func : flash_attn
     data_type : q
-  intermediate : softmax_lse, seed_offset
   backward : flash_attn_grad

- op : flash_attn_unpadded
@@ -712,7 +711,6 @@
   kernel :
     func : flash_attn_v1
     data_type : q
-  intermediate : softmax_lse, seed_offset
   backward : flash_attn_v1_grad

- op : flash_attn_v1_unpadded
...
@@ -92,7 +92,7 @@ def flash_attention(
     """
     if in_dynamic_mode():
        if g_use_flash_attn_v1:
-            (result_attention, result_softmax,) = _C_ops.flash_attn_v1(
+            (result_attention, result_softmax, _, _) = _C_ops.flash_attn_v1(
                query,
                key,
                value,
@@ -101,8 +101,9 @@ def flash_attention(
                return_softmax,
                not training,
            )
        else:
-            (result_attention, result_softmax,) = _C_ops.flash_attn(
+            (result_attention, result_softmax, _, _) = _C_ops.flash_attn(
                query,
                key,
                value,
...
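The Python-side change follows from the YAML change above: with `softmax_lse` and `seed_offset` no longer marked as `intermediate` outputs, the generated op appears to return four values instead of two, so the caller unpacks all four and discards the extras with `_`. A minimal sketch of that unpacking pattern, using a hypothetical stand-in function rather than the real `_C_ops.flash_attn`:

```python
# Hypothetical stand-in for _C_ops.flash_attn: returns four outputs
# (out, softmax, softmax_lse, seed_offset), mirroring the updated op.
def flash_attn_stub(q, k, v):
    out = q + k + v          # placeholder for the attention output
    softmax = 0.5            # placeholder for the returned softmax
    softmax_lse = -1.0       # extra output, previously an intermediate
    seed_offset = (0, 0)     # extra output, previously an intermediate
    return out, softmax, softmax_lse, seed_offset


# The caller keeps the first two outputs and discards the rest,
# exactly as the diff does with `(result_attention, result_softmax, _, _)`.
(result_attention, result_softmax, _, _) = flash_attn_stub(1, 2, 3)
print(result_attention)  # 6
```

If the call site unpacked only two values here, Python would raise `ValueError: too many values to unpack`, which is why the tuple pattern had to grow along with the op signature.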