- 19 May 2023, 1 commit

Submitted by limingshu
* Reorganize the forward codes of flash-attention.
* Fix forward.
* Remove some unused codes.
* Simplify codes and fix backward.
* Change all LOG(INFO) to VLOG and fix the backward.
* Add scale for AF2 flash_attn; many thanks to xreki and shaojie for debugging these codes.
* Decrease the effect of debug prints on performance.
* Unify the initialization of flashattn arguments.
* Rewrite the reshape of temp_mask and temp_bias.
* API support for use_flash_attn.
* Fix compiling error on CI.
* Try to crop the flash-attention lib.
* Correct the condition for whether flash-attn can be used.
* Remove the softmax_out argument.
* Remove is_causal.
* Polish codes.
* Fix qkv_transpose_out's shape and scaling of Q * K.
* Update commit of flash-attention.

Co-authored-by: Liu Yiqun <liuyiqun01@baidu.com>
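The "scaling of Q * K" item above refers to the standard 1/sqrt(head_dim) factor applied to attention scores. Below is a minimal NumPy sketch of the unfused computation that a flash-attention kernel fuses; the function name, argument names, and shapes are illustrative assumptions, not the Paddle API.

```python
import numpy as np

def attention_reference(q, k, v, bias=None, mask=None):
    """Plain scaled-dot-product attention, i.e. the math a fused flash-attention kernel computes.

    q, k, v: [batch, num_heads, seq_len, head_dim] (illustrative layout)
    bias:    optional additive bias broadcastable to [batch, num_heads, seq_q, seq_k]
    mask:    optional boolean mask, True marks positions to keep
    """
    head_dim = q.shape[-1]
    # The commit's "scaling of Q * K": scores are scaled by 1/sqrt(head_dim).
    scores = np.einsum("bhqd,bhkd->bhqk", q, k) / np.sqrt(head_dim)
    if bias is not None:
        scores = scores + bias
    if mask is not None:
        scores = np.where(mask, scores, -1e9)
    # Numerically stable softmax over the key dimension.
    scores = scores - scores.max(axis=-1, keepdims=True)
    probs = np.exp(scores)
    probs = probs / probs.sum(axis=-1, keepdims=True)
    return np.einsum("bhqk,bhkd->bhqd", probs, v)
```

At the API level, a boolean switch like the use_flash_attn flag mentioned above would select between the fused kernel and an unfused composition equivalent to this reference, subject to whatever dtype and shape conditions the commit tightens ("Correct the condition for whether flash-attn can be used").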
- 07 December 2022, 1 commit

Submitted by 张春乔
- 28 September 2022, 1 commit

Submitted by Chen Weihang
* remove needless using tensor
* remove needless using tensor
* resolve conflict
* replace tensor using
* fix format error
* revert needless changing
* fix rocm and npu compile error
* fix cinn compile error
* fix format error
* fix mkldnn format error
* fix mkldnn format error
* fix cinn compile error
* fix cinn compile error
* fix cinn compile error
* resolve conflict
- 15 September 2022, 1 commit

Submitted by Nyakku Shigure
- 19 July 2022, 1 commit

Submitted by Ruibiao Chen
* Rename BOOST_GET macros
* Fix conflicts
- 26 June 2022, 1 commit

Submitted by Sing_chan
- 15 June 2022, 1 commit

Submitted by Yiqun Liu
* Optimize prod's python implementation for dygraph.
* Change key_dim to head_dim.
* Add comment in unittest.
* Disable TF32 in unittest.
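A note on "Disable TF32 in unittest": on Ampere GPUs, float32 matmuls may silently run in TF32 (roughly 10 mantissa bits), so a fused op can drift from an fp32 NumPy reference by more than typical unit-test tolerances. The following is only a sketch of that comparison pattern, not the actual test: `run_on_device` is a hypothetical stand-in for the operator under test, and the real change may use a framework-specific switch rather than NVIDIA's environment override.

```python
import os
import numpy as np

# One generic way to rule out TF32 on Ampere GPUs: NVIDIA's override must be set
# before the CUDA context is created. The unittest in the commit may instead use
# a framework-level switch; this line only illustrates the idea.
os.environ.setdefault("NVIDIA_TF32_OVERRIDE", "0")

def check_matmul(run_on_device, a, b, rtol=1e-5, atol=1e-5):
    """Compare a device matmul against an fp32 NumPy reference.

    With TF32 enabled, device matmuls keep only ~10 mantissa bits, so
    tight fp32 tolerances like these can fail spuriously.
    """
    expected = a.astype(np.float32) @ b.astype(np.float32)
    np.testing.assert_allclose(run_on_device(a, b), expected, rtol=rtol, atol=atol)

# Trivial CPU usage example; a real test would pass the fused GPU op here.
a = np.random.rand(64, 64).astype(np.float32)
b = np.random.rand(64, 64).astype(np.float32)
check_matmul(lambda x, y: x @ y, a, b)
```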
- 08 June 2022, 1 commit

Submitted by Yiqun Liu
* Polish codes and memory usage for fused_gate_attention.
* Fix wrong reduce_dims in fused_gate_attention when computing gradient of nonbatched_bias.
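The reduce_dims fix concerns a standard broadcasting rule: when an additive bias (here, nonbatched_bias) is broadcast across some dimensions in the forward pass, its gradient must be summed over every axis that broadcasting expanded. A minimal NumPy sketch of that reduction follows; the shapes are assumed for illustration and are not taken from the fused_gate_attention kernel.

```python
import numpy as np

def bias_grad(d_scores, bias_shape):
    """Gradient of an additive bias that was broadcast in the forward pass.

    d_scores:   upstream gradient of the attention scores the bias was added to
    bias_shape: the (smaller) shape the bias actually has

    Every axis the forward broadcast expanded must be summed over here; picking
    the wrong reduce_dims yields a gradient with the wrong shape or wrong values.
    """
    d_bias = d_scores
    # Sum over leading axes the bias does not have at all.
    while d_bias.ndim > len(bias_shape):
        d_bias = d_bias.sum(axis=0)
    # Sum over axes where the bias has size 1 but the scores do not.
    for axis, size in enumerate(bias_shape):
        if size == 1 and d_bias.shape[axis] != 1:
            d_bias = d_bias.sum(axis=axis, keepdims=True)
    return d_bias

# Hypothetical shapes: scores [batch, heads, q, k], bias broadcast over batch.
d_scores = np.random.rand(4, 8, 16, 16).astype(np.float32)
print(bias_grad(d_scores, (1, 8, 16, 16)).shape)  # -> (1, 8, 16, 16)
```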
- 05 June 2022, 1 commit

Submitted by Sing_chan
- 30 May 2022, 1 commit

Submitted by crystal