- 05 Jun 2023, 1 commit
Submitted by Wang Xin
* third-party lib offline compilation support for mkldnn, flashattn and gtest
* fix bug
* ignore dirty
- 19 May 2023, 1 commit
Submitted by limingshu
* Reorganize the forward codes of flash-attention.
* Fix forward.
* Remove some unused codes.
* Simplify codes and fix backward.
* Change all LOG(INFO) to VLOG and fix the backward.
* Add scale for AF2 flash_attn; many thanks to xreki and shaojie for debugging these codes.
* Decrease the effect of debug print on performance.
* Unify the initialization of flashattn arguments.
* Rewrite the reshape of temp_mask and temp_bias.
* API support for use_flash_attn.
* Fix compiling error on CI.
* Try to crop the flash-attention lib.
* Correct the condition of whether flash-attn can be used.
* Remove the softmax_out argument.
* Remove is_causal.
* Polish codes.
* Fix qkv_transpose_out's shape and the scaling of Q * K (see the sketch after this entry).
* Update commit of flash-attention.

Co-authored-by: Liu Yiqun <liuyiqun01@baidu.com>
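The "add scale for AF2 flash_attn" and "scaling of Q * K" items above refer to the standard 1/sqrt(head_dim) factor applied to the attention scores before the softmax. The NumPy sketch below is only a reference illustration of what the fused kernel computes; the tensor layout and the `attn_bias` argument are assumptions for clarity, not Paddle's actual kernel or API.

```python
# Minimal NumPy reference for the math behind the fused flash-attention kernel.
# Illustrative sketch only: layouts and the `attn_bias` name are assumptions.
import numpy as np

def reference_attention(q, k, v, attn_bias=None):
    """q, k, v: [batch, num_heads, seq_len, head_dim]."""
    head_dim = q.shape[-1]
    scale = 1.0 / np.sqrt(head_dim)                 # the "scaling of Q * K"
    scores = np.einsum("bhqd,bhkd->bhqk", q, k) * scale
    if attn_bias is not None:                       # AF2-style additive bias / mask
        scores = scores + attn_bias
    scores = scores - scores.max(axis=-1, keepdims=True)   # numerical stability
    probs = np.exp(scores)
    probs = probs / probs.sum(axis=-1, keepdims=True)       # softmax over keys
    return np.einsum("bhqk,bhkd->bhqd", probs, v)
```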
- 20 Apr 2023, 1 commit
Submitted by Chitsing KUI
* add flash randomness control (see the sketch after this entry)
* fix VLOG undefined
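The "flash randomness control" item above concerns the RNG seed/offset pair recorded for dropout inside the attention kernel (the "seed", "fix rng reset", and "fix offset" items in the 01 Mar 2023 entry touch the same mechanism). The sketch below only illustrates the idea in NumPy: with a fixed (seed, offset), the dropout mask can be regenerated in the backward pass instead of being stored with the activations. Function names and shapes here are assumptions, not part of the kernel.

```python
# Illustrative sketch (not Paddle's kernel): why a seed/offset pair is tracked.
# The same (seed, offset) always yields the same dropout mask, so the backward
# pass can regenerate the mask instead of saving it from the forward pass.
import numpy as np

def dropout_mask(shape, p, seed, offset):
    rng = np.random.default_rng([seed, offset])     # counter-style seeding
    return (rng.random(shape) >= p).astype(np.float32) / (1.0 - p)

mask_fwd = dropout_mask((4, 4), p=0.1, seed=2023, offset=0)
mask_bwd = dropout_mask((4, 4), p=0.1, seed=2023, offset=0)  # regenerated
assert np.array_equal(mask_fwd, mask_bwd)
```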
- 29 Mar 2023, 1 commit
Submitted by chalsliu
* Fix flashattn build error on Jetson
* Fix nvcc not found on Jetson
- 01 Mar 2023, 1 commit
Submitted by Chitsing KUI
* flash attn
* seed
* almost
* softmax
* fix workspace
* add unit test; Linux only
* fix setup
* fix datatype include
* fix setup typo
* fix def scope
* new error api
* use paddle fork
* fix attr bug; complete ut
* update flash hash
* fix rng reset
* fix offset
* fix comments