1. 10 Aug, 2023: 6 commits
  2. 09 Aug, 2023: 17 commits
  3. 08 Aug, 2023: 16 commits
  4. 07 Aug, 2023: 1 commit
    • Add attn_mask supported for FlashAttnKernel. (#55969) · 42e0c6b8
      Committed by yin wei
      (A minimal sketch of the additive attention-mask semantics follows at the end of this entry.)
      * add mask
      * add backward
      * add enforce info
      * update scale
      * integrate code
      * update enforce
      * add enforce eq
      * add error type
      * update enforce
      * add test_flash_attention
      * Polish codes and fix compiling errors.
      * Set num_splits to 0 for flash-attn with tensor mask.
      * Fix the compiling error for the non-flash-attn case.
      ---------
      Co-authored-by: Liu Yiqun <liuyiqun01@baidu.com>
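      For context on what this commit adds: an attn_mask is applied additively to the pre-softmax attention scores of softmax(QKᵀ/√d)V. The block below is a minimal plain-NumPy sketch of that additive-mask semantics, assuming a [batch, heads, seq_len, head_dim] layout; the function name masked_sdp_attention and all shapes are illustrative assumptions, and this is not the FlashAttnKernel code from #55969.

      ```python
      import numpy as np

      def masked_sdp_attention(q, k, v, attn_mask=None):
          # q, k, v: [batch, heads, seq_len, head_dim]
          # attn_mask: broadcastable to [batch, heads, seq_len_q, seq_len_k];
          # 0.0 where attention is allowed, a large negative value where masked.
          scale = 1.0 / np.sqrt(q.shape[-1])
          scores = np.einsum("bhqd,bhkd->bhqk", q, k) * scale
          if attn_mask is not None:
              scores = scores + attn_mask                  # additive mask before softmax
          scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
          weights = np.exp(scores)
          weights /= weights.sum(axis=-1, keepdims=True)
          return np.einsum("bhqk,bhkd->bhqd", weights, v)

      # Usage: mask out the last key position for every query.
      b, h, s, d = 1, 2, 4, 8
      rng = np.random.default_rng(0)
      q = rng.standard_normal((b, h, s, d)).astype(np.float32)
      k = rng.standard_normal((b, h, s, d)).astype(np.float32)
      v = rng.standard_normal((b, h, s, d)).astype(np.float32)
      mask = np.zeros((1, 1, s, s), dtype=np.float32)
      mask[..., -1] = -1e9
      out = masked_sdp_attention(q, k, v, mask)
      print(out.shape)  # (1, 2, 4, 8)
      ```

      A fused flash-attention kernel computes the same result without materializing the full score matrix, which is why mask support has to be handled inside the kernel rather than by the caller.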