1. 22 May 2023, 1 commit
    • [dygraph] unify _non_static_mode(), in_dygraph_mode() and in_dynamic_mode() (#53856) · 3794d171
      Committed by Meteor Liu
      * [dygraph] unify _non_static_mode(), in_dygraph_mode() and in_dynamic_mode()
      
      * fixed cyclic reference that caused partial import
      
      * fixed bad change
      
      * fix bad import
      
      * fix unit-test failures caused by the in_dynamic_mode change
      
      * fixed usage of in_dynamic_mode() or in_dygraph_mode()
      
      * revert python3 to python in .pre-commit-config.yaml
      
      * fix merge conflicts
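      The change above makes paddle.in_dynamic_mode() the single public check for eager (dynamic-graph) execution, replacing ad-hoc calls to the internal _non_static_mode() and in_dygraph_mode() helpers. A minimal sketch of how user code typically branches on this check (assuming a recent Paddle build; the describe_mode helper below is illustrative and not part of the PR):

      ```python
      import paddle

      def describe_mode():
          # paddle.in_dynamic_mode() reports whether Paddle is executing eagerly
          # (dynamic graph) rather than building a static graph.
          if paddle.in_dynamic_mode():
              return "dynamic (eager) mode"
          return "static graph mode"

      print(describe_mode())      # dynamic mode is the default since Paddle 2.0
      paddle.enable_static()      # switch to static graph mode
      print(describe_mode())
      paddle.disable_static()     # switch back to dynamic mode
      ```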
  2. 19 May 2023, 1 commit
    • Add flash attention to speed up fused_gate_attention. (#52731) · d29c1f8e
      Committed by limingshu
      * Reorganize the forward code of flash-attention.
      
      * Fix forward.
      
      * Remove some unused code.
      
      * Simplify the code and fix backward.
      
      * Change all LOG(INFO) to VLOG and fix the backward.
      
      * add scale for AF2 flash_attn, many thanks to xreki and shaojie for debugging this code
      
      * decrease the effect of debug print on performance
      
      * Unify the initialization of flashattn arguments.
      
      * Rewrite the reshape of temp_mask and temp_bias.
      
      * Add use_flash_attn support to the API.
      
      * Fix compilation error on CI.
      
      * Try to crop the flash-attention lib.
      
      * Correct the condition for whether flash-attn can be used.
      
      * Remove the softmax_out argument.
      
      * Remove is_causal.
      
      * Polish the code.
      
      * Fix qkv_transpose_out's shape and scaling of Q * K.
      
      * Update the commit of flash-attention.
      
      ---------
      Co-authored-by: Liu Yiqun <liuyiqun01@baidu.com>
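      For reference, the math that fused_gate_attention computes, and that the new flash-attention path accelerates by avoiding materializing the full softmax(QK^T) matrix, can be sketched in plain Paddle as below. This is a hypothetical reference implementation, not the fused op's actual code; the use_flash_attn switch named in the commits is assumed to toggle the optimized path inside the fused op.

      ```python
      import paddle
      import paddle.nn.functional as F

      def gate_attention_reference(q, k, v, bias=None):
          """Plain (non-fused) scaled dot-product attention for comparison.

          q, k, v: [batch, num_heads, seq_len, head_dim]
          bias:    optional additive attention bias, broadcastable to
                   [batch, num_heads, seq_len, seq_len]
          """
          scale = q.shape[-1] ** -0.5                        # 1/sqrt(head_dim), applied to Q
          scores = paddle.matmul(q * scale, k, transpose_y=True)
          if bias is not None:
              scores = scores + bias
          probs = F.softmax(scores, axis=-1)                 # full attention matrix materialized here
          return paddle.matmul(probs, v)

      q = paddle.randn([1, 4, 128, 32])
      k = paddle.randn([1, 4, 128, 32])
      v = paddle.randn([1, 4, 128, 32])
      print(gate_attention_reference(q, k, v).shape)         # [1, 4, 128, 32]
      ```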
  3. 06 May 2023, 1 commit