1. 03 Jul 2023 · 1 commit
  2. 30 Jun 2023 · 1 commit
  3. 28 Jun 2023 · 1 commit
  4. 26 Jun 2023 · 1 commit
  5. 20 Jun 2023 · 2 commits
  6. 19 Jun 2023 · 1 commit
  7. 16 Jun 2023 · 1 commit
    • [kunlun] support xpu runtime profiler (#54685) · 82eeda69
      jameszhang committed
      * [kunlun] support xpu runtime profiler
      
      * fix cmake error
      
      * add libxpti.so to paddle package
      
      * fix for style check
      
      * sync change in setup.py and python/setup.py.in
      
      * remove libxpti.so from paddle output dir in this PR
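      For context on the packaging steps above ("add libxpti.so to paddle package", "sync change in setup.py and python/setup.py.in"): bundling a prebuilt vendor runtime into a Python wheel is commonly done via package_data. A minimal hedged sketch, with hypothetical package and path names (not this PR's actual diff):

        # Illustrative sketch only; package and path names are hypothetical.
        # Ships a prebuilt shared library (e.g. libxpti.so) inside the wheel
        # so it installs alongside the Python package.
        from setuptools import setup

        setup(
            name="paddle-xpu-demo",          # hypothetical package name
            version="0.0.1",
            packages=["paddle_xpu_demo"],
            package_data={
                # copy the runtime library into the installed package dir
                "paddle_xpu_demo": ["libs/libxpti.so"],
            },
            include_package_data=True,
        )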
  8. 15 Jun 2023 · 2 commits
  9. 14 Jun 2023 · 1 commit
  10. 09 Jun 2023 · 1 commit
  11. 08 Jun 2023 · 2 commits
  12. 02 Jun 2023 · 2 commits
  13. 01 Jun 2023 · 1 commit
  14. 26 May 2023 · 1 commit
    • [PHI Decoupling]Create PHI shared lib (#53735) · da50a009
      YuanRisheng committed
      * create phi so
      
      * fix ci bugs
      
      * fix py3 bugs
      
      * add file
      
      * fix py3 bugs
      
      * fix windows bugs
      
* perfect the .so
      
      * fix py3 bugs
      
      * delete all static target in phi
      
      * fix windows bugs
      
      * fix py3 bugs
      
      * fix ci bugs
      
      * fix windows bugs
      
* fix bug: gflags can't be linked by both dynamic and static libs
      
* fix bugs that prevented loading 3rd-party libs
      
      * fix ci bugs
      
      * fix compile bugs
      
      * fix py3 bugs
      
      * fix conflict
      
      * fix xpu bugs
      
      * fix mac compile bugs
      
      * fix psgpu bugs
      
* fix inference failure
      
      * deal with conflict
      
      * fix LIBRARY_PATH bug
      
      * fix windows bugs
      
      * fix onednn error
      
      * fix windows compile bugs
      
      * fix windows compile bugs
      
      * fix test_cuda_graph_static_mode_error aborted
      
      * fix windows bugs
      
      * fix mac-python3 error
      
      * fix hip compile bugs
      
      * change mode to static
      
      * change to static mode
      
      * fix ci bugs
      
      * fix py3 bugs
      
      * fix windows bugs
      
      * fix bugs
      
      * add static flag
      
      * add PADDLE_API
      
      * change position of PADDLE_API
      
      * fix windows bugs
      
      * change mode to dynamic lib
      
      * fix windows static bugs
      
      * deal with conflict
      
      * fix windows unit bug
      
      * fix coverage
      
      * deal with conflict
      
      * fix windows-inference
      
      * fix py3 bugs
      
      * fix bugs when compile type_info
      
      * fix compile bugs
      
      * fix py3 bugs
      
      * fix windows bugs
      
      * fix windows openblas
      
      * fix xpu bugs
      
      * fix enforce_test in windows
      
* update code according to review comments
      
      * fix windows cmake bug
      
      * fix windows bugs
      
      * fix windows bugs
      
      * delete cinn unittest
      
      * fix cinn bugs
      
      ---------
Co-authored-by: lzydev <1528794076@qq.com>
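      Several of the commits above ("fix bugs that prevented loading 3rd-party libs", "fix LIBRARY_PATH bug") deal with locating the new shared library at runtime. A minimal hedged sketch of the usual pattern, with hypothetical file layout and names (not phi's actual loader code):

        # Illustrative sketch only; paths and library names are hypothetical.
        import ctypes
        import os

        _pkg_dir = os.path.dirname(os.path.abspath(__file__))
        _lib = os.path.join(_pkg_dir, "libs", "libphi.so")  # hypothetical path

        # Loading by absolute path avoids relying on LD_LIBRARY_PATH, a common
        # source of "cannot load 3rd party" failures; RTLD_GLOBAL makes the
        # library's symbols visible to extensions loaded afterwards.
        ctypes.CDLL(_lib, mode=ctypes.RTLD_GLOBAL)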
  15. 25 May 2023 · 1 commit
  16. 24 May 2023 · 3 commits
  17. 23 May 2023 · 2 commits
  18. 22 May 2023 · 2 commits
  19. 19 May 2023 · 2 commits
    • [XPU] fix fallback (#53801) · 4b85e5db
      wz1qqx committed
    • Add flash attention to speedup fused_gate_attention. (#52731) · d29c1f8e
      limingshu committed
* Reorganize the forward code of flash-attention.
      
      * Fix forward.
      
* Remove some unused code.
      
* Simplify code and fix backward.
      
      * Change all LOG(INFO) to VLOG and fix the backward.
      
* Add scale for AF2 flash_attn; many thanks to xreki and shaojie for debugging this code.
      
* Decrease the effect of debug printing on performance.
      
      * Unify the initialize of flashattn arguments.
      
* Rewrite the reshape of temp_mask and temp_bias.
      
* Support use_flash_attn in the API.
      
      * Fix compiling error on CI.
      
      * Try to crop the flash-attention lib.
      
* Correct the condition for whether flash-attn can be used.
      
      * Remove the softmax_out argument.
      
      * Remove is_causal.
      
* Polish code.
      
      * Fix qkv_transpose_out's shape and scaling of Q * K.
      
      * Update commit of flash-attention.
      
      ---------
Co-authored-by: Liu Yiqun <liuyiqun01@baidu.com>
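      For reference, the "scaling of Q * K" that the last commits adjust is the standard 1/sqrt(head_dim) factor of scaled dot-product attention, which flash-attention computes in a fused, memory-efficient kernel. A minimal NumPy sketch of the reference semantics only (not the fused kernel this PR wires up; the bias argument stands in for temp_mask/temp_bias):

        import numpy as np

        def attention_reference(q, k, v, bias=None):
            # q, k, v: [batch, heads, seq_len, head_dim]; bias (optional)
            # broadcasts over the [batch, heads, seq_len, seq_len] scores.
            head_dim = q.shape[-1]
            scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(head_dim)  # scale Q*K
            if bias is not None:
                scores = scores + bias
            # numerically stable softmax over the key axis
            scores = scores - scores.max(axis=-1, keepdims=True)
            weights = np.exp(scores)
            weights /= weights.sum(axis=-1, keepdims=True)
            return weights @ v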
  20. 18 May 2023 · 2 commits
  21. 15 May 2023 · 2 commits
  22. 12 May 2023 · 2 commits
  23. 11 May 2023 · 4 commits
  24. 10 May 2023 · 1 commit
  25. 09 May 2023 · 1 commit