1. 20 Mar 2023, 3 commits
    • 【prim】New layer_norm grad (#51750) · 802a81d0
      Committed by xiaoguoguo626807
      * Add flatten composite rule
      
      * get the right xshape and pass func test
      
      * add cinn unit test
      
      * Remove cinn test, wait for it to be added after repair
      
      * add comp test to test_flatten_contiguous_range_op.py
      
      * remove func test on composite_ops
      
      * Add comments to maybe_wrap_dim func
      
      * remove commented code
      
      * fix the problem with 0D tensor case
      
      * add flatten split rule comment
      
      * fix syntax issues
      
      * block flatten on resnet_prim_cinn
      
      * init change
      
      * tmp commit
      
      * add layer_norm InferMeta check
      
      * cast type modify
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * recover
      
      * big tol
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * Cxx prim custom vjp (#8)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * [dy2static-ci] fix dy2static ci errors.
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [Prim] enable whitelist and blacklist for custom_vjp
      
      * debug log
      
      * clear log
      
      * fix
      
      * nothing
      
      * less memory
      
      * recover utils
      
      * fix
      
      * modify threshold value
      
      * skip layer_norm for test_bert
      
      * back to bert success state
      
      * add epsilon
      
      * delete unnecessary compute
      
      * modify amp dtype
      
      * modify * order
      
      * delete sqrt check and fp16
      
      ---------
      Co-authored-by: xuyongsheng <xuyongsheng@baidu.com>
      Co-authored-by: xysheng-baidu <121540080+xysheng-baidu@users.noreply.github.com>
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      Co-authored-by: xiongkun <807377414@qq.com>
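For context on what a composite layer_norm gradient rule computes, here is a minimal NumPy sketch expressing layer_norm and its input gradient through primitive ops, assuming normalization over the last axis only; the function names and layout are illustrative assumptions, not Paddle's actual composite implementation.

```python
# Hedged sketch: layer_norm forward and input gradient written with primitive
# NumPy ops, normalizing over the last axis. Illustrative only; NOT Paddle's
# actual composite rule.
import numpy as np

def layer_norm_fwd(x, gamma, beta, eps=1e-5):
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    rstd = 1.0 / np.sqrt(var + eps)          # reciprocal std, reused in the grad
    x_hat = (x - mean) * rstd
    return x_hat * gamma + beta, x_hat, rstd

def layer_norm_grad_x(dy, x_hat, rstd, gamma):
    # dL/dx = rstd * (g - mean(g) - x_hat * mean(g * x_hat)), with g = dy * gamma
    g = dy * gamma
    m1 = g.mean(axis=-1, keepdims=True)
    m2 = (g * x_hat).mean(axis=-1, keepdims=True)
    return rstd * (g - m1 - x_hat * m2)

x = np.random.randn(4, 8).astype("float32")
gamma = np.ones(8, dtype="float32")
beta = np.zeros(8, dtype="float32")
y, x_hat, rstd = layer_norm_fwd(x, gamma, beta)
dx = layer_norm_grad_x(np.ones_like(y), x_hat, rstd, gamma)
print(y.shape, dx.shape)
```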
    • 52e1742f
    • Fix unsqueeze with empty axis bug (#51828) · 7a79fd88
      Committed by zhouweiwei2014
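A minimal usage-level sketch of the behavior the fix title suggests, assuming that after the fix an empty axis list is treated as a no-op on the shape (the expected output is an assumption, not taken from the commit):

```python
# Hedged example: assumes the fixed behavior is that an empty axis list
# inserts no dimensions and leaves the shape unchanged.
import paddle

x = paddle.rand([3, 4])
y = paddle.unsqueeze(x, axis=[])   # no axes to insert
print(x.shape, y.shape)            # expected: [3, 4] [3, 4]
```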
  2. 16 Mar 2023, 2 commits
  3. 15 Mar 2023, 2 commits
    • add output defs for eig kernel (#51319) · 5cb95856
      Committed by Infinity_lee
      * fix eig
      
      * fix
      
      * fix
      
      * fix
      
      * fix
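eig is a kernel whose output dtypes differ from its input dtype (real input, complex eigenvalues and eigenvectors), which is presumably why explicit output definitions are needed; the sketch below only illustrates that dtype relationship from the Python API side and is not the kernel-level change itself.

```python
# Hedged illustration: eig on a real matrix yields complex outputs, so the
# kernel's output dtypes cannot simply mirror the input dtype.
import paddle

x = paddle.to_tensor([[1.0, 2.0], [3.0, 4.0]])
vals, vecs = paddle.linalg.eig(x)
print(x.dtype, vals.dtype, vecs.dtype)   # float32 input, complex64 outputs
```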
    • [PHI] remove operator.h in blas.h (rebase to latest codebase) (#51472) · 427712df
      Committed by iSerendipity
      * Revert "Revert "【Hackathon No.67】remove operator.h in blas.h (#50989)" (#51467)"
      
      This reverts commit b9d91531.
      
      * remove cout
      
      * add header
      
      * fix missing header
      
      * fix refer fluid error
      
      * fix missing header
      
      * Update repeat_interleave_grad_kernel_impl.h
      
      Change to phi style datatype.
      
      * Update repeat_interleave_grad_kernel_impl.h
      
      Fix missing header
      
      * datatype fluid -> phi
      
      * paddle::experimental -> phi
      
      * fix reference error
      
      * fix reference error
      
      * fix reference error
      
      * fix errors
      
      * fix missing FLAGS
      
      * fix missing headers
      
      * fix missing headers
      
      * fix missing headers
      
      * fix missing headers
      
      * fix missing header
      
      * fix missing header
      
      * fix errors
  4. 14 Mar 2023, 2 commits
  5. 13 Mar 2023, 3 commits
  6. 09 Mar 2023, 3 commits
  7. 08 Mar 2023, 1 commit
  8. 06 Mar 2023, 1 commit
  9. 03 Mar 2023, 1 commit
  10. 01 Mar 2023, 2 commits
  11. 27 Feb 2023, 1 commit
  12. 24 Feb 2023, 1 commit
  13. 23 Feb 2023, 1 commit
  14. 22 Feb 2023, 3 commits
  15. 18 Feb 2023, 1 commit
  16. 17 Feb 2023, 2 commits
  17. 16 Feb 2023, 2 commits
  18. 10 Feb 2023, 2 commits
  19. 09 Feb 2023, 1 commit
    • Add MultiTensorAdam OP (#49220) · 10654c77
      Committed by yuehuayingxueluo
      * add multi_tensor_adam
      
      * update multi_tensor_base.py, test_multi_tensor_adam.py, adamw.py
      
      * fix adam.py optimizer.py
      
      * fix adamw.py
      
      * fix test_multi_tensor_adam.py
      
      * fix CI bug
      
      * fix CI coverage
      
      * fix ci bug
      
      * fix betapow
      
      * fix some bugs
      
      * fix test_adamw_op.py
      
      * fix CI coverage
      
      * fix multi_tensor_adam_kernel.cc
      
      * fix CI bug
      
      * fix multi_tensor_adam_op.cc and test_multi_tensor_adam.py
      
      * fix code style
      
      * update C++ parts
      
      * remove python parts modification temporarily
      
      * add C++ ut
      
      * update betapow copy code logic
      
      * fix ci ut
      
      * fix windows ci
      
      * fix coverage ci
      
      * improve coverage rate
      
      ---------
      Co-authored-by: sneaxiy <sneaxiy@126.com>
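The commit adds an op that applies Adam to a list of parameters in a single call rather than one launch per parameter. Below is a minimal NumPy sketch of that idea using the standard Adam update; the function name, argument list, and in-place convention are illustrative assumptions, not the op's real interface.

```python
# Hedged sketch: one "multi-tensor" loop applying standard Adam to a list of
# parameters at once (illustrative only; not the kernel's real interface).
import numpy as np

def multi_tensor_adam(params, grads, ms, vs, step, lr=1e-3,
                      beta1=0.9, beta2=0.999, eps=1e-8):
    bias1 = 1.0 - beta1 ** step            # bias correction terms
    bias2 = 1.0 - beta2 ** step
    for p, g, m, v in zip(params, grads, ms, vs):
        m[:] = beta1 * m + (1.0 - beta1) * g          # first moment
        v[:] = beta2 * v + (1.0 - beta2) * g * g      # second moment
        p -= lr * (m / bias1) / (np.sqrt(v / bias2) + eps)

params = [np.ones(3), np.ones((2, 2))]
grads = [np.full(3, 0.1), np.full((2, 2), 0.2)]
ms = [np.zeros_like(p) for p in params]
vs = [np.zeros_like(p) for p in params]
multi_tensor_adam(params, grads, ms, vs, step=1)
print(params[0])
```

Fusing the per-parameter updates into a single pass is what cuts kernel-launch overhead when an optimizer has to update many small tensors.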
  20. 08 Feb 2023, 1 commit
  21. 07 Feb 2023, 1 commit
  22. 06 Feb 2023, 2 commits
  23. 01 Feb 2023, 2 commits