1. 31 March 2023, 2 commits
  2. 30 March 2023, 3 commits
  3. 28 March 2023, 5 commits
  4. 23 March 2023, 1 commit
  5. 22 March 2023, 3 commits
  6. 20 March 2023, 1 commit
    • 【prim】New layer_norm grad (#51750) · 802a81d0
      Committed by xiaoguoguo626807
      * Add flatten composite rule
      
      * get the right xshape and pass func test
      
      * add cinn unit test
      
      * Remove cinn test; it will be added back after the fix
      
      * add comp test to test_flatten_contiguous_range_op.py
      
      * remove func test on composite_ops
      
      * Add comments to maybe_wrap_dim func
      
      * remove commented code
      
      * fix the problem with 0D tensor case
      
      * add flatten split rule comment
      
      * fix syntax issues
      
      * block flatten on resnet_prim_cinn
      
      * init change
      
      * tmp commit
      
      * add layer_norm InferMeta check
      
      * cast type modify
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * recover
      
      * big tol
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * Cxx prim custom vjp (#8)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      ---------
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * [dy2static-ci] fix dy2static ci errors.
      
      ---------
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [Prim] enable whitelist and blacklist for custom_vjp
      
      * debug log
      
      * clear log
      
      * fix
      
      * nothing
      
      * less memory
      
      * recover utils
      
      * fix
      
      * modify threshold value
      
      * skip layer_norm for test_bert
      
      * back to bert success state
      
      * add epsilon
      
      * delete unnecessary compute
      
      * modify amp dtype
      
      * modify * order
      
      * delete sqrt check and fp16
      
      ---------
      Co-authored-by: xuyongsheng <xuyongsheng@baidu.com>
      Co-authored-by: xysheng-baidu <121540080+xysheng-baidu@users.noreply.github.com>
      Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
      Co-authored-by: jiangcheng <thisjiang@qq.com>
      Co-authored-by: cxxly <chenxx_id@163.com>
      Co-authored-by: xiongkun <807377414@qq.com>
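      As a rough illustration of what the composite layer_norm gradient above expresses, here is a hedged NumPy sketch of the standard decomposition into primitive ops. It is not Paddle's code; the function names, shapes and epsilon handling are assumptions.

      ```python
      import numpy as np

      def layer_norm_fwd(x, scale, bias, eps=1e-5):
          mean = x.mean(axis=-1, keepdims=True)
          var = x.var(axis=-1, keepdims=True)
          rstd = 1.0 / np.sqrt(var + eps)
          x_hat = (x - mean) * rstd
          return x_hat * scale + bias, x_hat, rstd

      def layer_norm_grad(dout, x_hat, rstd, scale):
          # Backward rebuilt from mul/mean/sub instead of a fused kernel.
          g = dout * scale
          dx = rstd * (g
                       - g.mean(axis=-1, keepdims=True)
                       - x_hat * (g * x_hat).mean(axis=-1, keepdims=True))
          dscale = (dout * x_hat).sum(axis=0)
          dbias = dout.sum(axis=0)
          return dx, dscale, dbias

      x = np.random.randn(4, 8).astype("float32")
      scale = np.ones(8, dtype="float32")
      bias = np.zeros(8, dtype="float32")
      out, x_hat, rstd = layer_norm_fwd(x, scale, bias)
      dx, dscale, dbias = layer_norm_grad(np.ones_like(out), x_hat, rstd, scale)
      ```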
  7. 17 March 2023, 1 commit
  8. 15 March 2023, 4 commits
  9. 14 March 2023, 1 commit
  10. 13 March 2023, 1 commit
    • 【prim】Maximum grad (#51006) · 4a484973
      Committed by heyanru
      * refresh
      
      * compat
      
      * register
      
      * testop
      
      * fix
      
      * fix
      
      * fix
      
      * cast
      
      * cast
      
      * fix
      
      * type
      
      * fix
      
      * out
      
      * cast
      
      * fix
      
      * fix
      
      * fix
      
      * broad
      
      * broad
      
      * broad
      
      * fix
      
      * fix
      
      * fix
      
      * fix
      
      * fix
      
      * broad
      
      * broad
      
      * numel
      
      * fix
      
      * fix
      
      * fix
      
      * fix
      
      * cinn
      
      * fix
      
      * fix
      
      * fix
      
      * fix
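      A hedged NumPy sketch of the kind of rule the maximum-grad commit above registers: mask by the comparison, cast the mask to the gradient dtype, and reduce broadcast axes back to each input's shape. The tie-breaking convention and the reduce helper are assumptions, not Paddle's implementation.

      ```python
      import numpy as np

      def reduce_to_shape(grad, shape):
          # Sum away axes that were introduced or expanded by broadcasting.
          while grad.ndim > len(shape):
              grad = grad.sum(axis=0)
          for axis, size in enumerate(shape):
              if size == 1 and grad.shape[axis] != 1:
                  grad = grad.sum(axis=axis, keepdims=True)
          return grad

      def maximum_grad(x, y, dout):
          mask = (x >= y).astype(dout.dtype)           # cast bool mask to grad dtype
          dx = reduce_to_shape(dout * mask, x.shape)   # gradient flows to the larger input
          dy = reduce_to_shape(dout * (1.0 - mask), y.shape)
          return dx, dy

      x = np.array([[1.0, 5.0], [3.0, 2.0]])
      y = np.array([2.0, 2.0])                         # broadcast against the rows of x
      dx, dy = maximum_grad(x, y, np.ones_like(x))
      ```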
  11. 10 March 2023, 1 commit
    • [New features]Add function node in phi_kernel for MKLDNN (#51073) · a0a6dc6a
      Committed by HappyHeavyRain
      * Add function node in phi_kernel for MKLDNN
      
      * fix the bug in 'BuildInferVarKernelContext'
      
      * add infer_varkernel_utils.cc
      
      * fix the bug: the first two parameters of 'BuildInferVarKernelContext' can't be template variables
      
      * change the code according to first review
      
      * change the code according to first review
      
      * change the mode of paddle_build.sh
      
      * change 'infer_var_kernel_fn_' to 'get_kerneltype_forvar_fn_'
      
      * add the error information
      
      * fix NotFound information warning

      * fix NotFound information warning

      * fix NotFound information warning
  12. 09 March 2023, 3 commits
  13. 02 March 2023, 1 commit
    • Add concat grad cinn (#50972) · a4689c90
      Committed by wangzhen38
      * [cinn] concat_grad
      
      * [cinn] concat_grad
      
      * [cinn] concat_grad build success
      
      * [Add PGLBOX] fix unittest

      * [Add PGLBOX] fix unittest
      
      * [Add PGLBOX] fix codestyle
      
      * [cinn] update by comments
      
      * [cinn] update by comment
      
      * [cinn] add axis check
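      For orientation, the gradient of concat is simply a split of the upstream gradient along the same axis; a minimal, hedged NumPy sketch of that relationship (an illustration, not the actual CINN lowering added above):

      ```python
      import numpy as np

      def concat_grad(inputs, dout, axis):
          sizes = [t.shape[axis] for t in inputs]
          boundaries = np.cumsum(sizes)[:-1]        # split points between the inputs
          return np.split(dout, boundaries, axis=axis)

      a, b = np.ones((2, 3)), np.ones((2, 5))
      dout = np.arange(16, dtype=float).reshape(2, 8)
      da, db = concat_grad([a, b], dout, axis=1)    # shapes (2, 3) and (2, 5)
      ```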
  14. 28 February 2023, 3 commits
    • 【prim】Matmul double grad composite api (#50452) · a0c473f4
      Committed by xiaoguoguo626807
      * modify name
      
      * merge develop
      
      * original code
      
      * build modify
      
      * success 2*2
      
      * fused dim=1 failed
      
      * success
      
      * modify static
      
      * success for static except dim=1
      
      * delete log
      
      * tmp modify
      
      * success
      
      * success
      
      * add fp1664
      
      * delete fp16 cpu test
      
      * stop windows test
      
      * review modify
      
      * modify tanh test
      
      * modify tanh
      
      * fix_conflixt
      
      * modift static prim
      
      * fix_conflict
      
      * Update test_static_prim.cc
      
      * update
      
      * bug fix
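      A hedged sketch of the algebra a composite matmul double grad has to express for Y = X @ W. Transpose flags, batching and broadcasting are omitted; this is an illustration of the math, not the API added above.

      ```python
      import numpy as np

      def matmul_grad(x, w, dy):
          # First-order rule: dX = dY W^T, dW = X^T dY.
          return dy @ w.T, x.T @ dy

      def matmul_double_grad(x, w, dy, ddx, ddw):
          # ddx / ddw are upstream gradients w.r.t. dX and dW.
          new_dx = dy @ ddw.T        # from the dW = X^T dY branch
          new_dw = ddx.T @ dy        # from the dX = dY W^T branch
          ddy = ddx @ w + x @ ddw    # gradient flowing to dY
          return new_dx, new_dw, ddy

      x, w = np.random.randn(2, 3), np.random.randn(3, 4)
      dy = np.random.randn(2, 4)
      ddx, ddw = np.random.randn(2, 3), np.random.randn(3, 4)
      print([t.shape for t in matmul_double_grad(x, w, dy, ddx, ddw)])
      ```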
    • add cumsum prim backward (#50565) · ca2b6095
      Committed by GGBond8488
      * add cumsum prim backward
      
      * skip aixs=None test case
      
      * fix op generate error
      
      * fix static test error
      
      * remove unused code
      
      * fix static test error
      
      * skip cpu float16 test case
      
      * skip eager cpu cumsum float16 test case
      
      * add cinn test
      
      * reshape flatten out
      
      * Disable cinn single test
      
      * remove cinn test
      
      * reformat todo
      
      * add prim in cumsum op test
      
      * remove old test
      
      * fix typo

      * fix typo

      * fix typo
      
      * pass axis=None test case
      
      * remove forward prim test
      
      * remove same name axis
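      The cumsum backward rule added above reduces to a cumulative sum of the upstream gradient taken in reverse order along the same axis. A minimal NumPy sketch of that identity (an illustration of the math, not the generated Paddle rule):

      ```python
      import numpy as np

      def cumsum_grad(dout, axis):
          rev = np.flip(dout, axis=axis)
          return np.flip(np.cumsum(rev, axis=axis), axis=axis)

      # cumsum routes x[i] into every output j >= i, so with dout of all ones
      # the gradient is [5, 4, 3, 2, 1].
      print(cumsum_grad(np.ones(5), axis=0))
      ```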
    • 【Prim】Reshape, transpose, cast vjp (#50778) · ab1b6303
      Committed by Jiabin Yang
      * support transpose and reshape
      
      * support reshpe, transpose, cast vjp
      
      * merge develop
      
      * recover unused file
      
      * remove prim base
      
      * support problem
      
      * remove additional status setting

      * remove additional status setting
      
      * fix ut
      
      * fix ut
      
      * fix ut
      
      * fix no grad branch
      
      * add more test
      
      * disable fp16 in cpu
      
      * fix test
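      The three VJP rules named in the title are each one-liners over primitive ops once the forward metadata is known; a hedged NumPy sketch (names and calling convention are assumptions, not Paddle's API):

      ```python
      import numpy as np

      def reshape_vjp(dout, x_shape):
          return dout.reshape(x_shape)                   # undo the reshape

      def transpose_vjp(dout, perm):
          return np.transpose(dout, np.argsort(perm))    # apply the inverse permutation

      def cast_vjp(dout, x_dtype):
          return dout.astype(x_dtype)                    # cast the gradient back

      # Forward chain reshape -> transpose -> cast; the VJPs run in reverse order.
      x = np.random.randn(2, 3, 4).astype("float16")
      y = np.transpose(x.reshape(6, 4), (1, 0)).astype("float32")
      dx = reshape_vjp(transpose_vjp(cast_vjp(np.ones_like(y), x.dtype), (1, 0)), x.shape)
      assert dx.shape == x.shape and dx.dtype == x.dtype
      ```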
  15. 25 February 2023, 1 commit
  16. 24 February 2023, 1 commit
    • 【prim】Slice grad (#50771) · f6dea800
      Committed by xiaoguoguo626807
      * support prim test in OpTest
      
      * fix cmake
      
      * fix op test
      
      * fix test_input_spec
      
      * disable cinn in reduce_sum unit test
      
      * add bfloat16 dtype for sum
      
      * add approve rules
      
      * polish code
      
      * add clear jit program function
      
      * convert grad out from tensor to numpy
      
      * remove unnecessary code
      
      * add only_prim flag
      
      * fix flag
      
      * fix op test
      
      * add attr
      
      * fix optest comp inplace error
      
      * fix op test
      
      * fix op test with guard
      
      * add initialization of check_comp flag
      
      * fix comp inplace error in op test
      
      * rename check_comp with check_prim and add bfloat16 dtype convert
      
      * rename comp_op_type to prim_op_type
      
      * rename comp to prim
      
      * remove useless code
      
      * skip ci check for only prim
      
      * add no_grad_vars and grad_outputs in prim test
      
      * fix var_dict
      
      * fix op test for only_prim
      
      * fix dy2static bugs
      
      * polish some code
      
      * temp
      
      * modify op test
      
      * except cinn test
      
      * modify bfp16
      
      * modify pad grad
      
      * add pad_grad dtype
      
      * start cinn part
      
      ---------
      Co-authored-by: Charles-hit <wanghao107@baidu.com>
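      The pad_grad bullets above fit the usual composite formulation of slice grad: pad the sliced gradient back to the input shape with zeros. A hedged NumPy sketch (unit steps only; negative indices and decreasing slices ignored):

      ```python
      import numpy as np

      def slice_grad(dout, x_shape, starts, ends, axes):
          pad_width = [(0, 0)] * len(x_shape)
          for axis, start, end in zip(axes, starts, ends):
              pad_width[axis] = (start, x_shape[axis] - end)   # zeros before and after
          return np.pad(dout, pad_width)

      x = np.random.randn(4, 6)
      dout = np.ones((4, 3))                                   # gradient of x[:, 1:4]
      dx = slice_grad(dout, x.shape, starts=[1], ends=[4], axes=[1])   # back to (4, 6)
      ```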
  17. 23 February 2023, 2 commits
    • Support 'complex promote' in yaml (#50611) · 91a3d159
      Committed by HappyHeavyRain
      * support 'complex promote' in yaml
      
      * change the complex_promote
      
      * change 'kron' in math.py
      
      * change 'kron' comment in python
      
      * change kron comment in python
      
      * change kron comment in python
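      "Complex promote" here refers to promoting a mixed real/complex pair of inputs to one common complex dtype before the kernel runs, so an op like kron does not need separate mixed-dtype kernels. A hedged NumPy sketch of the idea (the helper is made up; the yaml code-generation side is not shown):

      ```python
      import numpy as np

      def promote_complex(x, y):
          common = np.promote_types(x.dtype, y.dtype)   # float32 + complex64 -> complex64
          return x.astype(common), y.astype(common)

      a = np.ones((2, 2), dtype=np.float32)
      b = np.full((2, 2), 1j, dtype=np.complex64)
      a_p, b_p = promote_complex(a, b)
      out = np.kron(a_p, b_p)                           # the kernel sees complex64 only
      ```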
    • 【Prim】Enhance gather vjp (#50786) · dca3a099
      Committed by Jiabin Yang
      * tmp gather vjp
      
      * support gather
      
      * remove useless code
      
      * fix compiling error
      
      * fix ut
      
      * add eager test
      
      * add eager test
      
      * add seed
      
      * fix cpu error
      
      * fix transpose op compat
      
      * remove tensor index case
      
      * fix prim_cinn
      
      * fix ut
      
      * add gather composite
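      The gather VJP scatter-adds the upstream gradient back to the gathered positions, with repeated indices accumulating; a hedged NumPy sketch (axis handling simplified, not Paddle's composite rule verbatim):

      ```python
      import numpy as np

      def gather_vjp(dout, index, x_shape, axis=0):
          dx = np.zeros(x_shape, dtype=dout.dtype)
          # np.add.at accumulates when the same index appears more than once.
          np.add.at(dx, (slice(None),) * axis + (index,), dout)
          return dx

      x = np.arange(12.0).reshape(4, 3)
      index = np.array([0, 2, 0])                    # row 0 is gathered twice
      dx = gather_vjp(np.ones((3, 3)), index, x.shape, axis=0)
      assert dx[0].sum() == 6.0                      # row 0 accumulated two gradients
      ```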
  18. 21 February 2023, 1 commit
    • Support bw invoke fw (#50260) · d8845735
      Committed by HappyHeavyRain
      * support bw invoke fw
      
      * fix scale in static_backward.yaml
      
      * fix the bug in tensorrt/convert
      
      * move 'scale','sign' into ops.yaml
      
      * add scale_grad of scale in op_compat.yaml
      
      * change generated_static_op in CMakeLists.txt
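      "bw invoke fw" means a backward op that simply re-invokes an existing forward op instead of getting its own kernel; scale is the classic case, since its gradient only rescales the upstream gradient. A hedged Python sketch of the idea (the code-generation plumbing in the yaml files is not shown):

      ```python
      import numpy as np

      def scale(x, scale_val=1.0, bias=0.0):
          return x * scale_val + bias

      def scale_grad(out_grad, scale_val):
          # No dedicated backward kernel: invoke the forward op with bias = 0.
          return scale(out_grad, scale_val, 0.0)

      print(scale_grad(np.ones(3), 2.5))   # [2.5 2.5 2.5]
      ```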
  19. 13 February 2023, 1 commit
  20. 03 February 2023, 1 commit
  21. 20 January 2023, 1 commit
  22. 18 January 2023, 1 commit
  23. 17 January 2023, 1 commit
    • 【Prim】Add multiply,expand,div vjp rules (#49831) · 39c6765a
      Committed by Xiaoxu Chen
      * support elementwise base func
      
      * fix compiling error and add test
      
      * support vjp for div using comp
      
      * remove additional change
      
      * fix dy2st error with magic num
      
      * fix dy magic num
      
      * another magic
      
      * another magic
      
      * another magic
      
      * add skip rename strategy
      
      * support add vjp
      
      * support add with new axis cal
      
      * support sub vjp
      
      * [prim] add multiply vjp rules
      
      * [prim] add multiply vjp rules
      
      * [prim] fix no infershape with composite in _append_backward_ops
      
      * [prim] add expand vjp rule
      
      * [prim] add exp vjp rule
      
      * uncomment infer shape for reshape/sum static prim api
      
      * [prim] fix tanh nullptr error
      
      * remove some print message
      
      * fix magic number in run_program relative tests @JiaBinYang
      
      * [prim] add expand,multiply,exp vjp rules
      
      * fix only support single direction reduce error
      
      * infer reduce dims using out dims
      Co-authored-by: JiabinYang <360788950@qq.com>
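      A hedged NumPy sketch of the pattern these VJP rules share: compute the local gradient with primitive ops, then reduce it back to each input's shape, echoing the "infer reduce dims using out dims" bullet. The reduce helper and names are assumptions, not Paddle's implementation.

      ```python
      import numpy as np

      def reduce_as(grad, shape):
          # Sum away axes that broadcasting introduced or expanded.
          while grad.ndim > len(shape):
              grad = grad.sum(axis=0)
          for axis, size in enumerate(shape):
              if size == 1 and grad.shape[axis] != 1:
                  grad = grad.sum(axis=axis, keepdims=True)
          return grad

      def multiply_vjp(x, y, dout):
          return reduce_as(dout * y, x.shape), reduce_as(dout * x, y.shape)

      def div_vjp(x, y, dout):
          return reduce_as(dout / y, x.shape), reduce_as(-dout * x / (y * y), y.shape)

      def expand_vjp(dout, x_shape):
          return reduce_as(dout, x_shape)

      x = np.random.randn(4, 3)
      y = np.random.randn(3) + 2.0                   # broadcast divisor
      dx, dy = div_vjp(x, y, np.ones_like(x))        # dx: (4, 3), dy: (3,)
      ```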