1. 12 April 2023 (2 commits)
• Modify LayerNorm Composite Rule (#52712) · a2060568
  Huihuang Zheng committed
      * [Do NOT merge] Expr PR on Composite
      
      * Expr PR on Composite
      
* Revert some composite experiments
      
      * Remove unnecessary composite code
      
      * Add rsqrt as sub primitives
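For context, a composite rule like the one this PR modifies rewrites the fused layer_norm kernel as a sequence of primitive ops, with rsqrt now allowed as a sub-primitive. A minimal numpy sketch of the decomposition (illustrative only, not Paddle's actual rule):

```python
import numpy as np

def layer_norm_composite(x, scale, bias, eps=1e-5):
    # Normalize over the last axis using only primitive ops.
    mean = x.mean(axis=-1, keepdims=True)
    var = ((x - mean) ** 2).mean(axis=-1, keepdims=True)
    rsqrt = 1.0 / np.sqrt(var + eps)  # rsqrt as a sub-primitive
    return (x - mean) * rsqrt * scale + bias
```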
• [Prim] Add instance_norm composite rule (#52203) · b0f17d05
  chenjian committed
* fix (repeated 15×)
      
      * isamp
      
      * gpu
      
      * cpu
      
      * noamp
      
      * fix instance_norm
      
      * fix
      
      * fix unit test
      
      * fix unit test
      
      * add unit test
      
      * fix
      
      * add big data tests
      
* fix (repeated 7×)
      
      * add test case
      
* fix (repeated 5×)
      
      * remove amp test
      
      ---------
Co-authored-by: heyanru01 <429520051@qq.com>
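A rough numpy sketch of what an instance_norm decomposition computes: per-sample, per-channel statistics over the spatial dims (NCHW layout and the function name are assumptions for illustration):

```python
import numpy as np

def instance_norm_composite(x, scale, bias, eps=1e-5):
    # x: (N, C, H, W); statistics are taken per sample and per channel.
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    y = (x - mean) / np.sqrt(var + eps)
    # scale/bias are per-channel, broadcast over N, H, W.
    return y * scale.reshape(1, -1, 1, 1) + bias.reshape(1, -1, 1, 1)
```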
2. 04 April 2023 (1 commit)
3. 03 April 2023 (1 commit)
4. 29 March 2023 (1 commit)
• Add group_norm composite rule (#51874) · cabf3921
  Yichen Zhang committed
      * add group_norm composite rule
      
      * add test for scale_grad and bias_grad
      
      * resolve conflicts
      
      * remove amp in composite_rule.py
      
      * add float16 test
      
      * deal with NHWC format
      
* keep the composite rule in float16 identical to the original kernel
      
      * resolve conflicts
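As an illustration of the rule added here (not the literal Paddle code; NCHW is assumed, while the PR also handles NHWC): group_norm reshapes channels into groups and normalizes within each group:

```python
import numpy as np

def group_norm_composite(x, scale, bias, groups, eps=1e-5):
    # x: (N, C, H, W); normalize within each group of C // groups channels.
    n, c, h, w = x.shape
    g = x.reshape(n, groups, c // groups, h, w)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    y = ((g - mean) / np.sqrt(var + eps)).reshape(n, c, h, w)
    return y * scale.reshape(1, c, 1, 1) + bias.reshape(1, c, 1, 1)
```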
5. 28 March 2023 (1 commit)
6. 27 March 2023 (1 commit)
7. 23 March 2023 (2 commits)
8. 21 March 2023 (1 commit)
9. 20 March 2023 (3 commits)
• 【prim】New layer_norm grad (#51750) · 802a81d0
  xiaoguoguo626807 committed
      * Add flatten composite rule
      
      * get the right xshape and pass func test
      
      * add cinn unit test
      
      * Remove cinn test, wait for it to be added after repair
      
      * add comp test to test_flatten_contiguous_range_op.py
      
      * remove func test on composite_ops
      
      * Add comments to maybe_wrap_dim func
      
      * remove commented code
      
      * fix the problem with 0D tensor case
      
      * add flatten split rule comment
      
      * fix syntax issues
      
      * block flatten on resnet_prim_cinn
      
      * init change
      
      * tmp commit
      
      * add layer_norm InferMeta check
      
      * cast type modify
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
Co-authored-by: jiangcheng <thisjiang@qq.com>
Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * recover
      
      * big tol
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      * add unittest
      
      * fix typo
      
      * fix typo
      
      * fix map.at
      
      * fix find
      
      * fix test
      
      * fix cinn cache key structure realize
      
      * using ordered map for attributes
      
      * add test by review advice
      
      ---------
Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
Co-authored-by: jiangcheng <thisjiang@qq.com>
Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * Cxx prim custom vjp (#8)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      ---------
Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * Pr 50885 (#7)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes (#50557)
      
      * [CINN]Enhance CacheKey hash logic by considering input dtypes
      
      ---------
Co-authored-by: jiangcheng <thisjiang@qq.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix code in a dy2static-friendly way.
      
      * [dystatic] add hooker for prim
      
      ---------
Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
Co-authored-by: jiangcheng <thisjiang@qq.com>
Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [prim] enable dygraph_to_static to support custom_vjp
      
      * fix cast prim and vjp dtype mapping error bug
      
      * [dy2static-ci] fix dy2static ci errors.
      
      ---------
Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
Co-authored-by: jiangcheng <thisjiang@qq.com>
Co-authored-by: cxxly <chenxx_id@163.com>
      
      * [Prim] enable whitelist and blacklist for custom_vjp
      
      * debug log
      
      * clear log
      
      * fix
      
      * nothing
      
      * less memory
      
      * recover utils
      
      * fix
      
      * modify threshold value
      
      * skip layer_norm for test_bert
      
      * back to bert success state
      
* add epsilon
      
      * delete unnecessary compute
      
      * modify amp dtype
      
      * modify * order
      
      * delete sqrt check and fp16
      
      ---------
Co-authored-by: xuyongsheng <xuyongsheng@baidu.com>
Co-authored-by: xysheng-baidu <121540080+xysheng-baidu@users.noreply.github.com>
Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
Co-authored-by: jiangcheng <thisjiang@qq.com>
Co-authored-by: cxxly <chenxx_id@163.com>
Co-authored-by: xiongkun <807377414@qq.com>
• add composite rules for squeeze op (#51539) · 89ff0d59
  warrentdrew committed
      * add composite rule for squeeze
      
      * fix pre commit
      
      * fix pre commit
      
      * simplify rules
      
      * arrange code
      
      * fix int axis
      
      * simplify squeeze axis rules
      
      * bugfix
      
      * fix pre commit
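Conceptually, the squeeze composite rule reduces to a reshape that drops size-1 dims; a hedged numpy sketch (function name and axis handling are illustrative):

```python
import numpy as np

def squeeze_composite(x, axes=None):
    # Squeeze is a reshape: drop size-1 dims (all of them, or just `axes`).
    if axes is None:
        new_shape = [d for d in x.shape if d != 1]
    else:
        axes = {a % x.ndim for a in axes}  # normalize negative axes
        new_shape = [d for i, d in enumerate(x.shape)
                     if not (i in axes and d == 1)]
    return x.reshape(new_shape)
```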
• support relu custom vjp (#51742) · 604b7a53
  Jiabin Yang committed
10. 17 March 2023 (2 commits)
• [Prim] support batch_norm vjp (#51283) · ff40a7e5
  cyber-pioneer committed
      * add bn vjp
      
      * fix example
      
      * fix code
      
      * fix code
      
      * fix cinn case
      
      * fix code
      
      * fix example
      
      * fix code
      
      * fix example
      
      * fix example
• Add sqrt composite rule (#51080) · aba9c4d4
  mhy-666 committed
      * add sqrt composite rule/test
      
      * add sqrt composite rule/test
      
      * fix ops/sqrt, add cinn test
      
      * fix sqrt_comp
      
      * fix sqrt_comp
      
      * fix sqrt_comp
      
      * fix
      
      * fix codestyle
      
      * fix codestyle
      
      * add fp16 test
      
      * add ops/sqrt
      
      * fix
      
      * fix
      
* fix unit test
      
      * fix
      
      * fix
      
      * fix
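One plausible primitive form of sqrt, shown as a numpy sketch (the PR's exact lowering may differ), is pow(x, 0.5):

```python
import numpy as np

def sqrt_composite(x):
    # sqrt expressed through the pow primitive: x ** 0.5
    return np.power(x, 0.5)
```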
11. 16 March 2023 (2 commits)
12. 15 March 2023 (3 commits)
• feat: add rsqrt composite rule (#51432) · c9ca7c35
  Kang Zhao committed
      * feat: add relu composite rule
      
      * feat: add relu composite rule, maximum op
      
      * feat: add relu composite rule, maximum op
      
      * feat: add relu composite rule, polish comments
      
      * feat: add relu composite rule, polish comments
      
      * feat: add relu composite rule, add python api of relu
      
      * feat: add relu composite rule, commit hook
      
      * fix: maximum type error & ban cinn test
      
      * fix: maximum input sequence bugs
      
      * resolve conflicts
      
      * fix: code style bugs
      
      * add: relu fp16 test
      
      * feat: add rsqrt composite rule
      
      * feat: add rsqrt composite rule
      
      * resolve conflicts of composite rule
      
      * fix: delete check eager
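Likewise, rsqrt has a natural primitive form as pow(x, -0.5), i.e. 1 / sqrt(x); a hedged numpy sketch:

```python
import numpy as np

def rsqrt_composite(x):
    # rsqrt(x) = 1 / sqrt(x), written as a single pow primitive.
    return np.power(x, -0.5)
```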
• 【Prim】Support amp logic for layer_norm and softmax (#51473) · 64076727
  Jiabin Yang committed
      * support amp logic for layer_norm and softmax
      
      * fix layer_norm amp
      
      * fix layernorm api and dropout fp16
      
      * fix layernorm api and dropout fp16
      
      * fix bn, ln dtype in float16
      
      * fix dropout fp16
      
      * fix comment
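The amp logic referred to here typically follows one pattern: run the float16 decomposition in float32 and cast back, since half-precision mean/variance loses too much accuracy. A hedged sketch of that pattern (illustrative, not the PR's code):

```python
import numpy as np

def layer_norm_composite_amp(x, scale, bias, eps=1e-5):
    # Compute the decomposition in float32 even for float16 inputs,
    # then cast the result back to the input dtype.
    dtype = x.dtype
    x32 = x.astype(np.float32)
    mean = x32.mean(axis=-1, keepdims=True)
    var = x32.var(axis=-1, keepdims=True)
    y = (x32 - mean) / np.sqrt(var + eps)
    y = y * scale.astype(np.float32) + bias.astype(np.float32)
    return y.astype(dtype)
```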
• [Prim] add pow composite rule (#51070) · 2d9e103e
  chenjian committed
      * add pow composite rule
      
      * fix test
      
      * fix unit test
      
      * update test
      
      * fix test
      
      * update
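For positive inputs, pow itself admits a classical decomposition into exp and log; a hedged numpy sketch (the rule actually landed may differ):

```python
import numpy as np

def pow_composite(x, y):
    # pow(x, y) = exp(y * log(x)), valid for x > 0.
    return np.exp(y * np.log(x))
```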
13. 14 March 2023 (2 commits)
14. 13 March 2023 (2 commits)
15. 08 March 2023 (1 commit)
• feat: add relu composite (#50819) · 079f41c8
  Kang Zhao committed
      * feat: add relu composite rule
      
      * feat: add relu composite rule, maximum op
      
      * feat: add relu composite rule, maximum op
      
      * feat: add relu composite rule, polish comments
      
      * feat: add relu composite rule, polish comments
      
      * feat: add relu composite rule, add python api of relu
      
      * feat: add relu composite rule, commit hook
      
      * fix: maximum type error & ban cinn test
      
      * fix: maximum input sequence bugs
      
      * resolve conflicts
      
      * fix: code style bugs
      
      * add: relu fp16 test
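The commit trail above names the key ingredient: relu lowers to the maximum primitive against a zero tensor. A minimal numpy sketch:

```python
import numpy as np

def relu_composite(x):
    # relu(x) = maximum(x, 0), using a zeros tensor of x's shape.
    return np.maximum(x, np.zeros_like(x))
```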
16. 07 March 2023 (1 commit)
17. 03 March 2023 (1 commit)
• add sigmoid composite rule (#50827) · d3352b99
  zxcd committed
      * add sigmoid composite rule
      
      * add python api
      
      * fix code style.
      
      * add check_prim=True
      
      * add sigmoid fp16 unit test.
      
      * fix code style.
      
      * rm bf16 check_prim
      
      * fix code style.
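The sigmoid rule amounts to the usual identity built from primitive ops; a numpy sketch (illustrative):

```python
import numpy as np

def sigmoid_composite(x):
    # sigmoid(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))
```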
18. 02 March 2023 (1 commit)
• Comp hardswish (#51003) · 51331098
  Roc committed
      * add composite op hard swish
      
      * add test grad
      
      * update apis calling
      
      * update date range
      
      * add ut
      
* turn off cinn for 0-d shape
      
      * skip cinn
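Hardswish has a standard closed form, x * relu6(x + 3) / 6; a hedged numpy sketch of the decomposition:

```python
import numpy as np

def hardswish_composite(x):
    # hardswish(x) = x * relu6(x + 3) / 6, relu6(t) = min(max(t, 0), 6)
    return x * np.clip(x + 3.0, 0.0, 6.0) / 6.0
```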
19. 01 March 2023 (1 commit)
• Add full_like composite rule (#50794) · 7468bab4
  Yichen Zhang committed
      * implement composite full_like and simple unit test
      
      * implement op tests for composite full_like op
      
* some modifications as reviewers suggested
      add cinn op test to CMakeLists.txt
      fix code style
      
      * fix code style
      
      * modify input args of prim fill_any_like op
      
      * resolve conflicts
      
      * resolve conflicts
      
      * modify python api and unit tests as suggested
      
      * resolve conflicts
      
      * resolve conflicts
      
      * use framework.dtype to convert dtype in Op test
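full_like reduces to a plain fill using the input's shape (and dtype by default); a hedged numpy sketch:

```python
import numpy as np

def full_like_composite(x, fill_value, dtype=None):
    # Fill a tensor of x's shape; fall back to x's dtype when none given.
    return np.full(x.shape, fill_value, dtype=dtype or x.dtype)
```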
20. 28 February 2023 (3 commits)
• Fix some typos (#50914) · 5d8fe822
  iLeGend committed
• add silu composite rule (#50838) · 5d70ba6d
  zxcd committed
      * add silu composite rule
      
      * fix code style.
      
      * add silu fp16 unit test.
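silu is simply x times sigmoid(x); a minimal numpy sketch of the rule:

```python
import numpy as np

def silu_composite(x):
    # silu(x) = x * sigmoid(x)
    return x * (1.0 / (1.0 + np.exp(-x)))
```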
• Add flatten composite rule (#50672) · 8220771b
  xysheng-baidu committed
      * Add flatten composite rule
      
      * get the right xshape and pass func test
      
      * add cinn unit test
      
      * Remove cinn test, wait for it to be added after repair
      
      * add comp test to test_flatten_contiguous_range_op.py
      
      * remove func test on composite_ops
      
      * Add comments to maybe_wrap_dim func
      
      * remove commented code
      
      * fix the problem with 0D tensor case
      
      * add flatten split rule comment
      
      * fix syntax issues
      
      * block flatten on resnet_prim_cinn
      
      * remove maybe_wrap_dim func
      
* Use None instead of xshape
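As the commit list notes, flatten lowers to a single reshape that merges the dims in [start_axis, stop_axis]; a hedged numpy sketch that ignores the 0-D corner case the PR had to fix:

```python
import numpy as np

def flatten_composite(x, start_axis=0, stop_axis=-1):
    # Merge dims [start_axis, stop_axis] into one via reshape.
    stop_axis = stop_axis % x.ndim  # normalize negative axis
    shape = (list(x.shape[:start_axis]) + [-1] +
             list(x.shape[stop_axis + 1:]))
    return x.reshape(shape)
```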
21. 24 February 2023 (1 commit)
22. 22 February 2023 (1 commit)
23. 21 February 2023 (1 commit)
• 【prim】Layer norm (#50422) · 2f4763ee
  xiaoguoguo626807 committed
      * fix composite mean op map
      
      * fix composite check output
      
      * init layer_norm
      
      * init layer_norm
      
      * map output from composite rule to origin op
      
      * add dropout op map
      
      * add input map check
      
      * polish log
      
      * modify rules
      
      * success test_forward
      
      * modify test without cinn
      
      * modify cinn test
      
      * modify cinn test
      
      * except fp64
      
      * except fp64
      
      * delete flatten
      
      * delete unused change
      
      * review
      
      * pass cpu test
      
      * code style
      
      * delete flatten fp16 error
      
      * modify flatten test
      
      ---------
Co-authored-by: cyber-pioneer <chenzhuo@tju.edu.cn>
24. 20 February 2023 (1 commit)
25. 16 February 2023 (1 commit)
• Add mean composite rule (#50298) · f7f67b72
  zqw_1997 committed
      * beta
      
      * small commit
      
      * add batch_norm composite rule
      
      move composite test case
      
remove unused var
      
      add composite op blacklist
      
      * small change v2
      
      * finish the test_composite_mean and test_composite_mean_grad
      
      * add ops assertion to the tests
      
      * add cinn test
      
      * fix the error and inappropriate usage in func: mean_composite
      
      * remove the ref of outer lib in primtives.py
      
      * modify sample code of reduce_sum
      
      * fix composite mean op map
      
      * modify testcases to test more float type
      
      * remove cpu float16 test
      
      * cinn test fix
      
      * remove reduce_max
      
      * change the name sum to sum_x
      
      * change the use of reduce_sum to sum
      
      ---------
Co-authored-by: cyber-pioneer <chenzhuo@tju.edu.cn>
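As the commit trail suggests ("change the use of reduce_sum to sum"), mean is built on sum: divide the summed values by the element count of the reduced axes. A hedged numpy sketch:

```python
import numpy as np

def mean_composite(x, axis=None, keepdim=False):
    # mean = sum(x) / count over the reduced axes.
    sum_x = x.sum(axis=axis, keepdims=keepdim)
    count = x.size // max(sum_x.size, 1)  # elements reduced per output
    return sum_x / count
```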
26. 15 February 2023 (1 commit)
• fix composite op map (#50397) · ff86aeab
  cyber-pioneer committed
      * map output from composite rule to origin op
      
      add mean layer_norm dropout op map
      
      add input map check
      
      composite softmax support input shape []
      
      * composite softmax support shape []
      
      * polish log
      
      * solve conflict
      
      * polish code
      
      * polish op map output
      
      * add check dtype
27. 14 February 2023 (2 commits)
• b85af464
• Add gelu composite rule (#50295) · c364f41d
  GGBond8488 committed
      * add gelu composite rule
      
      * use full replace fill_constant
      
      * change the form of calculation
      
      * remove float16 test for composite gelu
      
      * reformate code
      
      * remove float16 test case
      
      * add forwad with prim and backward without prim test
      
      * add float16 test for composite gelu and add high dims test
      
      * add float16 test case and high dims test
      
      * shield float16 and cpu test case
      
      * increase train step to 10 in test cinn prim gelu
      
* replace pow with multiply
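The bullets "change the form of calculation" and "replace pow with multiply" point at the tanh approximation of gelu with x**3 written as x*x*x; a hedged numpy sketch (the landed rule may instead use the erf form):

```python
import numpy as np

def gelu_tanh_composite(x):
    # tanh approximation of gelu; the cube is written as x*x*x rather
    # than pow(x, 3), mirroring "replace pow with multiply".
    c = np.sqrt(2.0 / np.pi)
    return 0.5 * x * (1.0 + np.tanh(c * (x + 0.044715 * x * x * x)))
```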