- 21 Mar 2023, 1 commit
-
-
Committed by cyber-pioneer
* simplify batch_norm composite rule
* polish code
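The batch_norm composite rule referenced above decomposes the fused op into primitive arithmetic. A minimal, framework-free Python sketch of the inference-style decomposition (function and argument names here are illustrative, not Paddle's actual API):

```python
import math

def batch_norm_composite(xs, scale, bias, eps=1e-5):
    # Batch norm decomposed into primitive ops:
    # y = (x - mean) / sqrt(var + eps) * scale + bias
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    inv_std = 1.0 / math.sqrt(var + eps)
    return [(x - mean) * inv_std * scale + bias for x in xs]
```

With scale 1 and bias 0 the output is simply the normalized input, centered at zero.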
-
- 20 Mar 2023, 3 commits
-
-
Committed by xiaoguoguo626807
* Add flatten composite rule
* get the right xshape and pass func test
* add cinn unit test
* Remove cinn test, wait for it to be added after repair
* add comp test to test_flatten_contiguous_range_op.py
* remove func test on composite_ops
* Add comments to maybe_wrap_dim func
* remove commented code
* fix the problem with 0D tensor case
* add flatten split rule comment
* fix syntax issues
* block flatten on resnet_prim_cinn
* init change
* tmp commit
* add layer_norm InferMeta check
* cast type modify
* [CINN] Enhance CacheKey hash logic by considering input dtypes (#50557): add unittest, fix typos, fix map.at, fix find, fix test, fix cinn cache key structure realize, use ordered map for attributes, add test by review advice
* [prim] enable dygraph_to_static to support custom_vjp
* Pr 50885 (#7): fix code in a dy2static-friendly way
* [dystatic] add hooker for prim
* fix cast prim and vjp dtype mapping error bug
* recover
* big tol
* Cxx prim custom vjp (#8)
* [dy2static-ci] fix dy2static ci errors
* [Prim] enable whitelist and blacklist for custom_vjp
* debug log, then clear log
* less memory
* recover utils
* modify threshold value
* skip layer_norm for test_bert; back to bert success state
* add epsilon
* delete unnecessary compute
* modify amp dtype
* delete sqrt check and fp16

Co-authored-by: xuyongsheng <xuyongsheng@baidu.com>
Co-authored-by: xysheng-baidu <121540080+xysheng-baidu@users.noreply.github.com>
Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
Co-authored-by: jiangcheng <thisjiang@qq.com>
Co-authored-by: cxxly <chenxx_id@163.com>
Co-authored-by: xiongkun <807377414@qq.com>
-
Committed by warrentdrew
* add composite rule for squeeze
* fix pre commit
* fix pre commit
* simplify rules
* arrange code
* fix int axis
* simplify squeeze axis rules
* bugfix
* fix pre commit
-
Committed by Jiabin Yang
-
- 17 Mar 2023, 2 commits
-
-
Committed by cyber-pioneer
* add bn vjp
* fix example
* fix code
* fix code
* fix cinn case
* fix code
* fix example
* fix code
* fix example
* fix example
-
Committed by mhy-666
* add sqrt composite rule/test
* add sqrt composite rule/test
* fix ops/sqrt, add cinn test
* fix sqrt_comp
* fix sqrt_comp
* fix sqrt_comp
* fix
* fix codestyle
* fix codestyle
* add fp16 test
* add ops/sqrt
* fix
* fix
* fix unitest
* fix
* fix
* fix
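A sqrt composite rule rewrites the op in terms of other primitives. One mathematically valid decomposition (an illustrative sketch only, not necessarily the rule this commit implements) uses the identity sqrt(x) = x * rsqrt(x):

```python
def sqrt_composite(x):
    # sqrt expressed via the rsqrt primitive:
    # x * x^(-1/2) = x^(1/2), valid for x > 0
    return x * x ** -0.5
```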
-
- 16 Mar 2023, 2 commits
-
-
Committed by Roc
-
Committed by Jiabin Yang
* support amp logic for layer_norm and softmax
* fix layer_norm amp
* fix layernorm api and dropout fp16
* fix layernorm api and dropout fp16
* fix bn, ln dtype in float16
* fix dropout fp16
* fix comment
* fix cinn dropout amp error
-
- 15 Mar 2023, 3 commits
-
-
Committed by Kang Zhao
* feat: add relu composite rule
* feat: add relu composite rule, maximum op
* feat: add relu composite rule, maximum op
* feat: add relu composite rule, polish comments
* feat: add relu composite rule, polish comments
* feat: add relu composite rule, add python api of relu
* feat: add relu composite rule, commit hook
* fix: maximum type error & ban cinn test
* fix: maximum input sequence bugs
* resolve conflicts
* fix: code style bugs
* add: relu fp16 test
* feat: add rsqrt composite rule
* feat: add rsqrt composite rule
* resolve conflicts of composite rule
* fix: delete check eager
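The relu rule above builds on the maximum primitive, and rsqrt reduces to a power. A scalar Python sketch of both decompositions (illustrative names, not Paddle's actual code):

```python
def relu_composite(x):
    # relu expressed with the maximum primitive: relu(x) = maximum(x, 0)
    return max(x, 0.0)

def rsqrt_composite(x):
    # rsqrt expressed with the pow primitive: rsqrt(x) = x^(-1/2)
    return x ** -0.5
```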
-
Committed by Jiabin Yang
* support amp logic for layer_norm and softmax
* fix layer_norm amp
* fix layernorm api and dropout fp16
* fix layernorm api and dropout fp16
* fix bn, ln dtype in float16
* fix dropout fp16
* fix comment
-
Committed by chenjian
* add pow composite rule
* fix test
* fix unit test
* update test
* fix test
* update
-
- 14 Mar 2023, 2 commits
-
-
Committed by qizhaoaoe
-
Committed by cyber-pioneer
-
- 13 Mar 2023, 2 commits
-
-
Committed by mengziheng
* first test
* add unsqueeze_op
-
Committed by xysheng-baidu
* Add expand composite rule
* reshape x when dim_in less than dim_out
* add tile op for expand
* remove tensor shape case when comp prim
* enable cinn case
* dim_out can't be 0
* update test case for prim type
-
- 08 Mar 2023, 1 commit
-
-
Committed by Kang Zhao
* feat: add relu composite rule
* feat: add relu composite rule, maximum op
* feat: add relu composite rule, maximum op
* feat: add relu composite rule, polish comments
* feat: add relu composite rule, polish comments
* feat: add relu composite rule, add python api of relu
* feat: add relu composite rule, commit hook
* fix: maximum type error & ban cinn test
* fix: maximum input sequence bugs
* resolve conflicts
* fix: code style bugs
* add: relu fp16 test
-
- 07 Mar 2023, 1 commit
-
-
Committed by ccrrong
* add stack composite rule
* add float16 datatype test
-
- 03 Mar 2023, 1 commit
-
-
Committed by zxcd
* add sigmoid composite rule
* add python api
* fix code style.
* add check_prim=True
* add sigmoid fp16 unit test.
* fix code style.
* rm bf16 check_prim
* fix code style.
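The sigmoid composite rule reduces the op to exp, add, and divide primitives. A scalar Python sketch (illustrative only):

```python
import math

def sigmoid_composite(x):
    # sigmoid decomposed into primitives: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + math.exp(-x))
```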
-
- 02 Mar 2023, 1 commit
-
-
Committed by Roc
* add composite op hard swish
* add test grad
* update apis calling
* update date range
* add ut
* turn off cinn for 0-d shape
* skip cinn
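The hard swish composite op follows the standard definition hard_swish(x) = x * relu6(x + 3) / 6. A scalar Python sketch of that decomposition (illustrative, not Paddle's actual rule code):

```python
def hard_swish_composite(x):
    # hard_swish(x) = x * relu6(x + 3) / 6,
    # where relu6(t) = min(max(t, 0), 6)
    relu6 = min(max(x + 3.0, 0.0), 6.0)
    return x * relu6 / 6.0
```

The function is exactly 0 for x <= -3 and exactly x for x >= 3.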
-
- 01 Mar 2023, 1 commit
-
-
Committed by Yichen Zhang
* implement composite full_like and simple unit test
* implement op tests for composite full_like op
* some modification as reviewers suggested
* add cinn op test to CMakeLists.txt
* fix code style
* fix code style
* modify input args of prim fill_any_like op
* resolve conflicts
* resolve conflicts
* modify python api and unit tests as suggested
* resolve conflicts
* resolve conflicts
* use framework.dtype to convert dtype in Op test
-
- 28 Feb 2023, 3 commits
-
-
Committed by iLeGend
-
Committed by zxcd
* add silu composite rule
* fix code style.
* add silu fp16 unit test.
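The silu composite rule uses the identity silu(x) = x * sigmoid(x), which reduces to the same exp/add/divide primitives as sigmoid. An illustrative scalar sketch:

```python
import math

def silu_composite(x):
    # silu(x) = x * sigmoid(x) = x / (1 + exp(-x))
    return x / (1.0 + math.exp(-x))
```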
-
Committed by xysheng-baidu
* Add flatten composite rule
* get the right xshape and pass func test
* add cinn unit test
* Remove cinn test, wait for it to be added after repair
* add comp test to test_flatten_contiguous_range_op.py
* remove func test on composite_ops
* Add comments to maybe_wrap_dim func
* remove commented code
* fix the problem with 0D tensor case
* add flatten split rule comment
* fix syntax issues
* block flatten on resnet_prim_cinn
* remove maybe_wrap_dim func
* Use none instead of xshape
-
- 24 Feb 2023, 1 commit
-
-
Committed by Jiabin Yang
* change amp with to_prim
* fix prim amp
* fix rules
* fix linear
* add amp test
* add test
* disable this test on cpu
* disable this test on cpu

Co-authored-by: cyber-pioneer <chenzhuo@tju.edu.cn>
-
- 22 Feb 2023, 1 commit
-
-
Committed by Xiaoxu Chen
* map output from composite rule to origin op
* add mean, layer_norm, dropout op map
* add input map check
* composite softmax support input shape []
* polish log
* [prim] add dropout composite rule

Co-authored-by: cyber-pioneer <chenzhuo@tju.edu.cn>
-
- 21 Feb 2023, 1 commit
-
-
Committed by xiaoguoguo626807
* fix composite mean op map
* fix composite check output
* init layer_norm
* init layer_norm
* map output from composite rule to origin op
* add dropout op map
* add input map check
* polish log
* modify rules
* success test_forward
* modify test without cinn
* modify cinn test
* modify cinn test
* except fp64
* except fp64
* delete flatten
* delete unused change
* review
* pass cpu test
* code style
* delete flatten fp16 error
* modify flatten test

Co-authored-by: cyber-pioneer <chenzhuo@tju.edu.cn>
-
- 20 Feb 2023, 1 commit
-
-
Committed by cyber-pioneer
* check win
* fix random error
* fix manage
-
- 16 Feb 2023, 1 commit
-
-
Committed by zqw_1997
* beta
* small commit
* add batch_norm composite rule
* move composite test case
* remove unused var
* add composite op blacklist
* small change v2
* finish the test_composite_mean and test_composite_mean_grad
* add ops assertion to the tests
* add cinn test
* fix the error and inappropriate usage in func: mean_composite
* remove the ref of outer lib in primtives.py
* modify sample code of reduce_sum
* fix composite mean op map
* modify testcases to test more float types
* remove cpu float16 test
* cinn test fix
* remove reduce_max
* change the name sum to sum_x
* change the use of reduce_sum to sum

Co-authored-by: cyber-pioneer <chenzhuo@tju.edu.cn>
-
- 15 Feb 2023, 1 commit
-
-
Committed by cyber-pioneer
* map output from composite rule to origin op
* add mean, layer_norm, dropout op map
* add input map check
* composite softmax support input shape []
* composite softmax support shape []
* polish log
* solve conflict
* polish code
* polish op map output
* add check dtype
-
- 14 Feb 2023, 2 commits
-
-
Committed by mhy-666
-
Committed by GGBond8488
* add gelu composite rule
* use full replace fill_constant
* change the form of calculation
* remove float16 test for composite gelu
* reformat code
* remove float16 test case
* add forward with prim and backward without prim test
* add float16 test for composite gelu and add high dims test
* add float16 test case and high dims test
* shield float16 and cpu test case
* increase train step to 10 in test cinn prim gelu
* replace pow to multiply
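The gelu composite rule typically uses the tanh approximation of GELU; the "replace pow to multiply" commit suggests x^3 was rewritten as repeated multiplication. A scalar Python sketch of that approximation (illustrative only, not Paddle's actual rule code):

```python
import math

def gelu_tanh_composite(x):
    # tanh approximation of GELU:
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    # x^3 written as x * x * x (multiply instead of pow)
    k = math.sqrt(2.0 / math.pi)
    inner = k * (x + 0.044715 * x * x * x)
    return 0.5 * x * (1.0 + math.tanh(inner))
```

For large positive x the result approaches x, and for large negative x it approaches 0.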
-
- 09 Feb 2023, 1 commit
-
-
Committed by Jiabin Yang
-
- 07 Feb 2023, 1 commit
-
-
Committed by cyber-pioneer
* move composite test case
* remove unused var
* add composite op blacklist
-
- 17 Jan 2023, 1 commit
-
-
Committed by cyber-pioneer
* support @to_static+to_prime+cinn
* fix code logic
* debug4
* debug5
* debug6
* debug7
* debug 8
* debug 9
* debug10
* debug11
* debug11
* debug 12

Co-authored-by: Aurelius84 <zhangliujie@baidu.com>
-
- 13 Jan 2023, 1 commit
-
-
Committed by cyber-pioneer
-