    Add support for forward and reverse high-order automatic differentiation mechanism (#41919) · f6ee202f
    Committed by WangZhen
    * Updated triple_grad_check func
    
    * add todo for gradient checker and refine some comments
    
    * remove additional code
    
    * add test for warning in backward.py
    
    * format python code
    
    * support multi input in triple gradient checker
    
    * Add matmul triple grad kernel
    
    * Updated comments of TODO
    
    * Supported some special tests
    
    * Change code-format to follow CI std
    
    * Updated gradient_checker.py
    
    * Fix conflicts
    
    * Removed unnecessary log printing
    
    * Change code style to follow CI std
    
    * merge upstream
    
    * add priops.py
    
    * add_p
    
    * rm useless files
    
    * add sub_p mul_p div_p
    
    * add sqrt_p and tanh_p
    
    * add reshape_p
    
    * add broadcast_p
    
    * Add python primitive wrappers.
    
    * Jvp rules updated.
    
    * JVP rules done for all the 17 primops.
    
    * quick check and fixes.
    
    * add jvp(op, *args)
    
    * add broadcast_p fill_constant_p matmul_p reduce_p reshape_p transpose_p
    
    * add split_p and concat_p
    
    * add gather_p and scatter_add_p
    
    * add slice_select_p and slice_assign_p
    
    * Add transpose rules.
    
    * add multi input check for add_p, sub_p, mul_p, div_p
    
    * update concat_p
    
    * Linearize and transpose in progress (see the conceptual sketch after this list).
    
    * refine gather_p and scatter_add_p
    
    * updated.
    
    * update transpose.
    
    * refine slice_assign_p and slice_select_p
    
    * init commit for lower
    
    * Merged with primitive ops.
    
    * small update
    
    * add rules for orig2prim and prim2orig
    
    * add 9 tests for prim ops
    
    * add more tests and fix some bugs
    
    * add more tests
    
    * register proto
    
    * Adding primops test.
    
    * add shape validity check for broadcast_p op, and add keepdim attr to reduce_p op proto
    
    * support multi input and multi output for split_p and concat_p
    
    * Test updated.
    
    * update
    
    * fix slice bug for slice_select_p and slice_assign_p
    
    * updated.
    
    * Ops updated.
    
    * Refactor and bug fixes.
    
    * updated.
    
    * finish orig2prim and prim2orig rules
    
    * dtype for axis attr should be long int
    
    * update dtype for axis attr int64_t
    
    * update for iscan CI
    
    * Update primx.
    
    * Refactor vars in primx.
    
    * update for lower transform
    
    * add more shape and dtype checks
    
    * update primx.py
    
    * change IndexTensor into int32 dtype
    
    * update
    
    * Fix linearize and transpose.
    
    * Update is_dot
    
    * Update is_dot
    
    * Update is_dot
    
    * add gradient aggregation, fix add_transpose.
    
    * pass first linearize+transpose test.
    
    * update test
    
    * refactor op registration and primx.
    
    * update rule for slice_assign
    
    * try test lower
    
    * update orig2prim and prim2orig
    
    * pass simple lower pass
    
    * update
    
    * Update input types in the unit test.
    
    * orig2prim segfault.
    
    * 50% for adam.minimize
    
    * test updated.
    
    * temp fix errors in removing vars.
    
    * primx updated.
    
    * update for matmul_v2 and reshape2 orig2prim
    
    * update for minimize
    
    * Refine primrules
    
    * Remove some code
    
    * supporting unused and unreachable vars.
    
    * update for use prim2orig in minimize
    
    * fix gather and scatter_add transpose.
    
    * Add rules UT
    
    * update scatter_add
    
    * Refine UT code
    
    * fix NoneType check in topo
    
    * Update gather_p pywrapper.
    
    * remove useless print
    
    * Merge Tongxin's PR and refine code
    
    * re-add some tests
    
    * rm useless print
    
    * polish code.
    
    * fix bug in minimize
    
    * add get_input_var_list and get_output_var_list and use it in lower
    
    * Fix scatter_add_p prim2orig
    
    * Update code and fix orig2prim/prim2orig UT
    
    * delete vars after block.desc._remove
    
    * Improve ops and vars clean up logics.
    
    * fix some bugs in linearize and lower
    
    * update tanh transpose.
    
    * use set instead of list for var2remove
    
    * test updated.
    
    * polish code.
    
    * fix dot2bar delete.
    
    * merge tx/ad
    
    * add indextensor_dot for gather and scatter_add
    
    * add sorted for set
    
    * Fix scale_orig2prim params
    
    * fix some syntax bugs
    
    * add global_lower_update list
    
    * Better handling of unused vars.
    
    * update tests.
    
    * Fix elementwise_sub orig2prim
    
    * support None for transpose rule
    
    * Merge and add transform UT
    
    * fix a bug in transpose
    
    * Fix transpose and UT
    
    * a hacky fix for concat op
    
    * Fix executor place
    
    * Refine variable name
    
    * Add elementwise_mul orig2prim and support p_norm when p=1
    
    * Add sqrt orig2prim rule and UT
    
    * merge wz test
    
    * rename files; add enable_prim, disable_prim, prim_enabled; delete global_lower_update (see the usage sketch after this list)
    
    * fix a bug in test_ad_transform_trans
    
    * revert modify in framework.py
    
    * add paddle.fluid.incubate.ad_transform to python/setup.py.in
    
    * Fix remove vars error
    
    * Fix p_norm_orig2prim
    
    * merge wz
    
    * Modify the code directory
    
    * Add utils.py and remove get_input/output_vars functions
    
    * Update Maolin's code
    
    * Rename UT and refine test_ad_transform_primops
    
    * Fix div_p jvp rule
    
    * Add higher derivatives UT
    
    * Move UT to autograd dir
    
    * Fix comments
    
    * import paddle in primops.py
    
    * Add some error messages for asserts
    
    * Refine UT class name and refine some comments in primreg.py
    
    * update minimize of paddle/optimizer to support the new autograd
    
    * resolve circular import between backward.py and optimizer.py
    
    * fill gradients and minimize unittest
    
    * Replace `assert isinstance` with `raise TypeError`
    
    * Add some assert messages for primx.py
    
    * Polish variable name
    
    * Add some assert messages
    
    * add some docstrings
    
    * refine some names
    
    * update the format of English documents
    
    * Split test_transform.py to two files to avoid ci error
    
    * fix the document format of enable_prim/disable_prim/prim2orig/prim_enabled
    
    * polish test_gradients_and_minimize
    
    * add default value for prim_enabled api doc
    
    * Remove some UT to avoid Windows CI error
    
    * Enlarge test_gradients_and_minimize time limit
    
    * Fix UT time limit
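
    The "linearize" and "transpose" steps mentioned above are the core of the new AD
    scheme: forward-mode AD applies each primitive's JVP (Jacobian-vector product) rule,
    and reverse-mode AD is obtained by transposing the resulting linear tangent program.
    Below is a minimal conceptual sketch in plain NumPy; it is an illustration only, not
    the PR's primitive-op IR, and mul_jvp / mul_transpose are made-up names rather than
    Paddle APIs.

        import numpy as np

        def mul_jvp(x, y, x_dot, y_dot):
            # JVP rule for a multiply primitive: d(x*y) = x_dot*y + x*y_dot.
            return x_dot * y + x * y_dot

        def mul_transpose(x, y, z_bar):
            # Transpose of the linear map (x_dot, y_dot) -> x_dot*y + x*y_dot,
            # i.e. the cotangent contributions pulled back to x_bar and y_bar.
            return z_bar * y, z_bar * x

        x, y = np.array(2.0), np.array(3.0)

        # Forward mode: push a tangent through the linearized program.
        z_dot = mul_jvp(x, y, x_dot=np.array(1.0), y_dot=np.array(0.0))  # dz/dx = y = 3

        # Reverse mode: pull a cotangent back through the transposed program.
        x_bar, y_bar = mul_transpose(x, y, z_bar=np.array(1.0))  # (3.0, 2.0)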
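
    A minimal end-to-end usage sketch of the flow these messages describe: enable the
    primitive ops, take gradients in a static-graph program (backward.py routes through
    orig2prim, linearize and transpose when prim is enabled), then lower the primitive
    ops back with prim2orig before execution. Only enable_prim, disable_prim,
    prim_enabled and prim2orig are named above; the import path and exact signatures
    below are assumptions, not confirmed APIs.

        import numpy as np
        import paddle
        # Import path assumed from the setup.py.in entry above; the later
        # "Modify the code directory" change may have moved this module.
        from paddle.fluid.incubate.ad_transform import (
            enable_prim, disable_prim, prim_enabled, prim2orig)

        paddle.enable_static()
        enable_prim()
        assert prim_enabled()

        main, startup = paddle.static.Program(), paddle.static.Program()
        with paddle.static.program_guard(main, startup):
            x = paddle.static.data(name='x', shape=[2, 2], dtype='float32')
            x.stop_gradient = False
            y = paddle.tanh(paddle.matmul(x, x))
            # With prim enabled, backward.py builds this gradient from primitive
            # ops (orig2prim + linearize + transpose).
            x_grad, = paddle.static.gradients([y], [x])
            # Lower the remaining primitive ops back to original operators so the
            # executor can run the program (signature assumed).
            prim2orig(main.block(0))

        exe = paddle.static.Executor(paddle.CPUPlace())
        exe.run(startup)
        grad_val, = exe.run(main,
                            feed={'x': np.random.rand(2, 2).astype('float32')},
                            fetch_list=[x_grad])
        disable_prim()
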
    Co-authored-by: veyron95 <veyron_wu@163.com>
    Co-authored-by: Jiabin Yang <360788950@qq.com>
    Co-authored-by: levi131 <limaolin01@baidu.com>
    Co-authored-by: Tongxin Bai <waffle.bai@gmail.com>
    Co-authored-by: Xiaoxu Chen <chenxx_id@163.com>
    Co-authored-by: levi131 <83750468+levi131@users.noreply.github.com>