    Support matmul_v2 triple grad Kernel (#36459) · 203a0e3e
    Committed by Weilong Wu
    * initial commit for triple grad of sigmoid
    
    * Updated unittests files
    
    * init functional jacobian api
    
    * Updated triple_test func
    
    * Updated gradient_checker & test_script
    
    * finish test with dtype float32
    
    * add float64 test case
    
    * polish code
    
    * use atol=1e-5 with dtype float64
    
    * fix for ci
    
    * set timeout for test_jacobian
    
    * fix dygraph grad to support higher-order differentiation (see the sketch after this log)
    
    * polish API docstring
    
    * Updated gradient checker and some related files
    
    * fix double grad strip error for higher-order differentiation
    
    * Add Sigmoid triple grad tests
    
    * fix dygraph double grad dtype error when called in a higher-order differentiation scenario
    
    * Updated triple grad tests func
    
    * Use np.random to initialize ddx
    
    * Updated triple_grad_check func
    
    * add todo for gradient checker and refine some comments
    
    * remove additional code
    
    * add test for warning in backward.py
    
    * format python code
    
    * support multi input in triple gradient checker
    
    * Add matmul triple grad kernel
    
    * Updated comments of TODO
    
    * Supported some special tests
    
    * Change code-format to follow CI std
    
    * Updated gradient_checker.py
    
    * Fix conflicts
    
    * Removed unnecessary printing log
    
    * Change code style to follow CI std
    Co-authored-by: levi131 <limaolin01@baidu.com>
    Co-authored-by: Jiabin Yang <360788950@qq.com>
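
    For context, below is a minimal, hedged sketch of how the higher-order (triple) gradients touched by this change can be exercised from dygraph mode by nesting paddle.grad calls with create_graph=True. It is an illustrative example against the public Paddle API, not code from the repository's gradient_checker.py; the names x, dx, ddx_init, ddx and dddx are placeholders, sigmoid is chosen because its derivatives stay non-zero at every order, and the randomly initialized grad output mirrors the "Use np.random to initialize ddx" item above.

        import numpy as np
        import paddle

        np.random.seed(2021)

        # Differentiable input in dygraph mode.
        x = paddle.to_tensor(np.random.rand(4).astype('float64'), stop_gradient=False)
        y = paddle.nn.functional.sigmoid(x)

        # First-order gradient; keep the graph so higher orders can be taken.
        (dx,) = paddle.grad([y], [x], create_graph=True)

        # Second-order (double) gradient; a randomly initialized grad output
        # plays the role of ddx in a gradient checker.
        ddx_init = paddle.to_tensor(np.random.rand(4).astype('float64'))
        (ddx,) = paddle.grad([dx], [x], grad_outputs=[ddx_init], create_graph=True)

        # Third-order (triple) gradient, the case these kernels are added for.
        (dddx,) = paddle.grad([ddx], [x], create_graph=False)
        print(dddx.numpy())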