    [New Feature] Support tanh triple grad (#36225) · 808be657
    Committed by Jiabin Yang
    * native commit for triple grad of sigmoid
    
    * Updated unittests files
    
    * init functional jacobian api
    
    * Updated triple_test func
    
    * Updated gradient_checker & test_script
    
    * finish test with dtype float32
    
    * add float64 test case
    
    * polish code
    
    * use atol=1e-5 with dtype float64
    
    * fix for ci
    
    * set timeout for test_jacobian
    
    * fix dygraph grad to support higher-order differentiation
    
    * polish API docstring
    
    * Updated gradient checker and some related files
    
    * fix double grad strip error for higher-order differentiation
    
    * Add Sigmoid triple grad tests
    
    * fix dygraph double grad dtype error when called in a higher-order differentiation scenario
    
    * Updated triple grad tests func
    
    * Use np.random to initialize ddx
    
    * Updated triple_grad_check func
    
    * add todo for gradient checker and refine some comments
    
    * remove additional code
    
    * add test for warning in backward.py
    
    * add tanh triple grad
    
    * format python code
    
    * refine code
    Co-authored-by: veyron95 <veyron_wu@163.com>
    Co-authored-by: levi131 <limaolin01@baidu.com>
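
    The commits above add the analytic third-order gradient of tanh and validate it against a numerical gradient checker (with `atol=1e-5` for float64). As a minimal standalone sketch of that kind of check, independent of Paddle's own `gradient_checker` (function names here are illustrative, not Paddle APIs), one can compare the closed-form third derivative of tanh with a finite-difference estimate:

    ```python
    import numpy as np

    def tanh_grads(x):
        """Analytic derivatives of y = tanh(x):
           y'   = 1 - y^2
           y''  = -2*y*(1 - y^2)
           y''' = (1 - y^2) * (6*y^2 - 2)
        """
        y = np.tanh(x)
        return 1 - y**2, -2 * y * (1 - y**2), (1 - y**2) * (6 * y**2 - 2)

    def numeric_third_grad(x, h=1e-3):
        # Central finite-difference stencil for the third derivative:
        # f'''(x) ~= (f(x+2h) - 2 f(x+h) + 2 f(x-h) - f(x-2h)) / (2 h^3)
        f = np.tanh
        return (f(x + 2*h) - 2*f(x + h) + 2*f(x - h) - f(x - 2*h)) / (2 * h**3)

    x = np.linspace(-2.0, 2.0, 9).astype(np.float64)
    _, _, d3 = tanh_grads(x)
    # Tolerance mirrors the float64 atol=1e-5 used in the commits above.
    assert np.allclose(d3, numeric_third_grad(x), atol=1e-5)
    ```

    Paddle's actual `triple_grad_check` works along the same lines, but perturbs program inputs and compares against gradients produced by repeated `paddle.grad` calls rather than a closed-form formula.
    
    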
Changed file: activation_op.cc (62.8 KB)