1. 30 Mar, 2022 · 1 commit
  2. 29 Mar, 2022 · 1 commit
  3. 28 Mar, 2022 · 1 commit
    • Move some activation to phi (#40727) · e77a947e
      Authored by hong
      * update
      
      * add forward case
      
      * update
      
      * update; test=develop
      
      * add some grad kernel; test=develop
      
      * move gpu kernel; test=develop
      
      * update
      
      * update;
      
      * update test;
      
      * fix selected rows bug;
      
      * add mixed vector include;
      
      * add mixed vector dependency; test=develop
      
      * add logit grad signature;
      
      * polish code
      
      * fix bug;
      
      * add namespace for abs
      
      * revert code
      
      * not move softsign
      
      * remove duplicate register;
      
      * fix softsign bug
      
      * polish code
      
      * format
      
      * format
      
      * fix bug
      
      * remove cmake dep
      
      * add square sqrt selected rows support
      
      * update
      
      * remove clip norm
      
      * add standalone executor sqrt dep
      
      * standalone executor depends on sqrt
      
      * remove sqrt op in CMakeLists
      
      * open some case
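      One of the items above adds a logit grad signature. As a rough illustration of the math such a kernel computes, here is a minimal NumPy sketch of logit and its first gradient; the function names and the eps clamping are assumptions for the example, not the phi kernel's actual signature or behavior.

```python
import numpy as np

def logit(x, eps=1e-8):
    # logit(x) = log(x / (1 - x)); clamp into (eps, 1 - eps) so the log
    # stays finite (the clamping rule here is an illustrative assumption).
    x = np.clip(x, eps, 1.0 - eps)
    return np.log(x / (1.0 - x))

def logit_grad(x, dout, eps=1e-8):
    # d/dx log(x / (1 - x)) = 1 / (x * (1 - x))
    x = np.clip(x, eps, 1.0 - eps)
    return dout / (x * (1.0 - x))

x = np.random.uniform(0.05, 0.95, size=4)
print(logit(x), logit_grad(x, np.ones_like(x)))
```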
  4. 15 Oct, 2021 · 1 commit
    • [New Feature] Support tanh triple grad (#36225) · 808be657
      Authored by Jiabin Yang
      * native commit for triple grad of sigmoid
      
      * Updated unittests files
      
      * init functional jacobian api
      
      * Updated trible_test func
      
      * Updated gradient_checker & test_script
      
      * finish test with dtype float32
      
      * add float64 test case
      
      * polish code
      
      * use atol=1e-5 with dtype float64
      
      * fix for ci
      
      * set timeout for test_jacobian
      
      * fix dygraph grad to support high differential
      
      * polish API docstring
      
      * Updated gradient checker and some related files
      
      * fix double grad strip error for high differential
      
      * fix double grad strip error for high differential
      
      * Add Sigmoid triple grad tests
      
      * fix dygraph double grad dtype error when calling for high differential scenario
      
      * Updated triple grad tests func
      
      * Use np.random to initialize ddx
      
      * Updated triple_grad_check func
      
      * add todo for gradient checker and refine some comments
      
      * remove additional code
      
      * add test for warning in backward.py
      
      * add tanh triple grad
      
      * format python code
      
      * refine code
      Co-authored-by: veyron95 <veyron_wu@163.com>
      Co-authored-by: levi131 <limaolin01@baidu.com>
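      For reference, the tanh derivatives this triple-grad support ultimately relies on can all be written in terms of the forward output y = tanh(x). A minimal NumPy sketch (illustrative only, not Paddle's kernel code), with a finite-difference sanity check:

```python
import numpy as np

def tanh_derivatives(x):
    # First three derivatives of tanh expressed via y = tanh(x):
    #   y'   = 1 - y^2
    #   y''  = -2*y*(1 - y^2)
    #   y''' = -2*(1 - y^2)*(1 - 3*y^2)
    y = np.tanh(x)
    d1 = 1.0 - y ** 2
    d2 = -2.0 * y * d1
    d3 = -2.0 * d1 * (1.0 - 3.0 * y ** 2)
    return d1, d2, d3

# Cross-check the analytic third derivative against a central difference of d2.
x = np.linspace(-2.0, 2.0, 5)
h = 1e-4
_, d2p, _ = tanh_derivatives(x + h)
_, d2m, _ = tanh_derivatives(x - h)
_, _, d3 = tanh_derivatives(x)
assert np.allclose((d2p - d2m) / (2.0 * h), d3, atol=1e-5)
```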
  5. 13 Oct, 2021 · 2 commits
    • [PaddlePaddle hackathon] + ADD CELU (#36088) · d7064f04
      Authored by yujun
      * update
      
      * update
      
      * update
      
      * try make CI pass
      
      * doc typo
      
      * update doc string
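      CELU follows the standard definition celu(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1)). A minimal NumPy sketch of the forward value and first gradient (the function names are illustrative, not the operator's API):

```python
import numpy as np

def celu(x, alpha=1.0):
    # celu(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
    return np.maximum(0.0, x) + np.minimum(0.0, alpha * (np.exp(x / alpha) - 1.0))

def celu_grad(x, dout, alpha=1.0):
    # d celu / dx = 1 for x > 0, exp(x / alpha) otherwise
    return dout * np.where(x > 0.0, 1.0, np.exp(x / alpha))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(celu(x), celu_grad(x, np.ones_like(x)))
```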
    • [New Feature] Support triple grad in Paddle (#36187) · 2c44ee7e
      Authored by Jiabin Yang
      * native commit for triple grad of sigmoid
      
      * Updated unittests files
      
      * init functional jacobian api
      
      * Updated trible_test func
      
      * Updated gradient_checker & test_script
      
      * finish test with dtype float32
      
      * add float64 test case
      
      * polish code
      
      * use atol=1e-5 with dtype float64
      
      * fix for ci
      
      * set timeout for test_jacobian
      
      * fix dygraph grad to support high differential
      
      * polish API docstring
      
      * Updated gradient checker and some related files
      
      * fix double grad strip error for high differential
      
      * fix double grad strip error for high differential
      
      * Add Sigmoid triple grad tests
      
      * fix dygraph double grad dtype error when calling for high differential scenario
      
      * Updated triple grad tests func
      
      * Use np.random to initialize ddx
      
      * Updated triple_grad_check func
      
      * add todo for gradient checker and refine some comments
      
      * remove additional code
      
      * add test for warning in backward.py
      
      * format python code
      Co-authored-by: veyron95 <veyron_wu@163.com>
      Co-authored-by: levi131 <limaolin01@baidu.com>
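      The sigmoid triple-grad tests added here rest on the fact that every derivative of sigmoid can be written in terms of s = sigmoid(x). A minimal NumPy sketch (illustrative only, not the Paddle kernels), again with a finite-difference check:

```python
import numpy as np

def sigmoid_derivatives(x):
    # First three derivatives of sigmoid expressed via s = sigmoid(x):
    #   s'   = s*(1 - s)
    #   s''  = s*(1 - s)*(1 - 2*s)
    #   s''' = s*(1 - s)*(1 - 6*s + 6*s^2)
    s = 1.0 / (1.0 + np.exp(-x))
    d1 = s * (1.0 - s)
    d2 = d1 * (1.0 - 2.0 * s)
    d3 = d1 * (1.0 - 6.0 * s + 6.0 * s ** 2)
    return d1, d2, d3

# Sanity-check d3 against a central difference of d2.
x = np.linspace(-3.0, 3.0, 7)
h = 1e-4
_, d2p, _ = sigmoid_derivatives(x + h)
_, d2m, _ = sigmoid_derivatives(x - h)
_, _, d3 = sigmoid_derivatives(x)
assert np.allclose((d2p - d2m) / (2.0 * h), d3, atol=1e-5)
```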
  6. 26 May, 2021 · 1 commit
  7. 15 Apr, 2021 · 1 commit
  8. 12 Jan, 2021 · 1 commit
  9. 22 Dec, 2020 · 1 commit
  10. 09 Dec, 2020 · 1 commit
  11. 27 Sep, 2020 · 2 commits
  12. 25 Sep, 2020 · 1 commit
  13. 23 Feb, 2020 · 1 commit
  14. 07 Jan, 2020 · 1 commit
  15. 06 Jan, 2020 · 1 commit
    • support elu_op double grad (#21822) · fab4b076
      Authored by Double_V
      * support elu activation double grad, test=develop
      
      * delete the code commit in .cc, test=develop
      
      * fix relu test failure, test=develop
      
      * add elu double grad kernel and unit test
      
      * add dX calculation in elu double grad functor, test=develop
      
      * update the commit code, test=develop
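      The double-grad functor mentioned above produces two outputs, ddout and dX. A minimal NumPy sketch of what the chain rule gives for an elementwise ELU; the variable names follow the commit's wording, but the exact formulas and the function signature are illustrative assumptions, not the functor's real code:

```python
import numpy as np

def elu_double_grad(x, dout, ddx, alpha=1.0):
    # For f(x) = ELU(x): f'(x) = 1 if x > 0 else alpha * exp(x),
    #                    f''(x) = 0 if x > 0 else alpha * exp(x).
    # By the chain rule the double-grad pass yields (illustrative formulas):
    #   ddout = ddx * f'(x)            -- grad flowing to the forward output's grad
    #   dX    = ddx * dout * f''(x)    -- the extra dX term the commit mentions
    d1 = np.where(x > 0.0, 1.0, alpha * np.exp(x))
    d2 = np.where(x > 0.0, 0.0, alpha * np.exp(x))
    ddout = ddx * d1
    dX = ddx * dout * d2
    return ddout, dX

x = np.array([-1.5, -0.2, 0.3, 2.0])
ddout, dX = elu_double_grad(x, dout=np.ones_like(x), ddx=np.ones_like(x))
print(ddout, dX)
```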
  16. 24 May, 2019 · 1 commit