1. May 14, 2019 (2 commits)
    • add elementwise_add_grad_grad op (#17366) · bd9bef5a
      Committed by Kaipeng Deng
      * add elementwise_add_grad_grad op. test=develop
      
      * use defined GradMaker. test=develop
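      As context for the entry above: a minimal NumPy sketch of what an elementwise_add grad-grad op computes; the function and the ddx/ddy/ddout names follow the usual double-grad convention and are illustrative, not the Paddle kernel itself.

      import numpy as np

      def elementwise_add_double_grad(ddx, ddy):
          # Forward: out = x + y, so backward gives dX = dOut and dY = dOut
          # (ignoring broadcast reduction). Because add is linear, the
          # second-order pass only forwards ddx + ddy into ddout; no gradient
          # flows back into x or y themselves.
          return ddx + ddy

      ddout = elementwise_add_double_grad(np.ones((2, 3)), np.ones((2, 3)))
      print(ddout)  # all twos, shape (2, 3)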
    • support fc_op double grad (#17317) · 60be66e2
      Committed by Kaipeng Deng
      * add double grad for mul_op. test=develop
      
      * fix format. test=develop
      
      * fix format. test=develop
      
      * fix format. test=develop
      
      * refine code. test=develop
      
      * remove setzero. test=develop
      
      * fix dx/dy init bug. test=develop
      
      * fix format. test=develop
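      As context for the fc_op / mul_op entry above: a hedged NumPy sketch of the second-order gradients of out = X @ Y, derived from the standard chain rule over the first-order backward pass; it illustrates the math only and is not the Paddle kernel.

      import numpy as np

      def mul_double_grad(x, y, dout, ddx, ddy):
          # Forward: out = x @ y; backward: dx = dout @ y.T, dy = x.T @ dout.
          # The backward pass is linear in (x, y, dout), so its own gradients are:
          ddout = ddx @ y + x @ ddy   # gradient w.r.t. dOut
          x_grad = dout @ ddy.T       # gradient flowing back into X
          y_grad = ddx.T @ dout       # gradient flowing back into Y
          return ddout, x_grad, y_grad

      m, k, n = 2, 3, 4
      x, y = np.random.rand(m, k), np.random.rand(k, n)
      dout, ddx, ddy = np.random.rand(m, n), np.random.rand(m, k), np.random.rand(k, n)
      ddout, x_grad, y_grad = mul_double_grad(x, y, dout, ddx, ddy)
      print(ddout.shape, x_grad.shape, y_grad.shape)  # (2, 4) (2, 3) (3, 4)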
  2. May 13, 2019 (2 commits)
    • add double grad for elementwise_mul op (#17255) · 8bae8590
      Committed by Kaipeng Deng
      * add double grad for elementwise_mul. test=develop
      
      * remove comment. test=develop
      
      * fix grad sum. test=develop
      
      * fix for axis expand. test=develop
      
      * add test for axis expand. test=develop
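      As context for the entry above: a hedged NumPy sketch of elementwise_mul double grad; function and variable names are illustrative. The "grad sum" and "axis expand" fixes in the commit message concern reducing gradients back over broadcast axes, which this simplified same-shape version omits.

      import numpy as np

      def elementwise_mul_double_grad(x, y, dout, ddx, ddy):
          # Forward: out = x * y; backward: dx = dout * y, dy = dout * x.
          ddout = ddx * y + ddy * x   # gradient w.r.t. dOut
          x_grad = dout * ddy         # gradient flowing back into X
          y_grad = dout * ddx         # gradient flowing back into Y
          return ddout, x_grad, y_grad

      x, y, dout, ddx, ddy = (np.random.rand(2, 3) for _ in range(5))
      print([t.shape for t in elementwise_mul_double_grad(x, y, dout, ddx, ddy)])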
    • add double grad for square op (#17173) · 11d3a38f
      Committed by Kaipeng Deng
      * add double grad for square. test=develop
      
      * format code. test=develop
      
      * fix for grad sum. test=develop
      
      * refine shape. test=develop
      
      * refine extract. test=develop
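      As context for the entry above: a hedged NumPy sketch of the square op's double grad; names are illustrative, not the Paddle kernel.

      import numpy as np

      def square_double_grad(x, dout, ddx):
          # Forward: y = x ** 2; backward: dx = 2 * x * dout.
          # Differentiating the backward output once more w.r.t. its inputs:
          ddout = 2.0 * x * ddx      # gradient w.r.t. dOut
          x_grad = 2.0 * dout * ddx  # gradient flowing back into X
          return ddout, x_grad

      x, dout, ddx = (np.random.rand(3) for _ in range(3))
      print(square_double_grad(x, dout, ddx))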
  3. May 10, 2019 (1 commit)
    • Double backward of conv2d. (#17211) · e32c9888
      Committed by qingqing01
      * Add conv2d_grad_grad_op
      * Extract the cuDNN conv algorithm searching code into conv_cudnn_helper.h.
          - It is now used in conv2d_grad_grad.
          - The searching code in conv2d and conv2d_grad will be simplified in a follow-up PR.
      * Enhance and fix a bug in the gradient_checker unit test.
      * Support fetching empty variables, returning None in Python.
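      As context for the conv2d double-backward entry above: a hedged 1-D, stride-1, no-padding NumPy analogue that shows the structure of the computation (each second-order output is itself a convolution); it is not the cuDNN-based conv2d_grad_grad kernel, and all names are illustrative.

      import numpy as np

      def conv1d_double_grad(x, w, dout, ddx, ddw):
          # Forward: out = correlate(x, w, 'valid').
          # Backward: dx = convolve(dout, w, 'full'), dw = correlate(x, dout, 'valid').
          # Both backward outputs are linear in (x, w, dout), so:
          ddout = np.correlate(ddx, w, 'valid') + np.correlate(x, ddw, 'valid')
          w_grad = np.correlate(ddx, dout, 'valid')  # gradient flowing back into W
          x_grad = np.convolve(dout, ddw, 'full')    # gradient flowing back into X
          return ddout, w_grad, x_grad

      x, w = np.random.rand(8), np.random.rand(3)
      dout = np.random.rand(6)                 # len(x) - len(w) + 1
      ddx, ddw = np.random.rand(8), np.random.rand(3)
      ddout, w_grad, x_grad = conv1d_double_grad(x, w, dout, ddx, ddw)
      print(ddout.shape, w_grad.shape, x_grad.shape)  # (6,) (3,) (8,)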
  4. April 26, 2019 (1 commit)
  5. April 23, 2019 (1 commit)
    • Support backward of backward for Relu and add a new gradient checker by... · c1c2633a
      Committed by qingqing01
      Support backward of backward for Relu and add a new gradient checker by comparing theoretical and numerical Jacobian. (#16862)
      
      * Support backward of backward and a new gradient checker
      * Rename decorators.py to decorator_helper.py, since Python on the Windows CI has a decorators package.
      
      1. Add ReluDoubleGradMaker when registering relu_grad.
      2. Add a new gradient checker that compares the theoretical and numerical Jacobian. Double gradients are checked by double_grad_check.
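      As context for the gradient checker above: it compares a theoretical Jacobian built from the registered gradient ops with a numerical one from finite differences. Below is a minimal NumPy sketch of that comparison, using relu's first-order backward as the function under test; double_grad_check applies the same idea to the composed double-grad graph. All names here are illustrative, not the Paddle API.

      import numpy as np

      def numeric_jacobian(f, x, eps=1e-5):
          # Central-difference estimate of J[i, j] = d f(x)[i] / d x[j].
          y = f(x)
          jac = np.zeros((y.size, x.size))
          for j in range(x.size):
              xp, xm = x.copy(), x.copy()
              xp[j] += eps
              xm[j] -= eps
              jac[:, j] = (f(xp) - f(xm)) / (2.0 * eps)
          return jac

      x = np.random.randn(4)
      relu_grad = lambda dout: dout * (x > 0)            # first-order relu backward
      theoretical = np.diag((x > 0).astype(np.float64))  # its Jacobian w.r.t. dOut
      numerical = numeric_jacobian(relu_grad, np.random.randn(4))
      assert np.allclose(theoretical, numerical, atol=1e-6)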