1. 12 Oct, 2019 1 commit
  2. 17 Sep, 2019 1 commit
  3. 24 May, 2019 1 commit
  4. 20 May, 2019 1 commit
    • Double backward elementwise div (#17416) · 10b23a72
      lvmengsi committed
      * double backward, elementwise_div
      
      * fix dx empty. test=develop
      
      * bug fix (#17392)
      
      fix secure bug
      
      * Enable stack operator for nGraph, test=develop (#17406)
      
      * fix sqrt_grad_grad unittest. test=develop (#17410)
      
      * fix sqrt_grad_grad unittest. test=develop
      
      * disable sqrt_grad_grad unittest. test=develop
      
      * test=develop, fix unittest
      
      * test=develop, fix unittest
      
      * test=develop, fix unittest
      
      * test=develop, fix bug
      
      * fix unittest. test=develop
      
      * fix unittest dx. test=develop
      
      * tmp fix! for test... test=develop
      
      * reduce tmp, test=develop
      
      * test=develop, reduce tmp
      
      * fix broadcast unittest. test=develop
      
      * fix format. test=develop
      
      * refine code. test=develop
      
      * refine code. test=develop
      
      * refine GetDoubleGradSafeTensor. test=develop
      
      * fix format. test=develop
      10b23a72
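
      For context, the second-order formulas such a kernel implements follow from Z = X/Y by hand differentiation (this derivation is my own sketch, not lifted from the PR): the first backward gives dX = dZ/Y and dY = -dZ·X/Y²; differentiating the pairing L = ⟨dX, ddX⟩ + ⟨dY, ddY⟩ again w.r.t. dZ, X, and Y yields

      ```latex
      % Z = X / Y (elementwise); dZ is the incoming gradient,
      % ddX, ddY are the grad-of-grad inputs, \odot is the elementwise product.
      \begin{aligned}
      ddZ      &= \frac{ddX - ddY \odot Z}{Y} \\
      \nabla X &= -\, ddY \odot \frac{dZ}{Y^{2}} \\
      \nabla Y &= \frac{dZ}{Y^{2}} \odot \bigl( 2\, ddY \odot Z - ddX \bigr)
      \end{aligned}
      ```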
  5. 19 May, 2019 1 commit
  6. 18 May, 2019 1 commit
  7. 15 May, 2019 2 commits
  8. 14 May, 2019 3 commits
    • Double backward reduce mean (#17372) · 5d1ac41b
      lvmengsi committed
      * test=develop, double backward reduce_mean
      
      * add comment. test=develop
      
      * fix format. test=develop
      
      * rename GradGrad -> DoubleGrad. test=develop
      
      * fix op_use_default_grad_op_maker.spec. test=develop
      5d1ac41b
    • add elementwise_add_grad_grad op (#17366) · bd9bef5a
      Kaipeng Deng committed
      * add elementwise_add_grad_grad op. test=develop
      
      * use defined GradMaker. test=develop
      bd9bef5a
    • support fc_op double grad (#17317) · 60be66e2
      Kaipeng Deng committed
      * add double grad for mul_op. test=develop
      
      * fix format. test=develop
      
      * fix format. test=develop
      
      * fix format. test=develop
      
      * refine code. test=develop
      
      * remove setzero. test=develop
      
      * fix dx/dy init bug. test=develop
      
      * fix format. test=develop
      60be66e2
  9. 13 May, 2019 2 commits
    • add double grad for elementwise_mul op (#17255) · 8bae8590
      Kaipeng Deng committed
      * add double grad for elementwise_mul. test=develop
      
      * remove comment. test=develop
      
      * fix grad sum. test=develop
      
      * fix for axis expand. test=develop
      
      * add test for axis expand. test=develop
      8bae8590
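
      By the product rule the elementwise_mul double grad reduces to three elementwise expressions; a minimal NumPy sketch of the idea (my own illustration, not Paddle's kernel; same-shape inputs only, so the axis-expand case handled by the PR is ignored):

      ```python
      import numpy as np

      rng = np.random.default_rng(0)
      shape = (3, 4)
      x, y, dz = (rng.standard_normal(shape) for _ in range(3))
      ddx, ddy = (rng.standard_normal(shape) for _ in range(2))

      # Forward: z = x * y. First backward: dx = dz * y, dy = dz * x.
      # Double backward differentiates L = sum(dx * ddx + dy * ddy)
      # w.r.t. dz, x, and y:
      ddz = ddx * y + ddy * x     # grad of L w.r.t. dz
      new_dx = ddy * dz           # grad of L w.r.t. x
      new_dy = ddx * dz           # grad of L w.r.t. y

      # Cross-check ddz with a central finite difference of L along a
      # random direction v in dz (L is linear in dz, so this is exact
      # up to float error):
      def L(dz_):
          return np.sum(dz_ * y * ddx + dz_ * x * ddy)

      v, eps = rng.standard_normal(shape), 1e-6
      fd = (L(dz + eps * v) - L(dz - eps * v)) / (2 * eps)
      assert np.allclose(fd, np.sum(ddz * v), atol=1e-4)
      ```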
    • add double grad for square op (#17173) · 11d3a38f
      Kaipeng Deng committed
      * add double grad for square. test=develop
      
      * format code. test=develop
      
      * fix for grad sum. test=develop
      
      * refine shape. test=develop
      
      * refine extract. test=develop
      11d3a38f
  10. 10 May, 2019 1 commit
    • Double backward of conv2d. (#17211) · e32c9888
      qingqing01 committed
      * Add conv2d_grad_grad_op
      * Extract the cuDNN conv algo searching code into conv_cudnn_helper.h.
          - Now used in conv2d_grad_grad.
          - Will simplify the searching code in conv2d and conv2d_grad in a next PR.
      * Enhance and fix a bug in the unit testing of gradient_checker.
      * Support fetching empty variables, returning None in Python.
      e32c9888
  11. 26 Apr, 2019 1 commit
  12. 23 Apr, 2019 1 commit
    • Support backward of backward for Relu and add a new gradient checker by... · c1c2633a
      qingqing01 committed
      Support backward of backward for Relu and add a new gradient checker by comparing theoretical and numerical Jacobian. (#16862)
      
      * Support backward of backward and a new gradient checker
      * Rename decorators.py to decorator_helper.py, since Python on Windows CI has decorators package.
      
      1. Add ReluDoubleGradMaker when register relu_grad.
      2. Add a new gradient checker by comparing theoretical and numerical Jacobian.  Check double gradients by double_grad_check.
      c1c2633a
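
      The checker described above compares a theoretical Jacobian against one obtained by finite differences; the core idea, sketched here with plain NumPy rather than Paddle's actual gradient_checker / double_grad_check API:

      ```python
      import numpy as np

      def numerical_jacobian(f, x, eps=1e-6):
          """Central-difference Jacobian of f: R^n -> R^m at point x."""
          x = np.asarray(x, dtype=np.float64)
          m = np.asarray(f(x)).size
          jac = np.zeros((m, x.size))
          for i in range(x.size):
              xp, xm = x.copy(), x.copy()
              xp.flat[i] += eps
              xm.flat[i] -= eps
              jac[:, i] = (np.asarray(f(xp)) - np.asarray(f(xm))).ravel() / (2 * eps)
          return jac

      # Example: elementwise square, one of the double-grad ops above.
      # The theoretical Jacobian of f(x) = x**2 is diag(2 * x).
      x = np.array([0.5, -1.0, 2.0])
      j_num = numerical_jacobian(lambda v: v ** 2, x)
      j_theory = np.diag(2.0 * x)
      assert np.allclose(j_num, j_theory, atol=1e-5)
      ```

      A double-grad check applies the same comparison one level up: it treats the first backward pass as the function under test and differentiates it again.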