1. 20 June 2019, 1 commit
    • [cherry-pick] Update backward appending strategy to support double backward. (#18216) · a839f724
      Committed by qingqing01
      * Update the backward appending strategy to support double backward and fix some bugs. (#18104)
      
      * Update backward.py:
           - If there is no input grad var among all the outputs of the previous ops, do not append this op to the graph (a sketch of this pruning rule follows this entry).
           - Only apply this strategy when computing double backward.
      * Update some double backward ops.
      * Update sum_op to judge whether a tensor is empty by numel() or IsInitialized().
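      A minimal Python sketch of the pruning rule described above, assuming hypothetical names (GradOp, prune_grad_ops, needed_grad_vars). It is not Paddle's backward.py, only an illustration of skipping grad ops whose outputs contain no needed grad var.

      # Illustration only: hypothetical stand-in for the backward.py pruning rule.
      from dataclasses import dataclass
      from typing import List, Set

      @dataclass
      class GradOp:                      # hypothetical, not a Paddle class
          name: str
          inputs: List[str]
          outputs: List[str]

      def prune_grad_ops(grad_ops: List[GradOp], needed_grad_vars: Set[str]) -> List[GradOp]:
          """Walk the grad ops from last to first; keep an op only if at least
          one of its outputs is still needed, otherwise skip it."""
          kept, needed = [], set(needed_grad_vars)
          for op in reversed(grad_ops):
              if any(out in needed for out in op.outputs):
                  kept.append(op)
                  needed.update(op.inputs)   # its inputs become needed in turn
              # else: none of its outputs is a needed grad var, so do not append it
          return list(reversed(kept))

      ops = [
          GradOp("relu_double_grad", ["DDX"], ["DDOut"]),
          GradOp("scale_grad", ["Tmp"], ["Unused@GRAD"]),
      ]
      # Only the op that actually produces the needed grad var survives.
      print([op.name for op in prune_grad_ops(ops, {"DDOut"})])  # ['relu_double_grad']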
  2. 14 May 2019, 1 commit
  3. 10 May 2019, 1 commit
    • Double backward of conv2d. (#17211) · e32c9888
      Committed by qingqing01
      * Add conv2d_grad_grad_op.
      * Extract the cuDNN conv algorithm searching code into conv_cudnn_helper.h (a sketch of the caching idea follows this entry).
          - It is now used in conv2d_grad_grad.
          - The searching code in conv2d and conv2d_grad will be simplified in the next PR.
      * Enhance and fix a bug in the unit testing of gradient_checker.
      * Support fetching empty variables; return None in Python.
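      A conceptual Python sketch of the "extract the algorithm search into a helper" idea: compute the best algorithm once per convolution configuration and cache it for reuse by conv2d, conv2d_grad and conv2d_grad_grad. This is not conv_cudnn_helper.h (which is C++/cuDNN); all names below are hypothetical.

      # Illustration only: caching an algorithm choice per conv configuration.
      from functools import lru_cache
      from typing import Tuple

      @lru_cache(maxsize=None)
      def search_conv_algo(input_shape: Tuple[int, ...],
                           filter_shape: Tuple[int, ...],
                           strides: Tuple[int, int],
                           paddings: Tuple[int, int]) -> str:
          """Pretend to benchmark candidate algorithms and cache the winner,
          keyed on the full convolution configuration (hypothetical helper)."""
          candidates = ["implicit_gemm", "winograd", "fft"]
          # A real helper would time each candidate via cuDNN; here we just pick one.
          return min(candidates, key=len)

      cfg = ((8, 3, 32, 32), (16, 3, 3, 3), (1, 1), (1, 1))
      first = search_conv_algo(*cfg)      # runs the (mock) search
      second = search_conv_algo(*cfg)     # served from the cache
      assert first == second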
  4. 23 April 2019, 1 commit
    • Support backward of backward for Relu and add a new gradient checker by comparing theoretical and numerical Jacobian. (#16862) · c1c2633a
      Committed by qingqing01
      
      * Support backward of backward and add a new gradient checker.
      * Rename decorators.py to decorator_helper.py, since Python on Windows CI already has a decorators package.

      1. Add ReluDoubleGradMaker when registering relu_grad.
      2. Add a new gradient checker that compares the theoretical and numerical Jacobian; double gradients are checked by double_grad_check (a minimal sketch of the idea follows this entry).
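      A minimal NumPy sketch of the idea behind the new checker: build a numerical Jacobian by central finite differences and compare it with the theoretical one. This illustrates the method only; it is not Paddle's gradient_checker or double_grad_check.

      import numpy as np

      def numerical_jacobian(f, x, eps=1e-5):
          """Jacobian of f at x via central differences; f maps a 1-D x to a 1-D y."""
          y = f(x)
          jac = np.zeros((y.size, x.size))
          for j in range(x.size):
              step = np.zeros_like(x)
              step[j] = eps
              jac[:, j] = (f(x + step) - f(x - step)) / (2.0 * eps)
          return jac

      # Example: relu, whose theoretical Jacobian is diag(x > 0) away from x == 0.
      relu = lambda x: np.maximum(x, 0.0)
      x = np.array([-1.5, 0.3, 2.0])
      theoretical = np.diag((x > 0).astype(np.float64))
      numerical = numerical_jacobian(relu, x)
      assert np.allclose(theoretical, numerical, atol=1e-4)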