1. 27 Oct 2017 (1 commit)
    • Gradient check use graph (#5027) · be00b0c4
      Committed by Yu Yang
* Simplify Gradient Check
      
      * Stash
      
      * Extract apply_backward_pass to backward.py
      
      Rename apply_backward_pass to append_backward_ops
      
      * Use graph API to check gradient
      
* Fix CI
      
      * Fix CI
      
      * Fix backward for double precision
      
      * Stash
      
      * Fix CI
      
* Fix CI
      
      * Ignore GRU test
      
      * Ignore xe op
      
      * Fix CI
      
      * Fix softmax with xe gradient
      
      The correct equation should be IG = OG * (d_softmax_with_xe())
      
      * Fix typo
      
      * Fix merge error
      
      * Disable LRN
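The softmax-with-cross-entropy fix above states that the correct input gradient is IG = OG * (d_softmax_with_xe()). As a hedged sketch of what that means (the function names here are illustrative, not Paddle's actual operator code): for the fused loss L = -sum(label * log(softmax(logits))), the gradient with respect to the logits is softmax(logits) - label, scaled by the upstream output gradient. A finite-difference check, in the spirit of the gradient-check work in this commit, confirms it:

```python
import numpy as np

np.random.seed(0)

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def softmax_with_xe_grad(logits, label_onehot, out_grad):
    # IG = OG * d_softmax_with_xe(): for the fused softmax + cross-entropy
    # loss, d(loss)/d(logits) = softmax(logits) - label, scaled by the
    # upstream output gradient OG. (Illustrative name, not Paddle's API.)
    return out_grad[..., None] * (softmax(logits) - label_onehot)

logits = np.random.randn(3, 5)
label = np.eye(5)[np.random.randint(0, 5, size=3)]

def loss(z):
    return -np.sum(label * np.log(softmax(z)), axis=-1)

og = np.ones(3)  # upstream gradient of 1 for each example's loss
ig = softmax_with_xe_grad(logits, label, og)

# Central finite differences as a numeric gradient check.
eps = 1e-6
num = np.zeros_like(logits)
for i in range(logits.shape[0]):
    for j in range(logits.shape[1]):
        zp = logits.copy(); zp[i, j] += eps
        zm = logits.copy(); zm[i, j] -= eps
        num[i, j] = (loss(zp)[i] - loss(zm)[i]) / (2 * eps)

assert np.allclose(ig, num, atol=1e-4)
```

The analytic and numeric gradients agree to within the finite-difference tolerance, which is exactly the kind of consistency the graph-based gradient check in this commit verifies.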
  2. 11 Oct 2017 (1 commit)
  3. 06 Oct 2017 (1 commit)
    • Adding Adadelta optimization operator (#4576) · 828c5b3e
      Committed by Abhinav Arora
      * Adding Adadelta optimization operator
      * Making inputs and outputs conform to naming convention
      * Removing type alias from header files
      * Fixing Adadelta documentation in comments
      * Addressing code review feedback
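For context on what the Adadelta operator computes: a minimal NumPy sketch of one Adadelta update step, following Zeiler's published update rule. The names (adadelta_update, avg_sq_grad, avg_sq_update) are illustrative and do not claim to match the operator's actual input/output names, which the commit explicitly renames to follow Paddle's convention:

```python
import numpy as np

def adadelta_update(param, grad, avg_sq_grad, avg_sq_update,
                    rho=0.95, eps=1e-6):
    # One Adadelta step: accumulate a decaying average of squared
    # gradients, scale the step by the ratio of RMS(update)/RMS(grad),
    # then accumulate a decaying average of squared updates.
    # (Illustrative sketch, not the operator's actual implementation.)
    avg_sq_grad = rho * avg_sq_grad + (1 - rho) * grad ** 2
    update = -np.sqrt(avg_sq_update + eps) / np.sqrt(avg_sq_grad + eps) * grad
    avg_sq_update = rho * avg_sq_update + (1 - rho) * update ** 2
    return param + update, avg_sq_grad, avg_sq_update

w = np.array([1.0, -2.0])
g = np.array([0.1, 0.3])
state_g = np.zeros_like(w)
state_u = np.zeros_like(w)
w, state_g, state_u = adadelta_update(w, g, state_g, state_u)
```

Note that Adadelta needs no global learning rate: the step size adapts per parameter from the two running averages, which is why the operator carries both accumulator states as inputs and outputs.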
  4. 07 Aug 2017 (1 commit)
  5. 04 Aug 2017 (1 commit)
  6. 31 Jul 2017 (1 commit)
  7. 25 Jul 2017 (1 commit)
  8. 19 Jul 2017 (1 commit)