1. 13 Oct, 2017: 2 commits
    • Add no_grad_vars for grad_op_maker (#4770) · a36d2416
      Committed by Yu Yang
      * Add no_grad_vars for grad_op_maker
      
      * Add unittest
      
      * Fix unittest
      
      * Fix unittest
      
      * Follow comment
    • Adding the Adam Optimizer operator (#4733) · 11680037
      Committed by Abhinav Arora
      * add adam op
      
      moment1_out = beta1 * moment1 + (1 - beta1) * grad
      moment2_out = beta2 * moment2 + (1 - beta2) * grad * grad
      moment1_hat = moment1_out / (1 - beta1^t)
      moment2_hat = moment2_out / (1 - beta2^t)
      param_out = param - learning_rate * moment1_hat / (sqrt(moment2_hat) + epsilon)
      
      * fix moment 2
      
      * Adding the Adam optimization operator
      
      * Adding more tests for Adam op
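The Adam update rules in the commit message above can be sketched as a plain Python function. This is a minimal illustration of the math only, not the operator's actual C++/Python API; all names (`adam_update`, the default hyperparameters) are assumptions chosen to mirror the equations.

```python
import math

def adam_update(param, grad, moment1, moment2, t,
                learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-8):
    """One Adam step for a scalar parameter, following the commit's equations.

    t is the 1-based timestep, used for bias correction of both moments.
    Function name and defaults are illustrative, not the operator's API.
    """
    # First and second moment estimates (exponential moving averages).
    moment1_out = beta1 * moment1 + (1 - beta1) * grad
    moment2_out = beta2 * moment2 + (1 - beta2) * grad * grad
    # Bias-corrected estimates.
    moment1_hat = moment1_out / (1 - beta1 ** t)
    moment2_hat = moment2_out / (1 - beta2 ** t)
    # Parameter update.
    param_out = param - learning_rate * moment1_hat / (math.sqrt(moment2_hat) + epsilon)
    return param_out, moment1_out, moment2_out

# First step from zero-initialized moments: the bias correction makes the
# effective step roughly learning_rate * sign(grad).
p, m1, m2 = adam_update(param=1.0, grad=0.5, moment1=0.0, moment2=0.0, t=1)
```

Note that on the first step the bias correction exactly cancels the `(1 - beta)` factors, so the update size is close to the learning rate regardless of the gradient's magnitude.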
  2. 12 Oct, 2017: 8 commits
  3. 11 Oct, 2017: 18 commits
  4. 10 Oct, 2017: 12 commits