1. 13 Oct 2017, 2 commits
    • Adding the Adam Optimizer operator (#4733) · 11680037
      Authored by Abhinav Arora
      * add adam op
      
      moment1_out = beta1 * moment1 + (1 - beta1) * grad
      moment2_out = beta2 * moment2 + (1 - beta2) * grad * grad
      moment1_hat = moment1_out / (1 - beta1^t)
      moment2_hat = moment2_out / (1 - beta2^t)
      param_out = param - learning_rate * moment1_hat / (sqrt(moment2_hat) + epsilon)
      
      * fix moment 2
      
      * Adding the Adam optimization operator
      
      * Adding more tests for Adam op
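
      The update rules in the commit message can be sketched as a standalone NumPy function. This is only an illustration of the math as written above, not the actual PaddlePaddle operator; the function name `adam_step` and its default hyperparameters are assumptions for the example.

      ```python
      import numpy as np

      def adam_step(param, grad, moment1, moment2, t,
                    learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-8):
          # One Adam update, following the formulas in the commit message.
          # (Hypothetical helper for illustration, not the operator's API.)
          moment1_out = beta1 * moment1 + (1 - beta1) * grad
          moment2_out = beta2 * moment2 + (1 - beta2) * grad * grad
          # Bias correction for step t (t starts at 1).
          moment1_hat = moment1_out / (1 - beta1 ** t)
          moment2_hat = moment2_out / (1 - beta2 ** t)
          param_out = param - learning_rate * moment1_hat / (np.sqrt(moment2_hat) + epsilon)
          return param_out, moment1_out, moment2_out

      # First step (t = 1): bias correction cancels the (1 - beta) factors,
      # so the effective update is close to learning_rate * sign(grad).
      p, m1, m2 = np.array([1.0]), np.zeros(1), np.zeros(1)
      p, m1, m2 = adam_step(p, grad=np.array([0.5]), moment1=m1, moment2=m2, t=1)
      ```

      At t = 1 the corrected moments reduce to `grad` and `grad * grad`, so the parameter moves by almost exactly the learning rate regardless of the gradient's magnitude.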
    • Rename Python `graph` to `framework` (#4762) · 36de3989
      Authored by fengjiayi
  2. 12 Oct 2017, 10 commits
  3. 11 Oct 2017, 15 commits
  4. 10 Oct 2017, 13 commits