1. 14 September 2020 — 3 commits
    • Update amp_check_finite_and_scale_op and add an updating_loss_scaling op for static graph amp training. (#26240) · d708b210
      Committed by Zhen Wang
      
      * update amp_check_finite_and_scale_op for static_amp.
      
      * use amp_check_finite_and_scale in static graph amp.
      
      * zero the grads when they contain infinite values (as in the amp_check_finite_and_scale op).
      
      * add update_loss_scaling op in cpp.
      
      * add update_loss_scaling_op unit test.
      
      * update the doc of the check_finite_and_unscale op
      
      * skip the gradient update step when the gradients contain infinite values.
      
      * update the way to zero grads.
      
      * update test_update_loss_scaling_op.py
      
      * add log info when find infinite grads.
      
      * add the unit test for UpdateLossScaling Layer.
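The commit above describes the core of dynamic loss scaling for AMP training: if any gradient is non-finite, zero the grads and skip the parameter update while shrinking the loss scale; after enough consecutive clean steps, grow the scale again. A minimal sketch of that logic is below — the function name, parameters, and default ratios are hypothetical illustrations, not Paddle's actual API.

```python
import math

def update_loss_scaling(grads, loss_scale, good_steps,
                        incr_every_n=1000, incr_ratio=2.0, decr_ratio=0.5):
    """Hypothetical sketch of dynamic loss scaling.

    Returns (grads, new_loss_scale, new_good_steps, found_inf).
    """
    # Check whether any gradient overflowed (inf/nan) under the current scale.
    found_inf = any(not math.isfinite(g) for g in grads)
    if found_inf:
        grads = [0.0 for _ in grads]   # zero grads so the update is a no-op
        loss_scale *= decr_ratio       # shrink the scale after an overflow
        good_steps = 0
    else:
        good_steps += 1
        if good_steps >= incr_every_n: # grow the scale after N clean steps
            loss_scale *= incr_ratio
            good_steps = 0
    return grads, loss_scale, good_steps, found_inf
```

In a static-graph setting this logic runs as an op inside the graph each step, which is why the commit adds it "in cpp" rather than as Python-side control flow.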
    • remove auto mode from localsgd optimizer (#27237) · 2b6a5793
      Committed by ShenLiang
      * rm auto from localsgd
    • Add int8 GRU kernel (#27220) · cc3f4b81
      Committed by Adam
      * Add int8 GRU kernel with UTs
      
      * Lint fixes
      
      * More lint fixes
  2. 11 September 2020 — 8 commits
  3. 10 September 2020 — 11 commits
  4. 09 September 2020 — 7 commits
  5. 08 September 2020 — 11 commits