    d708b210 · Zhen Wang committed
    Update amp_check_finite_and_scale_op and add an update_loss_scaling op for static graph amp training. (#26240)
    
    * update amp_check_finite_and_scale_op for static-graph AMP.
    
    * use amp_check_finite_and_scale in static-graph AMP.
    
    * set grads to zero when they contain infinite values (in the amp_check_finite_and_scale op); a sketch of this contract follows the list.
    
    * add an update_loss_scaling op in C++ (the dynamic scaling rule is sketched after this list).
    
    * add update_loss_scaling_op unit test.
    
    * update the doc of the check_finite_and_unscale op.
    
    * skip the gradient update entirely when the gradients contain infinite values (see the combined step sketched after this list).
    
    * update the way to zero grads.
    
    * update test_update_loss_scaling_op.py
    
    * add log info when infinite grads are found.
    
    * add a unit test for the UpdateLossScaling layer.
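
The zero-on-overflow contract in the bullets above fits in a few lines. Below is a minimal NumPy sketch of that contract, assuming plain arrays; the helper name and signature are illustrative, not Paddle's C++ operator interface.

```python
# Sketch of the check_finite_and_unscale contract; illustrative helper,
# NOT Paddle's API. Assumes grads is a list of NumPy arrays.
import numpy as np

def check_finite_and_unscale(grads, loss_scaling):
    """Unscale grads by 1/loss_scaling and flag non-finite values."""
    found_inf = False
    unscaled = []
    for g in grads:
        u = g / loss_scaling
        if not np.all(np.isfinite(u)):
            found_inf = True
        unscaled.append(u)
    if found_inf:
        # Per the commit: zero all grads so inf/NaN values can never
        # reach a parameter update.
        unscaled = [np.zeros_like(u) for u in unscaled]
    return unscaled, found_inf
```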
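The update_loss_scaling op follows the usual dynamic loss-scaling recipe: grow the scale after a run of overflow-free steps, shrink it when overflows appear. Here is a pure-Python sketch of that rule; the parameter names and defaults (incr_every_n_steps, decr_every_n_nan_or_inf, incr_ratio, decr_ratio) follow the common recipe and are assumptions, not a confirmed op signature.

```python
# Dynamic loss-scaling update rule; names and defaults are assumptions.
def update_loss_scaling(loss_scaling, num_good_steps, num_bad_steps,
                        found_inf, incr_every_n_steps=1000,
                        decr_every_n_nan_or_inf=2,
                        incr_ratio=2.0, decr_ratio=0.5):
    if found_inf:
        # Overflow: break the good-step streak; shrink after enough
        # consecutive bad steps, never below 1.0.
        num_good_steps = 0
        num_bad_steps += 1
        if num_bad_steps >= decr_every_n_nan_or_inf:
            loss_scaling = max(loss_scaling * decr_ratio, 1.0)
            num_bad_steps = 0
    else:
        # Clean step: break the bad-step streak; grow after enough
        # consecutive good steps.
        num_bad_steps = 0
        num_good_steps += 1
        if num_good_steps >= incr_every_n_steps:
            loss_scaling *= incr_ratio
            num_good_steps = 0
    return loss_scaling, num_good_steps, num_bad_steps
```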
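Putting the pieces together, the skip-on-overflow flow described in the bullets looks roughly like the hypothetical step below, which reuses the two helpers sketched above; amp_sgd_step, the state dict, and the print-based logging are illustrative assumptions, not the commit's code.

```python
# Hypothetical AMP training step tying the two sketches together.
def amp_sgd_step(params, scaled_grads, state, lr=0.01):
    grads, found_inf = check_finite_and_unscale(scaled_grads,
                                                state["loss_scaling"])
    if found_inf:
        # Grads are already zeroed; skip the update and log the event.
        print("Found infinite gradients; skipping this update.")
    else:
        params = [p - lr * g for p, g in zip(params, grads)]
    # Adjust the scale for the next step regardless of the outcome.
    (state["loss_scaling"],
     state["num_good_steps"],
     state["num_bad_steps"]) = update_loss_scaling(
         state["loss_scaling"], state["num_good_steps"],
         state["num_bad_steps"], found_inf)
    return params
```

With the defaults above, a burst of overflows quickly halves the scale, while a long overflow-free run doubles it every incr_every_n_steps steps, so the scale settles near the largest value the FP16 gradients can tolerate.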