    [AMP OP&Test] Add fp16 and bf16 test to activation (#52521) · 6bd5fd75
    Committed by Vvsmile
    * adjust default tolerance of output and grad (see the fp16/bf16 sketches after this list)
    
    * fix a bug in the grad check of OpTest
    
    * fix the way default values are set in OpTest, for both forward and backward checks
    
    * add defaults
    
    * fix test_sum_op
    
    * adjust tolerance
    
    * fix the tolerance for eager mode
    
    * add bf16 and fp16 to the activation tests (sketched below, after this list)
    
    * remove some fixes
    
    * fix activation
    
    * fix fp16
    
    * fix gelu
    
    * fix the activation tests
    
    * add bfloat16 specialization to singrad and cosgrad
    
    * fix bugs
    
    * fix bugs
    
    * add unittest
    
    * add skip for devices without bf16 support (illustrated in the bf16 sketch below)
    
    * add fp16/bf16 to rrelu/rrelu_grad
    
    * git add rrelu
    
    * fix bugs
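
As a rough illustration of the fp16 side of these commits, here is a minimal sketch of a float16 activation test in the Paddle OpTest style, with tolerances loosened relative to fp32. The choice of op ("sin"), the shapes, and the atol/max_relative_error values are illustrative assumptions, not the exact values used in this PR.

```python
import unittest

import numpy as np

from eager_op_test import OpTest  # Paddle's test harness (eager_op_test.py)


class TestSinFP16(OpTest):
    """Illustrative fp16 variant of an activation test (op and values assumed)."""

    def setUp(self):
        self.op_type = "sin"
        self.dtype = np.float16
        x = np.random.uniform(-1, 1, [11, 17]).astype(self.dtype)
        self.inputs = {'X': x}
        self.outputs = {'Out': np.sin(x)}

    def test_check_output(self):
        # fp16 needs a looser absolute tolerance than the fp32 default.
        self.check_output(atol=1e-3)

    def test_check_grad(self):
        # Numeric gradient checking in fp16 also needs a looser
        # relative-error bound.
        self.check_grad(['X'], 'Out', max_relative_error=1e-2)


if __name__ == '__main__':
    unittest.main()
```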
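A companion bf16 sketch, showing the skip guard for builds or devices without bfloat16 support and the convert_float_to_uint16 helper the harness uses to carry bf16 data as uint16. Again the op and shapes are assumptions, and the exact guard and helper names can vary between Paddle versions.

```python
import unittest

import numpy as np

from paddle.fluid import core
from eager_op_test import OpTest, convert_float_to_uint16


@unittest.skipIf(
    not core.is_compiled_with_cuda()
    or not core.is_bfloat16_supported(core.CUDAPlace(0)),
    "core is not compiled with CUDA or device does not support bfloat16",
)
class TestSinBF16(OpTest):
    """Illustrative bf16 variant; bf16 tensors travel as uint16 in OpTest."""

    def setUp(self):
        self.op_type = "sin"
        self.dtype = np.uint16
        x = np.random.uniform(-1, 1, [11, 17]).astype(np.float32)
        # Reference data is produced in fp32, then packed into bf16 (uint16).
        self.inputs = {'X': convert_float_to_uint16(x)}
        self.outputs = {'Out': convert_float_to_uint16(np.sin(x))}

    def test_check_output(self):
        self.check_output_with_place(core.CUDAPlace(0))

    def test_check_grad(self):
        self.check_grad_with_place(core.CUDAPlace(0), ['X'], 'Out')


if __name__ == '__main__':
    unittest.main()
```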
Changed file: eager_op_test.py (105.9 KB)
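Finally, on the "default tolerance" commits: one plausible reading is a dtype-keyed table of default forward (atol) and backward (max_relative_error) tolerances inside the harness. The helper below is purely hypothetical; its name and every value are invented for illustration and are not Paddle's actual defaults.

```python
import numpy as np

# Hypothetical dtype-keyed defaults for forward (atol) and backward
# (max_relative_error) checks; values invented for illustration only.
_DEFAULT_TOL = {
    np.float64: {'atol': 1e-7, 'max_relative_error': 1e-7},
    np.float32: {'atol': 1e-5, 'max_relative_error': 5e-3},
    np.float16: {'atol': 1e-3, 'max_relative_error': 1e-2},
    np.uint16: {'atol': 1e-2, 'max_relative_error': 1e-2},  # bf16 as uint16
}


def default_tolerance(dtype, kind):
    """Look up a default tolerance for a dtype.

    kind: 'atol' (forward check) or 'max_relative_error' (backward check).
    """
    return _DEFAULT_TOL[np.dtype(dtype).type][kind]


assert default_tolerance(np.float16, 'atol') == 1e-3
```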