[AMP OP&Test] Add fp16 and bf16 tests to activation ops (#52521)
* adjust default tolerance of output and grad
* fix a bug in the grad of OpTest
* fix the type of the default-value setting in OpTest, both forward and backward
* add default
* fix test_sum_op
* adjust tolerance
* fix the tolerance of eager mode
* add bf16 and fp16 to the activation tests (see the sketch after this list)
* remove some fixes
* fix activation
* fix fp16
* fix gelu
* fix the activation tests
* add bfloat16 specialization to SinGrad and CosGrad
* fix bugs
* fix bugs
* add unittest
* add skip
* add fp16/bf16 to rrelu/rrelu_grad
* git add rrelu
* fix bugs
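
Since the commits above add fp16 and bf16 coverage to the activation op tests, here is a minimal sketch of what one such test might look like, assuming Paddle's OpTest conventions (`OpTest`, `convert_float_to_uint16`, `check_output_with_place`). The choice of the `sin` op, the input shape, and the tolerances are illustrative assumptions, not the PR's actual values; the real tests live in `test_activation_op.py`.

```python
import unittest

import numpy as np

from paddle.fluid import core
from paddle.fluid.tests.unittests.op_test import OpTest, convert_float_to_uint16


class TestSinFP16(OpTest):
    def setUp(self):
        self.op_type = "sin"
        self.dtype = np.float16
        x = np.random.uniform(-1, 1, [11, 17]).astype(self.dtype)
        self.inputs = {'X': x}
        self.outputs = {'Out': np.sin(x)}

    def test_check_output(self):
        # fp16 needs a looser tolerance than the fp32 default
        self.check_output(atol=1e-3)

    def test_check_grad(self):
        self.check_grad(['X'], 'Out', max_relative_error=0.01)


@unittest.skipIf(
    not core.is_compiled_with_cuda(), "bfloat16 kernels require CUDA"
)
class TestSinBF16(OpTest):
    def setUp(self):
        self.op_type = "sin"
        # bf16 tensors are carried as uint16 in OpTest; reference values
        # are computed in fp32 and then converted
        self.dtype = np.uint16
        x = np.random.uniform(-1, 1, [11, 17]).astype(np.float32)
        self.inputs = {'X': convert_float_to_uint16(x)}
        self.outputs = {'Out': convert_float_to_uint16(np.sin(x))}

    def test_check_output(self):
        place = core.CUDAPlace(0)
        self.check_output_with_place(place)

    def test_check_grad(self):
        place = core.CUDAPlace(0)
        self.check_grad_with_place(place, ['X'], 'Out')
```

The `skipIf` decorator mirrors the "add skip" commit: bf16 kernels are only exercised on CUDA builds, so the test is skipped elsewhere rather than failing.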