【Prim】Support amp logic for layer_norm and softmax (#51473)
* support amp logic for layer_norm and softmax
* fix layer_norm amp
* fix layernorm api and dropout fp16
* fix bn, ln dtype in float16
* fix dropout fp16
* fix comment
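The usual AMP pattern for a decomposed (prim) layer_norm is to upcast a float16 input to float32 before the mean/variance reductions and cast the result back afterwards, since the reductions are numerically unstable in half precision. A minimal NumPy sketch of that idea (the function name `layer_norm_amp` and its signature are illustrative, not Paddle's actual API):

```python
import numpy as np

def layer_norm_amp(x, weight, bias, eps=1e-5):
    # Hypothetical sketch: if the input is float16, compute the
    # normalization in float32 for stability, then cast back.
    orig_dtype = x.dtype
    if orig_dtype == np.float16:
        x = x.astype(np.float32)
        weight = weight.astype(np.float32)
        bias = bias.astype(np.float32)
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    out = (x - mean) / np.sqrt(var + eps) * weight + bias
    # Restore the caller's dtype so the op is transparent under AMP.
    return out.astype(orig_dtype)

x = np.random.randn(2, 8).astype(np.float16)
w = np.ones(8, dtype=np.float16)
b = np.zeros(8, dtype=np.float16)
y = layer_norm_amp(x, w, b)
```

The same upcast-compute-downcast shape applies to softmax, whose exp/sum reduction overflows easily in float16.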