Committed by Jiabin Yang
* support amp logic for layer_norm and softmax
* fix layer_norm amp
* fix layernorm api and dropout fp16
* fix bn, ln dtype in float16
* fix dropout fp16
* fix comment
Commit: 64076727
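The bullets above are terse, so here is a minimal sketch (not the commit's own code) of the user-facing behavior these changes target: layer_norm, dropout, and softmax running under PaddlePaddle's AMP autocast with float16. The tensor shapes, dropout rate, and reliance on `paddle.amp.auto_cast` defaults are assumptions made for illustration.

```python
import paddle

x = paddle.randn([4, 16])        # float32 input
ln = paddle.nn.LayerNorm(16)
drop = paddle.nn.Dropout(p=0.5)  # rate chosen arbitrarily for the example

# Enable automatic mixed precision; under autocast, ops with AMP logic
# (per this commit, layer_norm and softmax among them) may run in float16.
with paddle.amp.auto_cast():
    h = ln(x)                    # layer_norm participates in AMP casting
    h = drop(h)                  # dropout accepts/propagates fp16 inputs
    y = paddle.nn.functional.softmax(h, axis=-1)

# dtype is chosen by AMP: float16 on a supported GPU, typically unchanged
# (float32) when autocast is a no-op, e.g. on CPU.
print(y.dtype)
```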