[AMP] add fp16&bf16 support for flatten op (#52035)
* [AMP] add fp16&bf16 support for flatten op
* fix CI bug
* fix bug where the input should be cast (`astype`) to `self.dtype`, and fix zero-dim test name
* remove 0-D tensor bf16 test so the Windows inference CI passes
* remove flatten from op_accuracy_white_list
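As a minimal sketch of why this change is safe: flatten is a pure layout op, so it preserves the input dtype and introduces no numeric error, which is why fp16 and bf16 can be enabled for it under AMP. The illustration below uses NumPy (which has float16 but no bfloat16) rather than Paddle itself, and is an assumption about the dtype-preservation property rather than the PR's actual implementation.

```python
import numpy as np

# flatten only rearranges the memory layout; it must keep the
# input dtype unchanged, so low-precision types are safe for it.
x = np.random.rand(2, 3, 4).astype(np.float16)

# flatten all dimensions after axis 0 (analogous to a flatten op
# with start_axis=1)
y = x.reshape(x.shape[0], -1)

assert y.dtype == np.float16   # dtype preserved through flatten
assert y.shape == (2, 12)      # 3 * 4 inner elements flattened
```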