Created by: kexinzhao
fix #9266 (closed)
I originally planned to add a single line in activation_op.cu to register fp16 forward kernels for all activation ops. However, this triggered a flood of Eigen error messages that would require extensive modifications to float16.h. Given the limited time, I am temporarily putting that plan on hold and simply adding fp16 support for the relu op in this PR.