Created by: kexinzhao
Previously, we only added float16 support to the relu op. We also want to add float16 support to other activation ops, e.g., tanh, which is used in our example code.
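
For reference, a minimal sketch of what the CUDA kernel registration for tanh might look like, assuming it follows the same pattern as the earlier relu float16 change (the `ActivationKernel`/`TanhFunctor` names follow the existing `activation_op.cu` conventions; the exact file layout may differ):

```cpp
// activation_op.cu (sketch): register a float16 tanh kernel alongside the
// existing float/double kernels, mirroring the relu float16 registration.
namespace ops = paddle::operators;
namespace plat = paddle::platform;

REGISTER_OP_CUDA_KERNEL(
    tanh,
    ops::ActivationKernel<plat::CUDADeviceContext, ops::TanhFunctor<float>>,
    ops::ActivationKernel<plat::CUDADeviceContext, ops::TanhFunctor<double>>,
    // New: float16 instantiation, assuming TanhFunctor compiles for
    // plat::float16 the way ReluFunctor does.
    ops::ActivationKernel<plat::CUDADeviceContext,
                          ops::TanhFunctor<plat::float16>>);
```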