[oneDNN] Add bfloat16 to C-API
Created by: jczaja
It would be good to add bfloat16 inference support to PaddlePaddle. There will be no quantization, so it should not be too difficult to have an FP32 model converted to bfloat16 (oneDNN has mechanics for that) and run inference.
- Extend C-API analysis config to support bfloat16 inference (a possible usage sketch follows below)
- Check accuracy
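
A minimal sketch of how the extended analysis config might be used, assuming a new switch (here called `EnableMkldnnBfloat16()`, name hypothetical) added next to the existing `EnableMKLDNN()` option:

```cpp
// Sketch only: the bfloat16 switch below is a proposed addition, not an
// existing API. Everything else follows the current AnalysisConfig flow.
#include "paddle_inference_api.h"

int main() {
  paddle::AnalysisConfig config;
  config.SetModel("model_dir");      // path to the FP32 model
  config.EnableMKLDNN();             // existing oneDNN (MKL-DNN) switch
  config.EnableMkldnnBfloat16();     // proposed: run supported ops in bfloat16

  auto predictor = paddle::CreatePaddlePredictor(config);
  // ... feed PaddleTensor inputs, call predictor->Run(...),
  // and compare outputs against the FP32 run to check accuracy.
  return 0;
}
```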