How to add MKL Packed interface
Created by: tensor-tang
We need to discuss the Python interface for using the MKL Packed implementations.
As this issue is long, I first summarize the key question (originally in Chinese):
After Intel finishes optimizing layers such as GRU (still using the MKL library, not the MKLDNN library) and the work passes testing, when wrapping the Python API, can the Python side call the optimized C++ layers by default in CPU mode (with WITH_MKL=ON enabled)?
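To make this question concrete, here is a minimal sketch of how such a default could look on the Python side, assuming the CPU/MKL build status is visible to Python. All names below (`gru_layer`, `GRULayer`, `MKLPackedGRULayer`, `WITH_MKL`) are illustrative stand-ins, not the actual PaddlePaddle API.

```python
# Hypothetical sketch only: these names are stand-ins, not the real PaddlePaddle API.

WITH_MKL = True  # stand-in for the CMake build option WITH_MKL=ON


class GRULayer:
    """Stand-in for the plain C++ GRU layer binding."""
    def __init__(self, input, size):
        self.input, self.size = input, size


class MKLPackedGRULayer(GRULayer):
    """Stand-in for the MKL Packed optimized C++ GRU layer binding."""


def gru_layer(input, size, use_gpu=False):
    """In CPU mode, default to the MKL Packed layer when the build has MKL enabled."""
    if not use_gpu and WITH_MKL:
        return MKLPackedGRULayer(input, size)
    return GRULayer(input, size)


layer = gru_layer(input="prev_layer", size=128)
print(type(layer).__name__)  # -> MKLPackedGRULayer when WITH_MKL is True
```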
Background
We plan to add MKL Packed implementations for the RNN layers (Recurrent Layer, LSTM, and GRU). Related: #6512 (closed) and #6506 (closed).
They are totally different from MKLDNN, and they will be compiled only if WITH_MKL=ON.
This raises the question of how to use them from the Python side.
Solutions
There are two options:
- Add one more flag, `use_mkl_packed`, just like `use_mkldnn`. This would make things flexible but complex: if users want to use both, they have to set both `use_mkldnn=True` and `use_mkl_packed=True`.
- Change `use_mkldnn` to `use_mkl`, and make `use_mkl` cover both. This would make users happy: just `use_mkl=True` is enough, without having to care which one to choose. But it is not flexible enough.
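For comparison, below is a rough sketch of how the two options might look from the user's point of view; the layer helpers and the flag handling are illustrative assumptions, not an existing API.

```python
# Hypothetical sketch of the two options; helper names and flag handling are illustrative only.

def gru_layer_option1(input, size, use_mkldnn=False, use_mkl_packed=False):
    """Option 1: one flag per backend, so the user picks explicitly."""
    if use_mkl_packed:
        backend = "mkl_packed"
    elif use_mkldnn:
        backend = "mkldnn"
    else:
        backend = "plain"
    return {"type": "gru", "input": input, "size": size, "backend": backend}


def gru_layer_option2(input, size, use_mkl=False):
    """Option 2: a single use_mkl flag; the framework chooses MKLDNN or MKL Packed per layer."""
    backend = "mkl_auto" if use_mkl else "plain"
    return {"type": "gru", "input": input, "size": size, "backend": backend}


# Option 1: flexible, but the user must know which flag applies to which layer.
g1 = gru_layer_option1("data", 128, use_mkl_packed=True)
# Option 2: simpler for users; just ask for MKL and let the framework decide.
g2 = gru_layer_option2("data", 128, use_mkl=True)
print(g1["backend"], g2["backend"])  # -> mkl_packed mkl_auto
```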
Which one should we choose? Any suggestions are appreciated.
@wangkuiyi @luotao1