Is it possible to grab weights from another operator using Transpiler?
Created by: wojtuss
I'm working on fusing the FC and GRU layers into the MKL-DNN GRU op for inference. Using `inference_transpiler.py`, is it possible to grab the weights from the FC's `mul` op and "add" them as a weight input to the MKL-DNN-based GRU op?
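Roughly, by "grab" I mean something like the following at the op-desc level (the `'Y'` input slot for the FC weight and the `set_input` call are only my guesses at the right API, just to illustrate the question):

```python
# Hypothetical wiring, for illustration only:
fc_weight_name = mul_op.input('Y')[0]                # the FC weight parameter of the mul op
gru_op.desc.set_input('WeightX', [fc_weight_name])   # feed it to the GRU op as WeightX
```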
The situation is as follows:
- in the `GRUOpMaker` there is a new input `WeightX` added (`AsDispensable()`),
- there is no `WeightX` input passed via `dynamic_gru` in `nn.py`, so the ordinary GRU still works,
- a new `GRUMKLDNNKernel` is created, which requires the `WeightX` tensor; it is called only when inferencing with MKL-DNN,
- in `inference_transpiler.py` I am fusing the `mul` + `elementwise_add` + `gru` ops, but this requires grabbing the `mul` op's weights into the GRU's `WeightX` input.

How can I achieve that? From the code of the transpiler, I am not really sure how to do it (a rough sketch of what I am attempting is below).
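To make the question concrete, this is roughly the fusion routine I am trying to write inside the transpiler. It is only a sketch: the helper names (`_insert_op`, `_remove_op`, `_remove_unused_var`), the assumption that the FC weight is the `mul` op's `Y` input, and the way the fused op's inputs/attrs are copied are all unverified guesses based on the existing fusion passes, and the FC bias coming from `elementwise_add` is ignored for brevity.

```python
def _fuse_fc_gru(self):
    # Sketch only: scan for the mul -> elementwise_add -> gru pattern and
    # replace it with a single gru op that also receives WeightX.
    i = 0
    while i < len(self.block.ops) - 2:
        mul_op = self.block.ops[i]
        add_op = self.block.ops[i + 1]
        gru_op = self.block.ops[i + 2]
        if mul_op.type == 'mul' and add_op.type == 'elementwise_add' \
                and gru_op.type == 'gru':
            # "Grab" the FC weight: assume it is the mul op's 'Y' input.
            fc_weight_name = mul_op.input('Y')[0]

            # Copy the original gru inputs/outputs/attrs, rewire 'Input' to
            # the raw FC input and attach the FC weight as 'WeightX'.
            inputs = {name: gru_op.input(name) for name in gru_op.input_names}
            inputs['Input'] = mul_op.input('X')
            inputs['WeightX'] = [fc_weight_name]
            outputs = {name: gru_op.output(name)
                       for name in gru_op.output_names}
            attrs = {name: gru_op.attr(name) for name in gru_op.attr_names}
            # Assuming the MKL-DNN kernel is selected via a use_mkldnn attr.
            attrs['use_mkldnn'] = True

            # Insert the fused op and drop mul, elementwise_add and the
            # original gru (FC bias handling omitted here).
            self.block._insert_op(i, type='gru', inputs=inputs,
                                  outputs=outputs, attrs=attrs)
            for _ in range(3):
                self.block._remove_op(i + 1)
        i += 1
    # The existing passes seem to prune dangling variables afterwards.
    self._remove_unused_var()
```

Is something along these lines the intended way to move a weight from one op to another in the transpiler, or is there a better mechanism for it?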
I know the "Pass" is comming to facilitate inference fusing, but until then I want to familiarize myself to and do the fusing using the transpiler.