`save_inference_model` prune has a bug when the fetch_list has a variable that is both input and output of some op
Created by: Superjomn
When a variable is both the input and the output of some operator (such as `sigmoid`) and it is contained in the `fetch_list`, the `prune` API in `save_inference_model` will fail to include that operator.
A real example:

```python
y = fc(x, act='sigmoid')
# This adds a sigmoid after the fc. Because the sigmoid's input and output share
# the same variable, `prune` drops the sigmoid when the model is saved.
```
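For context, here is a minimal sketch of how the bug can be triggered, assuming the old `fluid` layers API; the shapes, sizes, and save path are placeholders, not taken from the report:

```python
import paddle.fluid as fluid

# Build a tiny network whose last op (sigmoid) reuses its input variable as output.
x = fluid.layers.data(name='x', shape=[13], dtype='float32')
y = fluid.layers.fc(input=x, size=1, act='sigmoid')

exe = fluid.Executor(fluid.CPUPlace())
exe.run(fluid.default_startup_program())

# Saving with y as the target var runs the prune step that drops the sigmoid.
fluid.io.save_inference_model(
    dirname='./inference_model',
    feeded_var_names=['x'],
    target_vars=[y],
    executor=exe)
```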
Temporary fix
Add a `scale` op to force the inclusion of the `sigmoid`-like operator:
```python
y = fc(x, act='sigmoid')
y0 = scale(y, 1)
```
That works!
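In context, the workaround could look like the following sketch, again assuming the `fluid` layers API and assuming the scaled output `y0` is the variable passed to `target_vars`:

```python
y = fluid.layers.fc(input=x, size=1, act='sigmoid')
# The extra scale op gives the fetch target a producer whose input and output
# are distinct variables, so prune keeps the whole chain, including the sigmoid.
y0 = fluid.layers.scale(y, scale=1.0)

fluid.io.save_inference_model(
    dirname='./inference_model',
    feeded_var_names=['x'],
    target_vars=[y0],  # fetch y0 instead of y
    executor=exe)
```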
Long-term fix
Delete the buggy `prune` method, save the original model, and, in the C++ inference, load that model into the IR and prune it with an SSA graph.