Feeding data to a model loaded with load_inference_model
Created by: 333caowei
The input data is as follows: one input is a single id, the other is a list of ids.
query = fluid.layers.data(
    name="query",
    shape=[1],
    dtype='int64')
query_char = fluid.layers.data(
    name="query_char",
    shape=[1],
    dtype='int64',
    lod_level=1)
The model is then saved with save_inference_model as follows:
fluid.io.save_inference_model(epoch_model, ["query", "query_char"], predict, exe)
Loading the model:
[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(model_path, exe)
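For reference, load_inference_model also returns the names of the feed variables, which should match the names passed to save_inference_model above. A minimal sanity check (a sketch, assuming the variables returned above are still in scope):

print(feed_target_names)  # expected: ['query', 'query_char']
print(fetch_targets)      # the fetch variable(s) saved as predict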
At prediction time, the following problem occurs:
score = exe.run(inference_program,
                feed=data2tensor({"query": 4, "query_char": [3, 2, 5]}, place),
                fetch_list=fetch_targets,
                return_numpy=True)
def data2tensor(data, place):
    res = dict()
    res["query"] = to_lodtensor(data["query"], place)
    res["query_char"] = to_lodtensor(data["query_char"], place)
    return res

def to_lodtensor(data, place):
    res = fluid.LoDTensor()
    res.set(data, place)
    return res
At prediction time, data2tensor calls to_lodtensor to convert the data before feeding it to the model, and the following error is raised:
paddle.fluid.core.EnforceNotMet: enforce ids_dims.size() == 2 failed, 0 != 2 at [/paddle/paddle/fluid/operators/lookup_table_op.cc:43] PaddlePaddle Call Stacks:
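Based on the error message, a likely cause is that the ids reaching the lookup_table (embedding) operator are not 2-D: passing a plain Python int (4) or list ([3, 2, 5]) straight to LoDTensor.set does not produce the expected [N, 1] int64 tensor, and query_char additionally needs its LoD information set. Below is a minimal sketch of a corrected helper, assuming the fluid 1.x LoDTensor API; the name to_feed_tensor and the [[3]] sequence lengths are illustrative, not from the original post.

import numpy as np
import paddle.fluid as fluid

def to_feed_tensor(ids, place, seq_lens=None):
    # lookup_table expects a 2-D int64 tensor of shape [N, 1].
    arr = np.array(ids, dtype='int64').reshape([-1, 1])
    res = fluid.LoDTensor()
    res.set(arr, place)
    if seq_lens is not None:
        # e.g. [[3]] for a single sequence of length 3; on older versions,
        # res.set_lod([[0, 3]]) may be required instead.
        res.set_recursive_sequence_lengths(seq_lens)
    return res

feed = {
    "query": to_feed_tensor([4], place),
    "query_char": to_feed_tensor([3, 2, 5], place, seq_lens=[[3]]),
}
score = exe.run(inference_program, feed=feed,
                fetch_list=fetch_targets, return_numpy=True)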