Error when converting the PaddleNLP LAC tokenization baseline model to a Paddle Serving inference model
Created by: levinxo
Saving the model:

```shell
# baseline model
export PYTHONIOENCODING=UTF-8  # the model output is Unicode; Python 2 tends to error without this setting
python3.7 inference_model.py \
    --init_checkpoint ./model_baseline \
    --inference_save_dir ./inference_model
```
Converting it to a Serving inference model:

```python
import paddle_serving_client.io as serving_io

serving_io.inference_model_to_serving(
    './inference_model',
    serving_server="serving_server",
    serving_client="serving_client",
    model_filename='model.pdmodel',
    params_filename='params.pdparams')
```
The generated serving_server_conf.prototxt is wrong, so requests return `{"result":"Request Value Error"}`:
```
feed_var {
  name: "words"
  alias_name: "words"
  is_lod_tensor: true
  feed_type: 0
  shape: -1
}
fetch_var {
  name: "crf_decoding_0.tmp_0"
  alias_name: "crf_decoding_0.tmp_0"
  is_lod_tensor: true
  fetch_type: 1  <--- this should be 0 (0 means int, 1 means float)
  shape: -1
}
```