How do I find the correct fetch argument?
Created by: iceriver97
After freezing the model, I used inference_model_to_serving
to save it as a Serving model.
But what should the fetch argument be when the client calls
client.predict()?
Shouldn't it be the fetch target I specified with save_inference_model
when freezing the model? Passing that in gives me this error:
aistudio@jupyter-380628-755201:~$ python ./client.py 983884_sat.jpg
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/sklearn/externals/joblib/externals/cloudpickle/cloudpickle.py:47: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
import imp
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0907 16:32:14.215747 2107 naming_service_thread.cpp:209] brpc::policy::ListNamingService("127.0.0.1:9292"): added 1
Traceback (most recent call last):
  File "./client.py", line 18, in <module>
    fetch_map = client.predict(feed={"image": im}, fetch=["transpose_1.tmp_0"])
  File "/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle_serving_client/__init__.py", line 274, in predict
    "Fetch names should not be empty or out of saved fetch list.")
ValueError: Fetch names should not be empty or out of saved fetch list.
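For context, the check that raises this ValueError can be sketched as follows. This is a hypothetical reconstruction, not Paddle Serving's actual source, and the saved alias name below is made up; the point is that every requested fetch name must appear in the fetch list saved alongside the client configuration:

```python
# Hypothetical sketch of the validation behind the error above:
# the requested fetch names are filtered against the fetch aliases
# recorded in serving_client_conf.prototxt, and an empty result
# raises the ValueError seen in the traceback.
SAVED_FETCH_NAMES = ["softmax_0.tmp_0"]  # hypothetical saved alias

def validate_fetch(fetch, saved=SAVED_FETCH_NAMES):
    """Keep only the fetch names the saved config knows about."""
    kept = [name for name in fetch if name in saved]
    if not kept:
        raise ValueError(
            "Fetch names should not be empty or out of saved fetch list.")
    return kept
```

So if the name passed in fetch does not match any saved alias exactly, the whole list is filtered away and the error fires.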
I can't open the serving_client_conf.prototxt file. Is that where I would find the information I'm looking for? The file is attached here:
serving_client_conf.zip

Here is the code used to freeze the model:
fluid.io.save_inference_model(
    cfg.FREEZE.SAVE_DIR,
    feeded_var_names=[image.name],
    target_vars=[logit_out],
    executor=exe,
    main_program=infer_prog,
    model_filename=cfg.FREEZE.MODEL_FILENAME,
    params_filename=cfg.FREEZE.PARAMS_FILENAME)
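Note that serving_client_conf.prototxt is a plain-text protobuf file, so any text editor can open it; the fetch names the client accepts are the alias_name fields of its fetch_var blocks. A minimal sketch of pulling them out (the sample content below is hypothetical; your real file will list your model's actual aliases):

```python
import re

# Hypothetical sample of a serving_client_conf.prototxt; the real
# file is generated by inference_model_to_serving and lists the
# model's actual feed/fetch variables.
sample_conf = """\
feed_var {
  name: "image"
  alias_name: "image"
  is_lod_tensor: false
  feed_type: 1
}
fetch_var {
  name: "softmax_0.tmp_0"
  alias_name: "softmax_0.tmp_0"
  is_lod_tensor: false
  fetch_type: 1
}
"""

def fetch_aliases(conf_text):
    """Return the alias_name of every fetch_var block in the conf."""
    aliases = []
    for block in re.findall(r"fetch_var\s*\{([^}]*)\}", conf_text):
        m = re.search(r'alias_name:\s*"([^"]+)"', block)
        if m:
            aliases.append(m.group(1))
    return aliases

print(fetch_aliases(sample_conf))  # prints ['softmax_0.tmp_0']
```

Whatever aliases this yields from the real file are the only values client.predict() will accept in its fetch list.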
Should I instead pass fetch=[logit_out.name] to client.predict()?