# How to Convert a Saved Paddle Inference Model into a Deployable Paddle Serving Model

([English](./INFERENCE_TO_SERVING.md)|Simplified Chinese)

## Example

``` python
from paddle_serving_client.io import inference_model_to_serving

# Directory containing the inference model previously saved by Paddle
inference_model_dir = "your_inference_model"
# Output directories for the generated client and server configurations
serving_client_dir = "serving_client_dir"
serving_server_dir = "serving_server_dir"
# The conversion returns the feed and fetch variable names of the model
feed_var_names, fetch_var_names = inference_model_to_serving(
    inference_model_dir, serving_client_dir, serving_server_dir)
```
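After the conversion, the two output directories can be used to serve and query the model. The snippet below continues from the example above and is only a minimal sketch: it assumes the converted model is served locally with `paddle_serving_server.serve` on port 9393, and that the generated client configuration file uses the default name `serving_client_conf.prototxt`; the input data in the commented-out `predict` call is a placeholder you must replace with real model input.

``` python
from paddle_serving_client import Client

# Assumes the server was started with the converted model, e.g.:
#   python -m paddle_serving_server.serve --model serving_server_dir --port 9393

client = Client()
# Load the client-side configuration generated by the conversion
# (default file name assumed to be serving_client_conf.prototxt)
client.load_client_config("serving_client_dir/serving_client_conf.prototxt")
client.connect(["127.0.0.1:9393"])

# feed_var_names / fetch_var_names come from inference_model_to_serving above;
# replace the ellipsis with actual input data for your model before running.
# fetch_map = client.predict(feed={feed_var_names[0]: ...}, fetch=fetch_var_names)
```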