Commit a511b1ad authored by WangXi

Change io.convert to convert

Parent b4d531a0
@@ -38,9 +38,9 @@ If you have saved model files using Paddle's `save_inference_model` API, you can
import paddle_serving_client.io as serving_io
serving_io.inference_model_to_serving(dirname, serving_server="serving_server", serving_client="serving_client", model_filename=None, params_filename=None )
```
Or you can use a built-in python module called `paddle_serving_client.io.convert` to convert it.
Or you can use a built-in python module called `paddle_serving_client.convert` to convert it.
```shell
python -m paddle_serving_client.io.convert --dirname ./your_inference_model_dir
python -m paddle_serving_client.convert --dirname ./your_inference_model_dir
```
Arguments are the same as `inference_model_to_serving` API.
| Argument | Type | Default | Description |
......
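Since the converter exposes the same flags as the `inference_model_to_serving` API, its command line can be sketched with `argparse`. The parser below is a minimal illustration built from the argument names and defaults in the API signature above; it is not the module's actual source, and the help strings are assumptions:

```python
import argparse

def parse_args():
    """Sketch of the converter's argument parser (illustration only)."""
    parser = argparse.ArgumentParser("convert")
    parser.add_argument("--dirname", type=str, required=True,
                        help="path of the saved inference model")
    parser.add_argument("--serving_server", type=str, default="serving_server",
                        help="output directory for the server-side model and config")
    parser.add_argument("--serving_client", type=str, default="serving_client",
                        help="output directory for the client-side config")
    parser.add_argument("--model_filename", type=str, default=None,
                        help="name of the file storing the model description")
    parser.add_argument("--params_filename", type=str, default=None,
                        help="name of the merged parameter file, if parameters are combined")
    return parser

# Example: the same flags the CLI invocation above would receive.
args = parse_args().parse_args(["--dirname", "./your_inference_model_dir"])
```

Unspecified flags fall back to the defaults shown in the API call, so converting a model with default file names needs only `--dirname`.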
@@ -39,9 +39,9 @@ for line in sys.stdin:
import paddle_serving_client.io as serving_io
serving_io.inference_model_to_serving(dirname, serving_server="serving_server", serving_client="serving_client", model_filename=None, params_filename=None)
```
Or you can use the built-in module `paddle_serving_client.io.convert` provided by Paddle Serving to convert it.
Or you can use the built-in module `paddle_serving_client.convert` provided by Paddle Serving to convert it.
```shell
python -m paddle_serving_client.io.convert --dirname ./your_inference_model_dir
python -m paddle_serving_client.convert --dirname ./your_inference_model_dir
```
The module's arguments are the same as those of the `inference_model_to_serving` API.
| Argument | Type | Default | Description |
......
@@ -15,10 +15,10 @@
Usage:
Convert a paddle inference model into a model file that can be used for Paddle Serving.
Example:
python -m paddle_serving_client.io.convert --dirname ./inference_model
python -m paddle_serving_client.convert --dirname ./inference_model
"""
import argparse
from . import inference_model_to_serving
from .io import inference_model_to_serving
def parse_args(): # pylint: disable=doc-string-missing
......