Commit c00eac1f authored by Z zhangjun

update doc

Parent 0cff583a
...@@ -2,11 +2,17 @@
## Prepare
### download model and extract
```
wget https://paddle-inference-dist.cdn.bcebos.com/PaddleLite/models_and_data_for_unittests/bert_base_chinese.tar.gz
tar zxvf bert_base_chinese.tar.gz
```
### convert model
```
python3 -m paddle_serving_client.convert --dirname bert_base_chinese --model_filename bert_base_chinese/model.pdmodel --params_filename bert_base_chinese/model.pdiparams
```
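The conversion can also be run from Python instead of the CLI. The sketch below is a minimal example, not part of this repository: it assumes the `bert_base_chinese` directory extracted above and that `paddle_serving_client.io.inference_model_to_serving` in the installed version accepts these arguments and returns the model's feed/fetch variable names.
```
# Minimal conversion sketch (assumption: the installed paddle_serving_client
# exposes inference_model_to_serving with this signature).
from paddle_serving_client.io import inference_model_to_serving

feed_names, fetch_names = inference_model_to_serving(
    dirname="bert_base_chinese",
    serving_server="serving_server",   # server-side model and config output dir
    serving_client="serving_client",   # client-side config output dir
    model_filename="model.pdmodel",
    params_filename="model.pdiparams")
print(feed_names, fetch_names)         # feed/fetch variable names of the converted model
```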
### or, you can get the serving saved model directly
```
wget https://paddle-serving.bj.bcebos.com/models/xpu/bert.tar.gz
tar zxvf bert.tar.gz
```
### Getting Dict and Sample Dataset
...@@ -20,11 +26,11 @@ this script will download Chinese Dictionary File vocab.txt and Chinese Sample D
### Start Service
```
python3 bert_web_service.py serving_server 7703
```
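After the web service starts, it can also be queried over HTTP. The snippet below is a hypothetical request sketch: the endpoint path (`/bert/prediction`), the feed key (`words`) and the fetch name (`pooled_output`) are assumptions and should be checked against `bert_web_service.py` and the generated `serving_client_conf.prototxt`.
```
# Hypothetical HTTP request sketch; endpoint path and feed/fetch names are
# assumptions, not taken from bert_web_service.py.
import json
import requests

url = "http://127.0.0.1:7703/bert/prediction"
payload = {"feed": [{"words": "hello"}], "fetch": ["pooled_output"]}
resp = requests.post(url, data=json.dumps(payload),
                     headers={"Content-Type": "application/json"})
print(resp.json())
```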
### Client Prediction
```
head data-c.txt | python3 bert_client.py --model serving_client/serving_client_conf.prototxt
```
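`bert_client.py` reads one sentence per line from stdin and sends it to the service through the RPC client. The sketch below shows that flow in outline; it is not the repository script, and the reader configuration, the server address and the fetch name (`pooled_output`) are assumptions to be replaced with the values used in `bert_client.py` and `serving_client_conf.prototxt`.
```
# Outline of an RPC client (assumptions: reader config, server address,
# and fetch variable name; see bert_client.py for the real values).
import sys
from paddle_serving_client import Client
from paddle_serving_app.reader import ChineseBertReader

client = Client()
client.load_client_config("serving_client/serving_client_conf.prototxt")
client.connect(["127.0.0.1:7703"])

reader = ChineseBertReader({"max_seq_len": 128})   # assumption: vocab.txt is in the working dir
for line in sys.stdin:
    feed = reader.process(line.strip())            # tokenize one sentence
    result = client.predict(feed=feed, fetch=["pooled_output"])
    print(result)
```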
wget https://paddle-serving.bj.bcebos.com/bert_example/data-c.txt --no-check-certificate
wget https://paddle-serving.bj.bcebos.com/bert_example/vocab.txt --no-check-certificate
This diff is collapsed.
...@@ -2,12 +2,18 @@
## Prepare
### download model and extract
```
wget https://paddle-inference-dist.cdn.bcebos.com/PaddleLite/models_and_data_for_unittests/ernie.tar.gz
tar zxvf ernie.tar.gz
```
### convert model
```
python3 -m paddle_serving_client.convert --dirname ernie
```
### or, you can get the serving saved model directly
```
wget https://paddle-serving.bj.bcebos.com/models/xpu/bert.tar.gz
tar zxvf bert.tar.gz
```
### Getting Dict and Sample Dataset
```
......
wget https://paddle-serving.bj.bcebos.com/bert_example/data-c.txt --no-check-certificate
wget https://paddle-serving.bj.bcebos.com/bert_example/vocab.txt --no-check-certificate
This diff is collapsed.
## Prepare
### download model and extract
```
wget https://paddle-inference-dist.bj.bcebos.com/PaddleLite/models_and_data_for_unittests/VGG19.tar.gz
tar zxvf VGG19.tar.gz
```
### convert model
```
python3 -m paddle_serving_client.convert --dirname VGG19
```
### or, you can get the serving saved model directly
```
wget https://paddle-serving.bj.bcebos.com/models/xpu/vgg19.tar.gz
tar zxvf vgg19.tar.gz
```
## RPC Service
...@@ -10,7 +20,7 @@ python -m paddle_serving_client.convert --dirname VGG19
### Start Service
```
python3 -m paddle_serving_server.serve --model serving_server --port 7702 --use_lite --use_xpu --ir_optim
```
### Client Prediction
......
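A client for the VGG19 RPC service typically preprocesses the image with the readers from `paddle_serving_app` and calls `predict` against port 7702. The sketch below is illustrative only: the preprocessing pipeline, the test image name, and the feed/fetch names (`image`, `score`) are assumptions; the real variable names are listed in the generated `serving_client_conf.prototxt`.
```
# Illustrative RPC client sketch for the VGG19 service started above.
# Feed/fetch names, the image file and the preprocessing steps are assumptions.
from paddle_serving_client import Client
from paddle_serving_app.reader import Sequential, File2Image, Resize, \
    CenterCrop, RGB2BGR, Transpose, Div, Normalize

preprocess = Sequential([
    File2Image(), Resize(256), CenterCrop(224), RGB2BGR(),
    Transpose((2, 0, 1)), Div(255.0),
    Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225], True)
])

client = Client()
client.load_client_config("serving_client/serving_client_conf.prototxt")
client.connect(["127.0.0.1:7702"])

img = preprocess("daisy.jpg")                      # hypothetical local test image
result = client.predict(feed={"image": img}, fetch=["score"])
print(result)
```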