From aec182ed88446feb1fb28a47c65e05c8bc85d16a Mon Sep 17 00:00:00 2001
From: huangjianhui <852142024@qq.com>
Date: Mon, 15 Nov 2021 21:21:48 +0800
Subject: [PATCH] Update Pipeline_Design_EN.md

---
 doc/Python_Pipeline/Pipeline_Design_EN.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/doc/Python_Pipeline/Pipeline_Design_EN.md b/doc/Python_Pipeline/Pipeline_Design_EN.md
index 069d1697..ce33d73d 100644
--- a/doc/Python_Pipeline/Pipeline_Design_EN.md
+++ b/doc/Python_Pipeline/Pipeline_Design_EN.md
@@ -320,7 +320,7 @@ All examples of pipelines are in [examples/pipeline/](../../examples/Pipeline) d
 - [ocr](../../examples/Pipeline/PaddleOCR/ocr)
 - [simple_web_service](../../examples/Pipeline/simple_web_service)
 
-Here, we build a simple imdb model enable example to show how to use Pipeline Serving. The relevant code can be found in the `python/examples/pipeline/imdb_model_ensemble` folder. The Server-side structure in the example is shown in the following figure:
+Here, we build a simple imdb model enable example to show how to use Pipeline Serving. The relevant code can be found in the `Serving/examples/Pipeline/imdb_model_ensemble` folder. The Server-side structure in the example is shown in the following figure:
@@ -348,13 +348,13 @@ Five types of files are needed, of which model files, configuration files, and s
 ### 3.2 Get model files
 
 ```shell
-cd python/examples/pipeline/imdb_model_ensemble
+cd Serving/examples/Pipeline/imdb_model_ensemble
 sh get_data.sh
 python -m paddle_serving_server.serve --model imdb_cnn_model --port 9292 &> cnn.log &
 python -m paddle_serving_server.serve --model imdb_bow_model --port 9393 &> bow.log &
 ```
 
-PipelineServing also supports local automatic startup of PaddleServingService. Please refer to the example `python/examples/pipeline/ocr`.
+PipelineServing also supports local automatic startup of PaddleServingService. Please refer to the example `Serving/examples/Pipeline/PaddleOCR/ocr`.
 
 ### 3.3 Create config.yaml
@@ -705,7 +705,7 @@ Pipeline Serving supports low-precision inference. The precision types supported
 - fp16
 - int8
 
-Reference the example [simple_web_service](../python/examples/pipeline/simple_web_service).
+Reference the example [simple_web_service](../../examples/Pipeline/simple_web_service).
 
 ***
-- 
GitLab