diff --git a/doc/DOCKER_IMAGES.md b/doc/DOCKER_IMAGES.md
index 47a300eabc85689f9bce7c46c353b35b85db9376..dcaa34b1844bca5347569235fa8309d57f7a6b9c 100644
--- a/doc/DOCKER_IMAGES.md
+++ b/doc/DOCKER_IMAGES.md
@@ -24,7 +24,6 @@ You can get images in two ways:
   ```
-
## Image description
@@ -40,3 +39,13 @@ Runtime images cannot be used for compilation.
| GPU (cuda10.0-cudnn7) development | CentOS7 | latest-cuda10.0-cudnn7-devel | [Dockerfile.cuda10.0-cudnn7.devel](../tools/Dockerfile.cuda10.0-cudnn7.devel) |
| CPU development (Used to compile packages on Ubuntu) | CentOS6 | <none> | [Dockerfile.centos6.devel](../tools/Dockerfile.centos6.devel) |
| GPU (cuda9.0-cudnn7) development (Used to compile packages on Ubuntu) | CentOS6 | <none> | [Dockerfile.centos6.cuda9.0-cudnn7.devel](../tools/Dockerfile.centos6.cuda9.0-cudnn7.devel) |
+
+
+
+## Requirements for running CUDA containers
+
+Running a CUDA container requires a machine with at least one CUDA-capable GPU and a driver compatible with the CUDA toolkit version you are using.
+
+The machine running the CUDA container **only requires the NVIDIA driver**; the CUDA toolkit does not have to be installed.
+
+For the relationship between CUDA toolkit version, driver version and GPU architecture, please refer to the [nvidia-docker wiki](https://github.com/NVIDIA/nvidia-docker/wiki/CUDA).
diff --git a/doc/DOCKER_IMAGES_CN.md b/doc/DOCKER_IMAGES_CN.md
index 26ef5e8bd8c23a281604e5ff0319416c3e408472..6865eb773665f5bc51aa9976b2d330e607e7c83d 100644
--- a/doc/DOCKER_IMAGES_CN.md
+++ b/doc/DOCKER_IMAGES_CN.md
@@ -22,7 +22,6 @@
   ```shell
   docker build -t <image>:<tag> .
   ```
-
@@ -40,3 +39,13 @@
| GPU (cuda10.0-cudnn7) development image | CentOS7 | latest-cuda10.0-cudnn7-devel | [Dockerfile.cuda10.0-cudnn7.devel](../tools/Dockerfile.cuda10.0-cudnn7.devel) |
| CPU development image (used to compile Ubuntu packages) | CentOS6 | <none> | [Dockerfile.centos6.devel](../tools/Dockerfile.centos6.devel) |
| GPU (cuda9.0-cudnn7) development image (used to compile Ubuntu packages) | CentOS6 | <none> | [Dockerfile.centos6.cuda9.0-cudnn7.devel](../tools/Dockerfile.centos6.cuda9.0-cudnn7.devel) |
+
+
+
+## Requirements for running CUDA containers
+
+Running a CUDA container requires at least one CUDA-capable GPU and a driver compatible with the CUDA toolkit version you are using.
+
+The machine running the CUDA container **only requires the corresponding NVIDIA driver**; the CUDA toolkit is not necessary.
+
+For the relationship between CUDA toolkit version, driver version and GPU architecture, please refer to the [nvidia-docker wiki](https://github.com/NVIDIA/nvidia-docker/wiki/CUDA).
diff --git a/doc/FAQ.md b/doc/FAQ.md
index 5571269a25811e8d4badfa1785ff7a1e17e3daac..0dc4ed35a55e5904adbd1b924441aa21bc5436ab 100644
--- a/doc/FAQ.md
+++ b/doc/FAQ.md
@@ -4,6 +4,35 @@
## Basics
+#### Q: What are the differences and relations among Paddle Serving, Paddle Inference and PaddleHub Serving?
+
+**A:** Paddle Serving is a remote service: the device that issues the prediction request (phone, browser, client, etc.) is separate from the hardware that actually runs the prediction. Paddle Inference is a library, suitable for embedding into a larger system to guarantee prediction efficiency; Paddle Serving calls Paddle Inference to provide the remote service. PaddleHub Serving can be regarded as an example; it also uses Paddle Serving as the unified entry point for prediction services. Interaction on the web side is usually done by calling a remote service and can be built with Paddle Serving's web service.
+
+#### Q: Does paddle-serving support int32?
+
+**A:** The feed_type and fetch_type codes defined in protobuf correspond to data types as follows (see the sketch at the end of this section):
+
+ 0 - int64
+
+ 1 - float32
+
+ 2 - int32
+
+#### Q: Does paddle-serving support multi-threaded calls on Windows and Linux?
+
+**A:** The client can call the server from multiple threads.
+
+#### Q: How do I change the message size limit of paddle-serving?
+
+**A:** On both the server side and the client side, use FLAGS_max_body_size to raise the payload limit; the unit is bytes and the default is 64MB.
+
+#### Q: Which languages does the paddle-serving client currently support?
+
+**A:** Java, C++ and Python.
+
+#### Q: Which protocols does paddle-serving currently support?
+
+**A:** HTTP and RPC.
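To make the feed_type/fetch_type mapping above concrete, here is a minimal sketch (the tensor names `ids`, `mask` and `len` are hypothetical; only the numpy dtypes matter — they must match the type codes declared in the prototxt):

```python
import numpy as np

# feed_type / fetch_type codes: 0 -> int64, 1 -> float32, 2 -> int32
feed = {
    "ids": np.array([[101], [2054]], dtype=np.int64),    # feed_type: 0
    "mask": np.array([[1.0], [1.0]], dtype=np.float32),  # feed_type: 1
    "len": np.array([[2]], dtype=np.int32),              # feed_type: 2
}
```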
## Compilation issues

@@ -12,6 +41,10 @@

**A:** Install the whl package you compiled with the pip command, and set the SERVING_BIN environment variable to the path of the compiled serving binary.

+#### Q: With the Java client, mvn compile fails with the error "No compiler is provided in this environment. Perhaps you are running on a JRE rather than a JDK?"
+
+**A:** The JDK is not installed, or the JAVA_HOME path is configured incorrectly (it must point to a JDK path; a common mistake is pointing it at a JRE path — a correct example is JAVA_HOME="/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.262.b10-0.el7_8.x86_64/"). For Java JDK installation, refer to https://segmentfault.com/a/1190000015389941
+

## Deployment issues

@@ -46,7 +79,15 @@ InvalidArgumentError: Device id must be less than GPU count, but received id is:

**A:** Currently (0.4.0) only CentOS is supported; see the full list [here](https://github.com/PaddlePaddle/Serving/blob/develop/doc/DOCKER_IMAGES.md)

+#### Q: The GCC version Python was compiled with does not match the GCC version of serving
+
+**A:** 1) Use the [GPU docker](https://github.com/PaddlePaddle/Serving/blob/develop/doc/RUN_IN_DOCKER.md#gpunvidia-docker) to resolve the environment problem;
+
+ 2) change the GCC version of the Python installed in the anaconda virtual environment ([reference](https://www.jianshu.com/p/c498b3d86f77))
+#### Q: Does paddle-serving support local offline installation?
+
+**A:** Offline deployment is supported; the relevant [dependencies](https://github.com/PaddlePaddle/Serving/blob/develop/doc/COMPILE.md) need to be prepared and installed in advance.

## Prediction issues

@@ -105,6 +146,19 @@
Client logs are printed directly to standard output.
Running 'export GLOG_v=3' before deploying the service produces more detailed log information.

+#### Q: After paddle-serving starts successfully, how are the related logs configured?
+
+**A:** 1) The warning is printed by the glog component, indicating that logs emitted before glog initialization go to STDERR.
+
+ 2) The service is usually started with GLOG_v, which sets the log level at the same time.
+
+For example:
+```
+GLOG_v=2 python -m paddle_serving_server.serve --model xxx_conf/ --port 9999
+```
+
+
+
#### Q: (With GLOG_v=2) the server-side logs look completely normal, but the client never gets a correct prediction result

**A:** The configuration file may be at fault; check it (is_load_tensor, fetch_type, etc.)
diff --git a/doc/GRPC_IMPL_CN.md b/doc/GRPC_IMPL_CN.md
index 7b10907caec98ae5754126a7ec54096cc4cd48af..9e7ecd268fe0900c1085479c1f96fa083629758c 100644
--- a/doc/GRPC_IMPL_CN.md
+++ b/doc/GRPC_IMPL_CN.md
@@ -1,52 +1,137 @@
-# gRPC API
+# Introduction to the gRPC API
+
+ - [1. Comparison with the bRPC API](#1-comparison-with-the-brpc-api)
+   - [1.1 Server side](#11-server-side)
+   - [1.2 Client side](#12-client-side)
+   - [1.3 Others](#13-others)
+ - [2. Example: linear regression prediction service](#2-example-linear-regression-prediction-service)
+   - [Get data](#get-data)
+   - [Start the gRPC server](#start-the-grpc-server)
+   - [Client prediction](#client-prediction)
+     - [Synchronous prediction](#synchronous-prediction)
+     - [Asynchronous prediction](#asynchronous-prediction)
+     - [Batch prediction](#batch-prediction)
+     - [General pb prediction](#general-pb-prediction)
+     - [Prediction timeout](#prediction-timeout)
+     - [List input](#list-input)
+ - [3. More examples](#3-more-examples)
+
+With the gRPC API, clients written in different languages can call the service on Windows/Linux/macOS. The gRPC API is structured as follows:
+
+![](https://github.com/PaddlePaddle/Serving/blob/develop/doc/grpc_impl.png)
+
+## 1. Comparison with the bRPC API
+
+#### 1.1 Server side
+
+* The gRPC server's `load_model_config` function adds a `client_config_path` parameter:

-The gRPC API is implemented in a form similar to a Web Service:
-
-![](grpc_impl.png)
-
-## Comparison with the bRPC API
-
-1. The gRPC server's `load_model_config` function adds a `client_config_path` parameter:
-
-  ```python
+  ```
  def load_model_config(self, server_config_paths, client_config_path=None)
  ```
+  In some examples the bRPC server and bRPC client configuration files may differ (e.g. in cube local, client-side data first goes to cube and is handed to the inference library after cube processing), in which case the gRPC server needs the gRPC client configuration `client_config_path` to be set manually.
+  **`client_config_path` defaults to `<server_config_path>/serving_server_conf.prototxt`.**

-  In some examples the bRPC server and bRPC client configuration files may differ (e.g. in the cube local example, client-side data first goes to cube and is handed to the inference library after cube processing), so the gRPC server needs to obtain the gRPC client configuration; meanwhile, to spare the gRPC client from loading the configuration file manually, the gRPC server is designed to load both configuration files. `client_config_path` defaults to `<server_config_path>/serving_server_conf.prototxt`.
+#### 1.2 Client side

-2. The gRPC client drops the `load_client_config` step:
+* The gRPC client drops the `load_client_config` step:

  In the `connect` step, the corresponding prototxt is fetched via RPC (from any one endpoint).

-3. The gRPC client needs to set the timeout via RPC (the calling form stays consistent with the bRPC client)
+* The gRPC client needs to set the timeout via RPC (the calling form stays consistent with the bRPC client)

  Because the bRPC client cannot change the timeout after `connect`, when the gRPC server receives a timeout-change request it recreates the bRPC client instance to change the bRPC client timeout, and the gRPC client sets the gRPC deadline at the same time.

  **Note: the timeout-setting API and the Inference API must not be called at the same time (not thread-safe); for performance reasons no lock is added for now.**

-4. The gRPC client's `predict` function adds `asyn` and `is_python` parameters:
+* The gRPC client's `predict` function adds `asyn` and `is_python` parameters:

-  ```python
+  ```
  def predict(self, feed, fetch, need_variant_tag=False, asyn=False, is_python=True)
  ```

-  Here, `asyn` is the asynchronous-call option. When `asyn=True` the call is asynchronous and returns a `MultiLangPredictFuture` object, from which the prediction is obtained by blocking on `MultiLangPredictFuture.result()`; when `asyn=Fasle` the call is synchronous.
+1. `asyn` is the asynchronous-call option. When `asyn=True` the call is asynchronous and returns a `MultiLangPredictFuture` object, from which the prediction is obtained by blocking on `MultiLangPredictFuture.result()`; when `asyn=False` the call is synchronous.
+
+2. `is_python` is the proto format option. When `is_python=True`, data is transferred in numpy bytes format, currently applicable to Python only; when `is_python=False`, data is transferred in the ordinary format, which is more general. Transfer in numpy bytes format takes much less time than the ordinary format (see [#654](https://github.com/PaddlePaddle/Serving/pull/654)).
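For reference, a minimal sketch of synchronous and asynchronous calls through this API, based on the fit_a_line example in section 2 (it assumes the multi-language client class is `MultiLangClient` from `paddle_serving_client`, as used in the grpc_impl_example scripts, and that the feed/fetch names are `x`/`price`):

```python
import numpy as np
from paddle_serving_client import MultiLangClient

client = MultiLangClient()
client.connect(["127.0.0.1:9393"])

x = np.array([0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727, -0.1583,
              -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332])

# Synchronous call: blocks until fetch_map is returned.
fetch_map = client.predict(feed={"x": x}, fetch=["price"])

# Asynchronous call: returns a MultiLangPredictFuture immediately;
# result() blocks until the prediction arrives.
future = client.predict(feed={"x": x}, fetch=["price"], asyn=True)
fetch_map = future.result()
```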
+
+#### 1.3 Others
+
+* Exception handling: when the bRPC client on the gRPC server side fails to predict (returns `None`), the gRPC client also returns None. Other gRPC exceptions are caught inside the client, and a "status_code" field is added to the returned fetch_map to indicate whether the prediction succeeded (see the timeout example).
+
+* Since gRPC only supports the pick_first and round_robin load-balancing policies, the ABTEST feature is not fully supported yet.
+
+* System compatibility:
+  * [x] CentOS
+  * [x] macOS
+  * [x] Windows
+
+* Client languages already supported:
+
+  - Python
+  - Java
+  - Go
+
+
+## 2. Example: linear regression prediction service
+
+The following is an example of a linear regression prediction service implemented with gRPC; see this [link](https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/grpc_impl_example/fit_a_line) for the full code.
+#### Get data
+
+```shell
+sh get_data.sh
+```
+
+#### Start the gRPC server
+
+``` shell
+python test_server.py uci_housing_model/
+```
+
+You can also start the default gRPC service with the following one-line command:
+
+```shell
+python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9393 --use_multilang
+```
+Note: the --use_multilang flag enables the multi-language client.
+
+### Client prediction
+
+#### Synchronous prediction
+
+``` shell
+python test_sync_client.py
+```
+
+#### Asynchronous prediction
+
+``` shell
+python test_asyn_client.py
+```
+
+#### Batch prediction
+
+``` shell
+python test_batch_client.py
+```

-  `is_python` is the proto format option. When `is_python=True`, data is transferred in numpy bytes format, currently applicable to Python only; when `is_python=False`, data is transferred in the ordinary format, which is more general. Transfer in numpy bytes format takes much less time than the ordinary format (see [#654](https://github.com/PaddlePaddle/Serving/pull/654)).

+#### General pb prediction

-5. Exception handling: when the bRPC client on the gRPC server side fails to predict (returns `None`), the gRPC client also returns None. Other gRPC exceptions are caught inside the client, and a "status_code" field is added to the returned fetch_map to indicate whether the prediction succeeded (see the timeout example).

+``` shell
+python test_general_pb_client.py
+```

-6. Since gRPC only supports the pick_first and round_robin load-balancing policies, the ABTEST feature is not fully supported yet.

+#### Prediction timeout

-7. Tested: the gRPC version works on Windows and macOS.

+``` shell
+python test_timeout_client.py
+```

-8. Client languages planned to be supported:

+#### List input

- - [x] Python
- - [ ] Java
- - [ ] Go
- - [ ] JavaScript

+``` shell
+python test_list_input_client.py
+```

-## Some examples on the Python side

+## 3. More examples

-See the example files under `python/examples/grpc_impl_example`.

+See the example files under [`python/examples/grpc_impl_example`](https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/grpc_impl_example).
diff --git a/doc/INFERENCE_TO_SERVING.md b/doc/INFERENCE_TO_SERVING.md
index e10ee976fb455c8cc49a0d5fa44ed4cc1f300ba9..719aa63c0a9b408d6bff628e7be4f35dfb49c5c8 100644
--- a/doc/INFERENCE_TO_SERVING.md
+++ b/doc/INFERENCE_TO_SERVING.md
@@ -24,13 +24,13 @@
inference_model_dir = "your_inference_model"
serving_client_dir = "serving_client_dir"
serving_server_dir = "serving_server_dir"
feed_var_names, fetch_var_names = inference_model_to_serving(
-    inference_model_dir, serving_client_dir, serving_server_dir)
+    inference_model_dir, serving_server_dir, serving_client_dir)
```
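Because this change swaps the positional order of the two output directories, calling the converter with keyword arguments is a safe way to avoid mixing them up. A minimal sketch (the keyword names `dirname`, `serving_server` and `serving_client` are an assumption about the converter's signature):

```python
from paddle_serving_client.io import inference_model_to_serving

# Keyword arguments make the server/client output order explicit.
feed_var_names, fetch_var_names = inference_model_to_serving(
    dirname="your_inference_model",
    serving_server="serving_server_dir",
    serving_client="serving_client_dir")
```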
If your model file and params file are stored as two standalone files, please use the following API:
```
feed_var_names, fetch_var_names = inference_model_to_serving(
-    inference_model_dir, serving_client_dir, serving_server_dir,
+    inference_model_dir, serving_server_dir, serving_client_dir,
     model_filename="model", params_filename="params")
```
diff --git a/doc/INFERENCE_TO_SERVING_CN.md b/doc/INFERENCE_TO_SERVING_CN.md
index e7e909ac04be3b1a0885b3390d99a153dfbd170e..5d783f25a3f367baa94d471e50f227d9e6f733d1 100644
--- a/doc/INFERENCE_TO_SERVING_CN.md
+++ b/doc/INFERENCE_TO_SERVING_CN.md
@@ -23,11 +23,11 @@
inference_model_dir = "your_inference_model"
serving_client_dir = "serving_client_dir"
serving_server_dir = "serving_server_dir"
feed_var_names, fetch_var_names = inference_model_to_serving(
-    inference_model_dir, serving_client_dir, serving_server_dir)
+    inference_model_dir, serving_server_dir, serving_client_dir)
```
If the model has a model description file `model_filename` and a model parameters file `params_filename`, please use:
```
feed_var_names, fetch_var_names = inference_model_to_serving(
-    inference_model_dir, serving_client_dir, serving_server_dir,
+    inference_model_dir, serving_server_dir, serving_client_dir,
     model_filename="model", params_filename="params")
```
diff --git a/java/examples/pom.xml b/java/examples/pom.xml
index b6c8bc424f5d528d74a4a45828fd9b5e7e5d008e..745e8d4f0f3d47e488f99bd7fe73ed6a9f887373 100644
--- a/java/examples/pom.xml
+++ b/java/examples/pom.xml
@@ -75,7 +75,7 @@
junit
junit
-      4.11
+      4.13.1
test
diff --git a/python/examples/bert/bert_client.py b/python/examples/bert/bert_client.py
index d0f8b0aad19b78e6235a3dd0403f20324b4681b4..4111589b3ddfde980e415fbac1a5f38f4abafada 100644
--- a/python/examples/bert/bert_client.py
+++ b/python/examples/bert/bert_client.py
@@ -23,7 +23,7 @@ args = benchmark_args()
reader = ChineseBertReader({"max_seq_len": 128})
fetch = ["pooled_output"]
-endpoint_list = ['127.0.0.1:8861']
+endpoint_list = ['127.0.0.1:9292']
client = Client()
client.load_client_config(args.model)
client.connect(endpoint_list)
@@ -33,5 +33,5 @@ for line in sys.stdin:
    for key in feed_dict.keys():
        feed_dict[key] = np.array(feed_dict[key]).reshape((128, 1))
    #print(feed_dict)
-    result = client.predict(feed=feed_dict, fetch=fetch)
+    result = client.predict(feed=feed_dict, fetch=fetch, batch=False)
print(result)
diff --git a/python/examples/bert/bert_web_service.py b/python/examples/bert/bert_web_service.py
index e3985c9da6c90bb349cc76cba038abd3fe9359c5..7cd34fb99e0ecebbf2f6bec47e9c9d163ac3a44c 100644
--- a/python/examples/bert/bert_web_service.py
+++ b/python/examples/bert/bert_web_service.py
@@ -29,13 +29,14 @@ class BertService(WebService):
    def preprocess(self, feed=[], fetch=[]):
        feed_res = []
+        is_batch = False
        for ins in feed:
            feed_dict = self.reader.process(ins["words"].encode("utf-8"))
            for key in feed_dict.keys():
                feed_dict[key] = np.array(feed_dict[key]).reshape(
-                    (1, len(feed_dict[key]), 1))
+                    (len(feed_dict[key]), 1))
            feed_res.append(feed_dict)
-        return feed_res, fetch
+        return feed_res, fetch, is_batch

bert_service = BertService(name="bert")
diff --git a/python/examples/faster_rcnn_model/000000570688.jpg b/python/examples/cascade_rcnn/000000570688.jpg
similarity index 100%
rename from python/examples/faster_rcnn_model/000000570688.jpg
rename to python/examples/cascade_rcnn/000000570688.jpg
diff --git a/python/examples/faster_rcnn_model/label_list.txt b/python/examples/cascade_rcnn/label_list.txt
similarity index 100%
rename from python/examples/faster_rcnn_model/label_list.txt
rename to python/examples/cascade_rcnn/label_list.txt
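The recurring change across the WebService examples in this patch (bert above, and the imagenet/imdb/lac/ocr hunks below) is that `preprocess` now returns a third value, `is_batch`, which tells the framework whether the returned feed tensors already carry a batch dimension. A minimal sketch of the new contract under that assumption (the service class `DemoService` and the feed name `x` are hypothetical):

```python
import numpy as np
from paddle_serving_server.web_service import WebService

class DemoService(WebService):
    def preprocess(self, feed=[], fetch=[]):
        feed_batch = []
        is_batch = True  # each entry below carries an explicit batch dimension
        for ins in feed:
            x = np.array(ins["x"], dtype=np.float32)
            feed_batch.append({"x": x[np.newaxis, :]})  # shape (1, ...)
        return feed_batch, fetch, is_batch
```

diff --git a/python/examples/cascade_rcnn/test_client.py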
b/python/examples/cascade_rcnn/test_client.py new file mode 100644 index 0000000000000000000000000000000000000000..b40e97acc3e84a7dc7411a7ad3c3f8c1dc8171a6 --- /dev/null +++ b/python/examples/cascade_rcnn/test_client.py @@ -0,0 +1,40 @@ +# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from paddle_serving_client import Client +from paddle_serving_app.reader import * +import numpy as np + +preprocess = Sequential([ + File2Image(), BGR2RGB(), Div(255.0), + Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225], False), + Resize(800, 1333), Transpose((2, 0, 1)), PadStride(32) +]) +postprocess = RCNNPostprocess("label_list.txt", "output") +client = Client() +client.load_client_config("serving_client/serving_client_conf.prototxt") +client.connect(['127.0.0.1:9292']) +im = preprocess('000000570688.jpg') +fetch_map = client.predict( + feed={ + "image": im, + "im_info": np.array(list(im.shape[1:]) + [1.0]), + "im_shape": np.array(list(im.shape[1:]) + [1.0]) + }, + fetch=["multiclass_nms_0.tmp_0"], + batch=False) +fetch_map["image"] = '000000570688.jpg' +print(fetch_map) +postprocess(fetch_map) +print(fetch_map) diff --git a/python/examples/faster_rcnn/000000570688.jpg b/python/examples/faster_rcnn/000000570688.jpg new file mode 100644 index 0000000000000000000000000000000000000000..cb304bd56c4010c08611a30dcca58ea9140cea54 Binary files /dev/null and b/python/examples/faster_rcnn/000000570688.jpg differ diff --git a/python/examples/faster_rcnn_model/000000570688_bbox.jpg b/python/examples/faster_rcnn/000000570688_bbox.jpg similarity index 100% rename from python/examples/faster_rcnn_model/000000570688_bbox.jpg rename to python/examples/faster_rcnn/000000570688_bbox.jpg diff --git a/python/examples/faster_rcnn_model/README.md b/python/examples/faster_rcnn/README.md similarity index 100% rename from python/examples/faster_rcnn_model/README.md rename to python/examples/faster_rcnn/README.md diff --git a/python/examples/faster_rcnn_model/README_CN.md b/python/examples/faster_rcnn/README_CN.md similarity index 100% rename from python/examples/faster_rcnn_model/README_CN.md rename to python/examples/faster_rcnn/README_CN.md diff --git a/python/examples/faster_rcnn/benchmark.py b/python/examples/faster_rcnn/benchmark.py new file mode 100755 index 0000000000000000000000000000000000000000..543ef3352ea629f5209b7e5252276bda46a4b2ae --- /dev/null +++ b/python/examples/faster_rcnn/benchmark.py @@ -0,0 +1,123 @@ +# -*- coding: utf-8 -*- +# +# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+# See the License for the specific language governing permissions and
+# limitations under the License.
+# pylint: disable=doc-string-missing
+
+from __future__ import unicode_literals, absolute_import
+import os
+import sys
+import time
+import json
+import requests
+from paddle_serving_client import Client
+from paddle_serving_client.utils import MultiThreadRunner
+from paddle_serving_client.utils import benchmark_args, show_latency
+
+from paddle_serving_app.reader import *
+import numpy as np
+
+args = benchmark_args()
+
+
+def single_func(idx, resource):
+    img = "./000000570688.jpg"
+    profile_flags = False
+    latency_flags = False
+    if os.getenv("FLAGS_profile_client"):
+        profile_flags = True
+    if os.getenv("FLAGS_serving_latency"):
+        latency_flags = True
+        latency_list = []
+
+    if args.request == "rpc":
+        preprocess = Sequential([
+            File2Image(), BGR2RGB(), Div(255.0),
+            Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225], False),
+            Resize(640, 640), Transpose((2, 0, 1))
+        ])
+
+        postprocess = RCNNPostprocess("label_list.txt", "output")
+        client = Client()
+
+        client.load_client_config(args.model)
+        client.connect([resource["endpoint"][idx % len(resource["endpoint"])]])
+
+        start = time.time()
+        for i in range(turns):
+            if args.batch_size >= 1:
+                l_start = time.time()
+                feed_batch = []
+                b_start = time.time()
+                im = preprocess(img)
+                # replicate the preprocessed image to fill the batch
+                for bi in range(args.batch_size):
+                    feed_batch.append({
+                        "image": im,
+                        "im_info": np.array(list(im.shape[1:]) + [1.0]),
+                        "im_shape": np.array(list(im.shape[1:]) + [1.0])
+                    })
+                b_end = time.time()
+
+                if profile_flags:
+                    sys.stderr.write(
+                        "PROFILE\tpid:{}\tbert_pre_0:{} bert_pre_1:{}\n".format(
+                            os.getpid(),
+                            int(round(b_start * 1000000)),
+                            int(round(b_end * 1000000))))
+                fetch_map = client.predict(
+                    feed=feed_batch, fetch=["multiclass_nms"])
+                fetch_map["image"] = img
+                postprocess(fetch_map)
+
+                l_end = time.time()
+                if latency_flags:
+                    latency_list.append(l_end * 1000 - l_start * 1000)
+            else:
+                print("unsupported batch size {}".format(args.batch_size))
+    else:
+        raise ValueError("not implemented {} request".format(args.request))
+    end = time.time()
+    if latency_flags:
+        return [[end - start], latency_list]
+    else:
+        return [[end - start]]
+
+
+if __name__ == '__main__':
+    multi_thread_runner = MultiThreadRunner()
+    endpoint_list = ["127.0.0.1:7777"]
+    turns = 10
+    start = time.time()
+    result = multi_thread_runner.run(
+        single_func, args.thread, {"endpoint": endpoint_list,
+                                   "turns": turns})
+    end = time.time()
+    total_cost = end - start
+
+    avg_cost = 0
+    for i in range(args.thread):
+        avg_cost += result[0][i]
+    avg_cost = avg_cost / args.thread
+
+    print("total cost: {}s".format(total_cost))
+    print("each thread cost: {}s. 
".format(avg_cost)) + print("qps: {}samples/s".format(args.batch_size * args.thread * turns / + total_cost)) + if os.getenv("FLAGS_serving_latency"): + show_latency(result[1]) diff --git a/python/examples/faster_rcnn/benchmark.sh b/python/examples/faster_rcnn/benchmark.sh new file mode 100755 index 0000000000000000000000000000000000000000..5706fd03c7a0e266bcac18b0544c64f327cbbe9b --- /dev/null +++ b/python/examples/faster_rcnn/benchmark.sh @@ -0,0 +1,52 @@ +rm profile_log* +export CUDA_VISIBLE_DEVICES=0 +export FLAGS_profile_server=1 +export FLAGS_profile_client=1 +export FLAGS_serving_latency=1 + +gpu_id=0 +#save cpu and gpu utilization log +if [ -d utilization ];then + rm -rf utilization +else + mkdir utilization +fi +#start server +$PYTHONROOT/bin/python3 -m paddle_serving_server_gpu.serve --model $1 --port 7777 --thread 4 --gpu_ids 0 --ir_optim > elog 2>&1 & +sleep 5 + +#warm up +$PYTHONROOT/bin/python3 benchmark.py --thread 4 --batch_size 1 --model $2/serving_client_conf.prototxt --request rpc > profile 2>&1 +echo -e "import psutil\ncpu_utilization=psutil.cpu_percent(1,False)\nprint('CPU_UTILIZATION:', cpu_utilization)\n" > cpu_utilization.py +for thread_num in 1 4 8 16 +do +for batch_size in 1 +do + job_bt=`date '+%Y%m%d%H%M%S'` + nvidia-smi --id=0 --query-compute-apps=used_memory --format=csv -lms 100 > gpu_use.log 2>&1 & + nvidia-smi --id=0 --query-gpu=utilization.gpu --format=csv -lms 100 > gpu_utilization.log 2>&1 & + gpu_memory_pid=$! + $PYTHONROOT/bin/python3 benchmark.py --thread $thread_num --batch_size $batch_size --model $2/serving_client_conf.prototxt --request rpc > profile 2>&1 + kill ${gpu_memory_pid} + kill `ps -ef|grep used_memory|awk '{print $2}'` + echo "model_name:" $1 + echo "thread_num:" $thread_num + echo "batch_size:" $batch_size + echo "=================Done====================" + echo "model_name:$1" >> profile_log_$1 + echo "batch_size:$batch_size" >> profile_log_$1 + $PYTHONROOT/bin/python3 cpu_utilization.py >> profile_log_$1 + job_et=`date '+%Y%m%d%H%M%S'` + awk 'BEGIN {max = 0} {if(NR>1){if ($1 > max) max=$1}} END {print "MAX_GPU_MEMORY:", max}' gpu_use.log >> profile_log_$1 + awk 'BEGIN {max = 0} {if(NR>1){if ($1 > max) max=$1}} END {print "GPU_UTILIZATION:", max}' gpu_utilization.log >> profile_log_$1 + rm -rf gpu_use.log gpu_utilization.log + $PYTHONROOT/bin/python3 ../util/show_profile.py profile $thread_num >> profile_log_$1 + tail -n 8 profile >> profile_log_$1 + echo "" >> profile_log_$1 +done +done + +#Divided log +awk 'BEGIN{RS="\n\n"}{i++}{print > "bert_log_"i}' profile_log_$1 +mkdir bert_log && mv bert_log_* bert_log +ps -ef|grep 'serving'|grep -v grep|cut -c 9-15 | xargs kill -9 diff --git a/python/examples/faster_rcnn/label_list.txt b/python/examples/faster_rcnn/label_list.txt new file mode 100644 index 0000000000000000000000000000000000000000..d7d43a94adf73208f997f0efd6581bef11ca734e --- /dev/null +++ b/python/examples/faster_rcnn/label_list.txt @@ -0,0 +1,81 @@ +background +person +bicycle +car +motorcycle +airplane +bus +train +truck +boat +traffic light +fire hydrant +stop sign +parking meter +bench +bird +cat +dog +horse +sheep +cow +elephant +bear +zebra +giraffe +backpack +umbrella +handbag +tie +suitcase +frisbee +skis +snowboard +sports ball +kite +baseball bat +baseball glove +skateboard +surfboard +tennis racket +bottle +wine glass +cup +fork +knife +spoon +bowl +banana +apple +sandwich +orange +broccoli +carrot +hot dog +pizza +donut +cake +chair +couch +potted plant +bed +dining table +toilet +tv +laptop +mouse +remote +keyboard 
+cell phone +microwave +oven +toaster +sink +refrigerator +book +clock +vase +scissors +teddy bear +hair drier +toothbrush diff --git a/python/examples/faster_rcnn_model/test_client.py b/python/examples/faster_rcnn/test_client.py similarity index 100% rename from python/examples/faster_rcnn_model/test_client.py rename to python/examples/faster_rcnn/test_client.py diff --git a/python/examples/imagenet/resnet50_rpc_client.py b/python/examples/imagenet/resnet50_rpc_client.py index 7888ab6302b483672ec1d7270f7db0c551f1778d..b23f99175b97a011c3b1c72d3b7358b646c54e68 100644 --- a/python/examples/imagenet/resnet50_rpc_client.py +++ b/python/examples/imagenet/resnet50_rpc_client.py @@ -38,7 +38,8 @@ start = time.time() image_file = "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg" for i in range(10): img = seq(image_file) - fetch_map = client.predict(feed={"image": img}, fetch=["score"]) + fetch_map = client.predict( + feed={"image": img}, fetch=["score"], batch=False) prob = max(fetch_map["score"][0]) label = label_dict[fetch_map["score"][0].tolist().index(prob)].strip( ).replace(",", "") diff --git a/python/examples/imagenet/resnet50_web_service.py b/python/examples/imagenet/resnet50_web_service.py index 4c9822757ce233498ef9ec2baf5f3fcac7bc1ccb..7033103717621807ecd74093bf5eba8d31a9b877 100644 --- a/python/examples/imagenet/resnet50_web_service.py +++ b/python/examples/imagenet/resnet50_web_service.py @@ -13,7 +13,7 @@ # limitations under the License. import sys from paddle_serving_client import Client - +import numpy as np from paddle_serving_app.reader import Sequential, URL2Image, Resize, CenterCrop, RGB2BGR, Transpose, Div, Normalize, Base64ToImage if len(sys.argv) != 4: @@ -44,12 +44,13 @@ class ImageService(WebService): def preprocess(self, feed=[], fetch=[]): feed_batch = [] + is_batch = True for ins in feed: if "image" not in ins: raise ("feed data error!") img = self.seq(ins["image"]) feed_batch.append({"image": img[np.newaxis, :]}) - return feed_batch, fetch + return feed_batch, fetch, is_batch def postprocess(self, feed=[], fetch=[], fetch_map={}): score_list = fetch_map["score"] diff --git a/python/examples/imdb/benchmark.py b/python/examples/imdb/benchmark.py index d804731162b9fe1bf376867322941fdf31ea50b0..18584f88ea51373ffe2ca2e75946342c94464d76 100644 --- a/python/examples/imdb/benchmark.py +++ b/python/examples/imdb/benchmark.py @@ -18,7 +18,7 @@ import sys import time import requests import numpy as np -from paddle_serving_app.reader import IMDBDataset +from paddle_serving_app.reader.imdb_reader import IMDBDataset from paddle_serving_client import Client from paddle_serving_client.utils import MultiThreadRunner from paddle_serving_client.utils import MultiThreadRunner, benchmark_args, show_latency diff --git a/python/examples/imdb/test_client.py b/python/examples/imdb/test_client.py index c057fdb631340174cc6d3fe9d1873767ba0ece78..2aeee01a83cde4a66e4bd03ad49c7791c67a287e 100644 --- a/python/examples/imdb/test_client.py +++ b/python/examples/imdb/test_client.py @@ -13,7 +13,7 @@ # limitations under the License. 
# pylint: disable=doc-string-missing from paddle_serving_client import Client -from paddle_serving_app.reader import IMDBDataset +from paddle_serving_app.reader.imdb_reader import IMDBDataset import sys import numpy as np diff --git a/python/examples/imdb/text_classify_service.py b/python/examples/imdb/text_classify_service.py index 7b1f200e152da37c57cc8b2f7cd233531e5dd445..ca1e26002baf0284f282add235706080f7902c33 100755 --- a/python/examples/imdb/text_classify_service.py +++ b/python/examples/imdb/text_classify_service.py @@ -14,7 +14,7 @@ # pylint: disable=doc-string-missing from paddle_serving_server.web_service import WebService -from paddle_serving_app.reader import IMDBDataset +from paddle_serving_app.reader.imdb_reader import IMDBDataset import sys import numpy as np @@ -29,13 +29,14 @@ class IMDBService(WebService): def preprocess(self, feed={}, fetch=[]): feed_batch = [] words_lod = [0] + is_batch = True for ins in feed: words = self.dataset.get_words_only(ins["words"]) words = np.array(words).reshape(len(words), 1) words_lod.append(words_lod[-1] + len(words)) feed_batch.append(words) feed = {"words": np.concatenate(feed_batch), "words.lod": words_lod} - return feed, fetch + return feed, fetch, is_batch imdb_service = IMDBService(name="imdb") diff --git a/python/examples/lac/lac_web_service.py b/python/examples/lac/lac_web_service.py index bed89f54b626c0cce55767f8edacc3dd33f0104c..cf37f66294bd154324f2c7cacd1a35571b6c6350 100644 --- a/python/examples/lac/lac_web_service.py +++ b/python/examples/lac/lac_web_service.py @@ -15,6 +15,7 @@ from paddle_serving_server.web_service import WebService import sys from paddle_serving_app.reader import LACReader +import numpy as np class LACService(WebService): @@ -23,13 +24,21 @@ class LACService(WebService): def preprocess(self, feed={}, fetch=[]): feed_batch = [] + fetch = ["crf_decode"] + lod_info = [0] + is_batch = True for ins in feed: if "words" not in ins: raise ("feed data error!") feed_data = self.reader.process(ins["words"]) - feed_batch.append({"words": feed_data}) - fetch = ["crf_decode"] - return feed_batch, fetch + feed_batch.append(np.array(feed_data).reshape(len(feed_data), 1)) + lod_info.append(lod_info[-1] + len(feed_data)) + feed_dict = { + "words": np.concatenate( + feed_batch, axis=0), + "words.lod": lod_info + } + return feed_dict, fetch, is_batch def postprocess(self, feed={}, fetch=[], fetch_map={}): batch_ret = [] diff --git a/python/examples/ocr/det_debugger_server.py b/python/examples/ocr/det_debugger_server.py index 913a0bb4c9a099cbef886beb3889337d024d10d6..ebaf0a3066c7e948b4ca37dda06c60423ad28da9 100644 --- a/python/examples/ocr/det_debugger_server.py +++ b/python/examples/ocr/det_debugger_server.py @@ -53,7 +53,9 @@ class OCRService(WebService): self.ori_h, self.ori_w, _ = im.shape det_img = self.det_preprocess(im) _, self.new_h, self.new_w = det_img.shape - return {"image": det_img[np.newaxis, :].copy()}, ["concat_1.tmp_0"] + return { + "image": det_img[np.newaxis, :].copy() + }, ["concat_1.tmp_0"], True def postprocess(self, feed={}, fetch=[], fetch_map=None): det_out = fetch_map["concat_1.tmp_0"] diff --git a/python/examples/ocr/det_web_server.py b/python/examples/ocr/det_web_server.py index 38c6541c70e9871d13dd81751d4edb2bc771a904..e90efc7813e96e6b768a09b4ec9f5108c5878527 100644 --- a/python/examples/ocr/det_web_server.py +++ b/python/examples/ocr/det_web_server.py @@ -54,7 +54,7 @@ class OCRService(WebService): det_img = self.det_preprocess(im) _, self.new_h, self.new_w = det_img.shape print(det_img) - return 
{"image": det_img}, ["concat_1.tmp_0"] + return {"image": det_img}, ["concat_1.tmp_0"], False def postprocess(self, feed={}, fetch=[], fetch_map=None): det_out = fetch_map["concat_1.tmp_0"] diff --git a/python/examples/ocr/ocr_debugger_server.py b/python/examples/ocr/ocr_debugger_server.py index 3cbc3a66ef620f5c8851b50a352a0c1587467b3b..163ff9788806db819329df769ac44982220f0423 100644 --- a/python/examples/ocr/ocr_debugger_server.py +++ b/python/examples/ocr/ocr_debugger_server.py @@ -42,10 +42,9 @@ class OCRService(WebService): self.det_client = LocalPredictor() if sys.argv[1] == 'gpu': self.det_client.load_model_config( - det_model_config, gpu=True, profile=False) + det_model_config, use_gpu=True, gpu_id=1) elif sys.argv[1] == 'cpu': - self.det_client.load_model_config( - det_model_config, gpu=False, profile=False) + self.det_client.load_model_config(det_model_config) self.ocr_reader = OCRReader() def preprocess(self, feed=[], fetch=[]): @@ -58,7 +57,7 @@ class OCRService(WebService): det_img = det_img[np.newaxis, :] det_img = det_img.copy() det_out = self.det_client.predict( - feed={"image": det_img}, fetch=["concat_1.tmp_0"]) + feed={"image": det_img}, fetch=["concat_1.tmp_0"], batch=True) filter_func = FilterBoxes(10, 10) post_func = DBPostProcess({ "thresh": 0.3, @@ -91,7 +90,7 @@ class OCRService(WebService): imgs[id] = norm_img feed = {"image": imgs.copy()} fetch = ["ctc_greedy_decoder_0.tmp_0", "softmax_0.tmp_0"] - return feed, fetch + return feed, fetch, True def postprocess(self, feed={}, fetch=[], fetch_map=None): rec_res = self.ocr_reader.postprocess(fetch_map, with_score=True) @@ -107,7 +106,8 @@ ocr_service.load_model_config("ocr_rec_model") ocr_service.prepare_server(workdir="workdir", port=9292) ocr_service.init_det_debugger(det_model_config="ocr_det_model") if sys.argv[1] == 'gpu': - ocr_service.run_debugger_service(gpu=True) + ocr_service.set_gpus("2") + ocr_service.run_debugger_service() elif sys.argv[1] == 'cpu': ocr_service.run_debugger_service() ocr_service.run_web_service() diff --git a/python/examples/ocr/ocr_web_client.py b/python/examples/ocr/ocr_web_client.py index 3d25e288dd93014cf9c3f84edc01d42c013ba2d9..ce96a8bbcd585f37368d70070d649e25a0129029 100644 --- a/python/examples/ocr/ocr_web_client.py +++ b/python/examples/ocr/ocr_web_client.py @@ -36,4 +36,5 @@ for img_file in os.listdir(test_img_dir): image = cv2_to_base64(image_data1) data = {"feed": [{"image": image}], "fetch": ["res"]} r = requests.post(url=url, headers=headers, data=json.dumps(data)) + print(r) print(r.json()) diff --git a/python/examples/ocr/ocr_web_server.py b/python/examples/ocr/ocr_web_server.py index de83ca94a4c1f55d886175d9a87b6a34db34c2a5..97e619f2b894144193ee8ed51050f0239616aa39 100644 --- a/python/examples/ocr/ocr_web_server.py +++ b/python/examples/ocr/ocr_web_server.py @@ -50,7 +50,7 @@ class OCRService(WebService): ori_h, ori_w, _ = im.shape det_img = self.det_preprocess(im) det_out = self.det_client.predict( - feed={"image": det_img}, fetch=["concat_1.tmp_0"]) + feed={"image": det_img}, fetch=["concat_1.tmp_0"], batch=False) _, new_h, new_w = det_img.shape filter_func = FilterBoxes(10, 10) post_func = DBPostProcess({ @@ -77,10 +77,10 @@ class OCRService(WebService): max_wh_ratio = max(max_wh_ratio, wh_ratio) for img in img_list: norm_img = self.ocr_reader.resize_norm_img(img, max_wh_ratio) - feed = {"image": norm_img} - feed_list.append(feed) + feed_list.append(norm_img[np.newaxis, :]) + feed_batch = {"image": np.concatenate(feed_list, axis=0)} fetch = ["ctc_greedy_decoder_0.tmp_0", 
"softmax_0.tmp_0"] - return feed_list, fetch + return feed_batch, fetch, True def postprocess(self, feed={}, fetch=[], fetch_map=None): rec_res = self.ocr_reader.postprocess(fetch_map, with_score=True) diff --git a/python/examples/ocr/rec_debugger_server.py b/python/examples/ocr/rec_debugger_server.py index fbe67aafee5c8dcae269cd4ad6f6100ed514f0b7..9dc7b19b7bf17096cf3a0c18d0a337f990ecd1a4 100644 --- a/python/examples/ocr/rec_debugger_server.py +++ b/python/examples/ocr/rec_debugger_server.py @@ -52,7 +52,7 @@ class OCRService(WebService): imgs[i] = norm_img feed = {"image": imgs.copy()} fetch = ["ctc_greedy_decoder_0.tmp_0", "softmax_0.tmp_0"] - return feed, fetch + return feed, fetch, True def postprocess(self, feed={}, fetch=[], fetch_map=None): rec_res = self.ocr_reader.postprocess(fetch_map, with_score=True) diff --git a/python/examples/ocr/rec_web_server.py b/python/examples/ocr/rec_web_server.py index aae97fd9e3fbd1d29b6cf2ef160b92a522db2e22..300c26be8f6b33c0cdd4a57e75648e444a25d763 100644 --- a/python/examples/ocr/rec_web_server.py +++ b/python/examples/ocr/rec_web_server.py @@ -51,10 +51,17 @@ class OCRService(WebService): max_wh_ratio = max(max_wh_ratio, wh_ratio) for img in img_list: norm_img = self.ocr_reader.resize_norm_img(img, max_wh_ratio) - feed = {"image": norm_img} - feed_list.append(feed) + #feed = {"image": norm_img} + feed_list.append(norm_img) + if len(feed_list) == 1: + feed_batch = { + "image": np.concatenate( + feed_list, axis=0)[np.newaxis, :] + } + else: + feed_batch = {"image": np.concatenate(feed_list, axis=0)} fetch = ["ctc_greedy_decoder_0.tmp_0", "softmax_0.tmp_0"] - return feed_list, fetch + return feed_batch, fetch, True def postprocess(self, feed={}, fetch=[], fetch_map=None): rec_res = self.ocr_reader.postprocess(fetch_map, with_score=True) diff --git a/python/examples/ocr_detection/7.jpg b/python/examples/ocr_detection/7.jpg deleted file mode 100644 index a9483bb74f66d88699b09545366c32a4fe108e54..0000000000000000000000000000000000000000 Binary files a/python/examples/ocr_detection/7.jpg and /dev/null differ diff --git a/python/examples/ocr_detection/text_det_client.py b/python/examples/ocr_detection/text_det_client.py deleted file mode 100644 index aaa1c5b1179fcbf1d010bb9f6335ef2886435a83..0000000000000000000000000000000000000000 --- a/python/examples/ocr_detection/text_det_client.py +++ /dev/null @@ -1,47 +0,0 @@ -# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
-
-import os
-from paddle_serving_client import Client
-from paddle_serving_app.reader import Sequential, File2Image, ResizeByFactor
-from paddle_serving_app.reader import Div, Normalize, Transpose
-from paddle_serving_app.reader import DBPostProcess, FilterBoxes
-
-client = Client()
-client.load_client_config("ocr_det_client/serving_client_conf.prototxt")
-client.connect(["127.0.0.1:9494"])
-
-read_image_file = File2Image()
-preprocess = Sequential([
-    ResizeByFactor(32, 960), Div(255),
-    Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]), Transpose(
-        (2, 0, 1))
-])
-post_func = DBPostProcess({
-    "thresh": 0.3,
-    "box_thresh": 0.5,
-    "max_candidates": 1000,
-    "unclip_ratio": 1.5,
-    "min_size": 3
-})
-filter_func = FilterBoxes(10, 10)
-
-img = read_image_file(name)
-ori_h, ori_w, _ = img.shape
-img = preprocess(img)
-new_h, new_w, _ = img.shape
-ratio_list = [float(new_h) / ori_h, float(new_w) / ori_w]
-outputs = client.predict(feed={"image": img}, fetch=["concat_1.tmp_0"])
-dt_boxes_list = post_func(outputs["concat_1.tmp_0"], [ratio_list])
-dt_boxes = filter_func(dt_boxes_list[0], [ori_h, ori_w])
diff --git a/python/examples/pipeline/imagenet/README.md b/python/examples/pipeline/imagenet/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..d0fa99e6d72f10d3d2b5907285528b68685128e0
--- /dev/null
+++ b/python/examples/pipeline/imagenet/README.md
@@ -0,0 +1,19 @@
+# Imagenet Pipeline WebService
+
+This document takes the Imagenet service as an example to introduce how to use Pipeline WebService.
+
+## Get model
+```
+sh get_model.sh
+```
+
+## Start server
+
+```
+python resnet50_web_service.py &>log.txt &
+```
+
+## RPC test
+```
+python pipeline_rpc_client.py
+```
diff --git a/python/examples/pipeline/imagenet/README_CN.md b/python/examples/pipeline/imagenet/README_CN.md
new file mode 100644
index 0000000000000000000000000000000000000000..325a64e7a01da169978da7fc07b9252c4896f327
--- /dev/null
+++ b/python/examples/pipeline/imagenet/README_CN.md
@@ -0,0 +1,19 @@
+# Imagenet Pipeline WebService
+
+This document takes the Imagenet service as an example to introduce how to use Pipeline WebService.
+
+## Get model
+```
+sh get_model.sh
+```
+
+## Start server
+
+```
+python resnet50_web_service.py &>log.txt &
+```
+
+## RPC test
+```
+python pipeline_rpc_client.py
+```
diff --git a/python/examples/pipeline/imagenet/config.yml b/python/examples/pipeline/imagenet/config.yml
new file mode 100644
index 0000000000000000000000000000000000000000..52ddab6f3194efe7c884411bfbcd381f76ea075e
--- /dev/null
+++ b/python/examples/pipeline/imagenet/config.yml
@@ -0,0 +1,30 @@
+#worker_num is the maximum concurrency. When build_dag_each_worker=True, the framework creates worker_num processes, each building its own grpcServer and DAG
+##When build_dag_each_worker=False, the framework sets max_workers=worker_num for the main thread's grpc thread pool
+worker_num: 1
+
+#HTTP port; rpc_port and http_port must not both be empty. When rpc_port is available and http_port is empty, http_port is not generated automatically
+http_port: 18082
+rpc_port: 9999
+
+dag:
+    #op resource type: True for the thread model, False for the process model
+    is_thread_op: False
+op:
+    imagenet:
+        #When the op config has no server_endpoints, the local service config is read from local_service_conf
+        local_service_conf:
+
+            #Concurrency: thread-level when is_thread_op=True, otherwise process-level
+            concurrency: 2
+
+            #Model path
+            model_config: ResNet50_vd_model
+
+            #Compute device IDs: CPU prediction when devices is "" or unset; GPU prediction when devices is "0" or "0,1,2", listing the GPU cards to use
+            devices: "0" # "0,1"
+
+            #Client type: brpc, grpc or local_predictor. local_predictor does not start a Serving service; prediction runs in-process
+            client_type: local_predictor
+
+            #Fetch list; names follow the alias_name of fetch_var in client_config
+            fetch_list: ["score"]
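For a quick smoke test of this pipeline service, a client call might look like the sketch below (it assumes the pipeline client class is `PipelineClient` from `paddle_serving_server.pipeline`, connecting to the rpc_port 9999 configured above; the key name "image", the base64 encoding and the fetch names "label"/"prob" follow the usual pipeline examples and are assumptions here):

```python
import base64
from paddle_serving_server.pipeline import PipelineClient

client = PipelineClient()
client.connect(['127.0.0.1:9999'])

with open("daisy.jpg", "rb") as f:
    # pipeline services usually receive images as base64-encoded strings
    image = base64.b64encode(f.read()).decode("utf-8")

ret = client.predict(feed_dict={"image": image}, fetch=["label", "prob"])
print(ret)
```

diff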
--git a/python/examples/pipeline/imagenet/daisy.jpg b/python/examples/pipeline/imagenet/daisy.jpg new file mode 100644 index 0000000000000000000000000000000000000000..7edeca63e5f32e68550ef720d81f59df58a8eabc Binary files /dev/null and b/python/examples/pipeline/imagenet/daisy.jpg differ diff --git a/python/examples/pipeline/imagenet/get_model.sh b/python/examples/pipeline/imagenet/get_model.sh new file mode 100644 index 0000000000000000000000000000000000000000..1964c79a2e13dbbe373636ca1a06d2967fad7b79 --- /dev/null +++ b/python/examples/pipeline/imagenet/get_model.sh @@ -0,0 +1,5 @@ +wget --no-check-certificate https://paddle-serving.bj.bcebos.com/imagenet-example/ResNet50_vd.tar.gz +tar -xzvf ResNet50_vd.tar.gz + +wget --no-check-certificate https://paddle-serving.bj.bcebos.com/imagenet-example/image_data.tar.gz +tar -xzvf image_data.tar.gz diff --git a/python/examples/pipeline/imagenet/imagenet.label b/python/examples/pipeline/imagenet/imagenet.label new file mode 100644 index 0000000000000000000000000000000000000000..d7146735146ea1894173d6d0e20fb90af36be849 --- /dev/null +++ b/python/examples/pipeline/imagenet/imagenet.label @@ -0,0 +1,1000 @@ +tench, Tinca tinca, +goldfish, Carassius auratus, +great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias, +tiger shark, Galeocerdo cuvieri, +hammerhead, hammerhead shark, +electric ray, crampfish, numbfish, torpedo, +stingray, +cock, +hen, +ostrich, Struthio camelus, +brambling, Fringilla montifringilla, +goldfinch, Carduelis carduelis, +house finch, linnet, Carpodacus mexicanus, +junco, snowbird, +indigo bunting, indigo finch, indigo bird, Passerina cyanea, +robin, American robin, Turdus migratorius, +bulbul, +jay, +magpie, +chickadee, +water ouzel, dipper, +kite, +bald eagle, American eagle, Haliaeetus leucocephalus, +vulture, +great grey owl, great gray owl, Strix nebulosa, +European fire salamander, Salamandra salamandra, +common newt, Triturus vulgaris, +eft, +spotted salamander, Ambystoma maculatum, +axolotl, mud puppy, Ambystoma mexicanum, +bullfrog, Rana catesbeiana, +tree frog, tree-frog, +tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui, +loggerhead, loggerhead turtle, Caretta caretta, +leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea, +mud turtle, +terrapin, +box turtle, box tortoise, +banded gecko, +common iguana, iguana, Iguana iguana, +American chameleon, anole, Anolis carolinensis, +whiptail, whiptail lizard, +agama, +frilled lizard, Chlamydosaurus kingi, +alligator lizard, +Gila monster, Heloderma suspectum, +green lizard, Lacerta viridis, +African chameleon, Chamaeleo chamaeleon, +Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis, +African crocodile, Nile crocodile, Crocodylus niloticus, +American alligator, Alligator mississipiensis, +triceratops, +thunder snake, worm snake, Carphophis amoenus, +ringneck snake, ring-necked snake, ring snake, +hognose snake, puff adder, sand viper, +green snake, grass snake, +king snake, kingsnake, +garter snake, grass snake, +water snake, +vine snake, +night snake, Hypsiglena torquata, +boa constrictor, Constrictor constrictor, +rock python, rock snake, Python sebae, +Indian cobra, Naja naja, +green mamba, +sea snake, +horned viper, cerastes, sand viper, horned asp, Cerastes cornutus, +diamondback, diamondback rattlesnake, Crotalus adamanteus, +sidewinder, horned rattlesnake, Crotalus cerastes, +trilobite, +harvestman, daddy longlegs, Phalangium opilio, +scorpion, +black and gold garden spider, 
Argiope aurantia, +barn spider, Araneus cavaticus, +garden spider, Aranea diademata, +black widow, Latrodectus mactans, +tarantula, +wolf spider, hunting spider, +tick, +centipede, +black grouse, +ptarmigan, +ruffed grouse, partridge, Bonasa umbellus, +prairie chicken, prairie grouse, prairie fowl, +peacock, +quail, +partridge, +African grey, African gray, Psittacus erithacus, +macaw, +sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita, +lorikeet, +coucal, +bee eater, +hornbill, +hummingbird, +jacamar, +toucan, +drake, +red-breasted merganser, Mergus serrator, +goose, +black swan, Cygnus atratus, +tusker, +echidna, spiny anteater, anteater, +platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus, +wallaby, brush kangaroo, +koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus, +wombat, +jellyfish, +sea anemone, anemone, +brain coral, +flatworm, platyhelminth, +nematode, nematode worm, roundworm, +conch, +snail, +slug, +sea slug, nudibranch, +chiton, coat-of-mail shell, sea cradle, polyplacophore, +chambered nautilus, pearly nautilus, nautilus, +Dungeness crab, Cancer magister, +rock crab, Cancer irroratus, +fiddler crab, +king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica, +American lobster, Northern lobster, Maine lobster, Homarus americanus, +spiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish, +crayfish, crawfish, crawdad, crawdaddy, +hermit crab, +isopod, +white stork, Ciconia ciconia, +black stork, Ciconia nigra, +spoonbill, +flamingo, +little blue heron, Egretta caerulea, +American egret, great white heron, Egretta albus, +bittern, +crane, +limpkin, Aramus pictus, +European gallinule, Porphyrio porphyrio, +American coot, marsh hen, mud hen, water hen, Fulica americana, +bustard, +ruddy turnstone, Arenaria interpres, +red-backed sandpiper, dunlin, Erolia alpina, +redshank, Tringa totanus, +dowitcher, +oystercatcher, oyster catcher, +pelican, +king penguin, Aptenodytes patagonica, +albatross, mollymawk, +grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus, +killer whale, killer, orca, grampus, sea wolf, Orcinus orca, +dugong, Dugong dugon, +sea lion, +Chihuahua, +Japanese spaniel, +Maltese dog, Maltese terrier, Maltese, +Pekinese, Pekingese, Peke, +Shih-Tzu, +Blenheim spaniel, +papillon, +toy terrier, +Rhodesian ridgeback, +Afghan hound, Afghan, +basset, basset hound, +beagle, +bloodhound, sleuthhound, +bluetick, +black-and-tan coonhound, +Walker hound, Walker foxhound, +English foxhound, +redbone, +borzoi, Russian wolfhound, +Irish wolfhound, +Italian greyhound, +whippet, +Ibizan hound, Ibizan Podenco, +Norwegian elkhound, elkhound, +otterhound, otter hound, +Saluki, gazelle hound, +Scottish deerhound, deerhound, +Weimaraner, +Staffordshire bullterrier, Staffordshire bull terrier, +American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier, +Bedlington terrier, +Border terrier, +Kerry blue terrier, +Irish terrier, +Norfolk terrier, +Norwich terrier, +Yorkshire terrier, +wire-haired fox terrier, +Lakeland terrier, +Sealyham terrier, Sealyham, +Airedale, Airedale terrier, +cairn, cairn terrier, +Australian terrier, +Dandie Dinmont, Dandie Dinmont terrier, +Boston bull, Boston terrier, +miniature schnauzer, +giant schnauzer, +standard schnauzer, +Scotch terrier, Scottish terrier, Scottie, +Tibetan terrier, chrysanthemum dog, +silky terrier, Sydney silky, +soft-coated wheaten terrier, +West Highland white 
terrier, +Lhasa, Lhasa apso, +flat-coated retriever, +curly-coated retriever, +golden retriever, +Labrador retriever, +Chesapeake Bay retriever, +German short-haired pointer, +vizsla, Hungarian pointer, +English setter, +Irish setter, red setter, +Gordon setter, +Brittany spaniel, +clumber, clumber spaniel, +English springer, English springer spaniel, +Welsh springer spaniel, +cocker spaniel, English cocker spaniel, cocker, +Sussex spaniel, +Irish water spaniel, +kuvasz, +schipperke, +groenendael, +malinois, +briard, +kelpie, +komondor, +Old English sheepdog, bobtail, +Shetland sheepdog, Shetland sheep dog, Shetland, +collie, +Border collie, +Bouvier des Flandres, Bouviers des Flandres, +Rottweiler, +German shepherd, German shepherd dog, German police dog, alsatian, +Doberman, Doberman pinscher, +miniature pinscher, +Greater Swiss Mountain dog, +Bernese mountain dog, +Appenzeller, +EntleBucher, +boxer, +bull mastiff, +Tibetan mastiff, +French bulldog, +Great Dane, +Saint Bernard, St Bernard, +Eskimo dog, husky, +malamute, malemute, Alaskan malamute, +Siberian husky, +dalmatian, coach dog, carriage dog, +affenpinscher, monkey pinscher, monkey dog, +basenji, +pug, pug-dog, +Leonberg, +Newfoundland, Newfoundland dog, +Great Pyrenees, +Samoyed, Samoyede, +Pomeranian, +chow, chow chow, +keeshond, +Brabancon griffon, +Pembroke, Pembroke Welsh corgi, +Cardigan, Cardigan Welsh corgi, +toy poodle, +miniature poodle, +standard poodle, +Mexican hairless, +timber wolf, grey wolf, gray wolf, Canis lupus, +white wolf, Arctic wolf, Canis lupus tundrarum, +red wolf, maned wolf, Canis rufus, Canis niger, +coyote, prairie wolf, brush wolf, Canis latrans, +dingo, warrigal, warragal, Canis dingo, +dhole, Cuon alpinus, +African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus, +hyena, hyaena, +red fox, Vulpes vulpes, +kit fox, Vulpes macrotis, +Arctic fox, white fox, Alopex lagopus, +grey fox, gray fox, Urocyon cinereoargenteus, +tabby, tabby cat, +tiger cat, +Persian cat, +Siamese cat, Siamese, +Egyptian cat, +cougar, puma, catamount, mountain lion, painter, panther, Felis concolor, +lynx, catamount, +leopard, Panthera pardus, +snow leopard, ounce, Panthera uncia, +jaguar, panther, Panthera onca, Felis onca, +lion, king of beasts, Panthera leo, +tiger, Panthera tigris, +cheetah, chetah, Acinonyx jubatus, +brown bear, bruin, Ursus arctos, +American black bear, black bear, Ursus americanus, Euarctos americanus, +ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus, +sloth bear, Melursus ursinus, Ursus ursinus, +mongoose, +meerkat, mierkat, +tiger beetle, +ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle, +ground beetle, carabid beetle, +long-horned beetle, longicorn, longicorn beetle, +leaf beetle, chrysomelid, +dung beetle, +rhinoceros beetle, +weevil, +fly, +bee, +ant, emmet, pismire, +grasshopper, hopper, +cricket, +walking stick, walkingstick, stick insect, +cockroach, roach, +mantis, mantid, +cicada, cicala, +leafhopper, +lacewing, lacewing fly, +"dragonfly, darning needle, devils darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", +damselfly, +admiral, +ringlet, ringlet butterfly, +monarch, monarch butterfly, milkweed butterfly, Danaus plexippus, +cabbage butterfly, +sulphur butterfly, sulfur butterfly, +lycaenid, lycaenid butterfly, +starfish, sea star, +sea urchin, +sea cucumber, holothurian, +wood rabbit, cottontail, cottontail rabbit, +hare, +Angora, Angora rabbit, +hamster, +porcupine, hedgehog, +fox squirrel, eastern fox squirrel, Sciurus 
niger, +marmot, +beaver, +guinea pig, Cavia cobaya, +sorrel, +zebra, +hog, pig, grunter, squealer, Sus scrofa, +wild boar, boar, Sus scrofa, +warthog, +hippopotamus, hippo, river horse, Hippopotamus amphibius, +ox, +water buffalo, water ox, Asiatic buffalo, Bubalus bubalis, +bison, +ram, tup, +bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis, +ibex, Capra ibex, +hartebeest, +impala, Aepyceros melampus, +gazelle, +Arabian camel, dromedary, Camelus dromedarius, +llama, +weasel, +mink, +polecat, fitch, foulmart, foumart, Mustela putorius, +black-footed ferret, ferret, Mustela nigripes, +otter, +skunk, polecat, wood pussy, +badger, +armadillo, +three-toed sloth, ai, Bradypus tridactylus, +orangutan, orang, orangutang, Pongo pygmaeus, +gorilla, Gorilla gorilla, +chimpanzee, chimp, Pan troglodytes, +gibbon, Hylobates lar, +siamang, Hylobates syndactylus, Symphalangus syndactylus, +guenon, guenon monkey, +patas, hussar monkey, Erythrocebus patas, +baboon, +macaque, +langur, +colobus, colobus monkey, +proboscis monkey, Nasalis larvatus, +marmoset, +capuchin, ringtail, Cebus capucinus, +howler monkey, howler, +titi, titi monkey, +spider monkey, Ateles geoffroyi, +squirrel monkey, Saimiri sciureus, +Madagascar cat, ring-tailed lemur, Lemur catta, +indri, indris, Indri indri, Indri brevicaudatus, +Indian elephant, Elephas maximus, +African elephant, Loxodonta africana, +lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens, +giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca, +barracouta, snoek, +eel, +coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch, +rock beauty, Holocanthus tricolor, +anemone fish, +sturgeon, +gar, garfish, garpike, billfish, Lepisosteus osseus, +lionfish, +puffer, pufferfish, blowfish, globefish, +abacus, +abaya, +"academic gown, academic robe, judges robe", +accordion, piano accordion, squeeze box, +acoustic guitar, +aircraft carrier, carrier, flattop, attack aircraft carrier, +airliner, +airship, dirigible, +altar, +ambulance, +amphibian, amphibious vehicle, +analog clock, +apiary, bee house, +apron, +ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin, +assault rifle, assault gun, +backpack, back pack, knapsack, packsack, rucksack, haversack, +bakery, bakeshop, bakehouse, +balance beam, beam, +balloon, +ballpoint, ballpoint pen, ballpen, Biro, +Band Aid, +banjo, +bannister, banister, balustrade, balusters, handrail, +barbell, +barber chair, +barbershop, +barn, +barometer, +barrel, cask, +barrow, garden cart, lawn cart, wheelbarrow, +baseball, +basketball, +bassinet, +bassoon, +bathing cap, swimming cap, +bath towel, +bathtub, bathing tub, bath, tub, +beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon, +beacon, lighthouse, beacon light, pharos, +beaker, +bearskin, busby, shako, +beer bottle, +beer glass, +bell cote, bell cot, +bib, +bicycle-built-for-two, tandem bicycle, tandem, +bikini, two-piece, +binder, ring-binder, +binoculars, field glasses, opera glasses, +birdhouse, +boathouse, +bobsled, bobsleigh, bob, +bolo tie, bolo, bola tie, bola, +bonnet, poke bonnet, +bookcase, +bookshop, bookstore, bookstall, +bottlecap, +bow, +bow tie, bow-tie, bowtie, +brass, memorial tablet, plaque, +brassiere, bra, bandeau, +breakwater, groin, groyne, mole, bulwark, seawall, jetty, +breastplate, aegis, egis, +broom, +bucket, pail, +buckle, +bulletproof vest, +bullet train, bullet, +butcher shop, meat market, 
+cab, hack, taxi, taxicab, +caldron, cauldron, +candle, taper, wax light, +cannon, +canoe, +can opener, tin opener, +cardigan, +car mirror, +carousel, carrousel, merry-go-round, roundabout, whirligig, +"carpenters kit, tool kit", +carton, +car wheel, +cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM, +cassette, +cassette player, +castle, +catamaran, +CD player, +cello, violoncello, +cellular telephone, cellular phone, cellphone, cell, mobile phone, +chain, +chainlink fence, +chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour, +chain saw, chainsaw, +chest, +chiffonier, commode, +chime, bell, gong, +china cabinet, china closet, +Christmas stocking, +church, church building, +cinema, movie theater, movie theatre, movie house, picture palace, +cleaver, meat cleaver, chopper, +cliff dwelling, +cloak, +clog, geta, patten, sabot, +cocktail shaker, +coffee mug, +coffeepot, +coil, spiral, volute, whorl, helix, +combination lock, +computer keyboard, keypad, +confectionery, confectionary, candy store, +container ship, containership, container vessel, +convertible, +corkscrew, bottle screw, +cornet, horn, trumpet, trump, +cowboy boot, +cowboy hat, ten-gallon hat, +cradle, +crane, +crash helmet, +crate, +crib, cot, +Crock Pot, +croquet ball, +crutch, +cuirass, +dam, dike, dyke, +desk, +desktop computer, +dial telephone, dial phone, +diaper, nappy, napkin, +digital clock, +digital watch, +dining table, board, +dishrag, dishcloth, +dishwasher, dish washer, dishwashing machine, +disk brake, disc brake, +dock, dockage, docking facility, +dogsled, dog sled, dog sleigh, +dome, +doormat, welcome mat, +drilling platform, offshore rig, +drum, membranophone, tympan, +drumstick, +dumbbell, +Dutch oven, +electric fan, blower, +electric guitar, +electric locomotive, +entertainment center, +envelope, +espresso maker, +face powder, +feather boa, boa, +file, file cabinet, filing cabinet, +fireboat, +fire engine, fire truck, +fire screen, fireguard, +flagpole, flagstaff, +flute, transverse flute, +folding chair, +football helmet, +forklift, +fountain, +fountain pen, +four-poster, +freight car, +French horn, horn, +frying pan, frypan, skillet, +fur coat, +garbage truck, dustcart, +gasmask, respirator, gas helmet, +gas pump, gasoline pump, petrol pump, island dispenser, +goblet, +go-kart, +golf ball, +golfcart, golf cart, +gondola, +gong, tam-tam, +gown, +grand piano, grand, +greenhouse, nursery, glasshouse, +grille, radiator grille, +grocery store, grocery, food market, market, +guillotine, +hair slide, +hair spray, +half track, +hammer, +hamper, +hand blower, blow dryer, blow drier, hair dryer, hair drier, +hand-held computer, hand-held microcomputer, +handkerchief, hankie, hanky, hankey, +hard disc, hard disk, fixed disk, +harmonica, mouth organ, harp, mouth harp, +harp, +harvester, reaper, +hatchet, +holster, +home theater, home theatre, +honeycomb, +hook, claw, +hoopskirt, crinoline, +horizontal bar, high bar, +horse cart, horse-cart, +hourglass, +iPod, +iron, smoothing iron, +"jack-o-lantern", +jean, blue jean, denim, +jeep, landrover, +jersey, T-shirt, tee shirt, +jigsaw puzzle, +jinrikisha, ricksha, rickshaw, +joystick, +kimono, +knee pad, +knot, +lab coat, laboratory coat, +ladle, +lampshade, lamp shade, +laptop, laptop computer, +lawn mower, mower, +lens cap, lens cover, +letter opener, paper knife, paperknife, +library, +lifeboat, +lighter, light, igniter, ignitor, +limousine, limo, +liner, ocean liner, 
+lipstick, lip rouge, +Loafer, +lotion, +loudspeaker, speaker, speaker unit, loudspeaker system, speaker system, +"loupe, jewelers loupe", +lumbermill, sawmill, +magnetic compass, +mailbag, postbag, +mailbox, letter box, +maillot, +maillot, tank suit, +manhole cover, +maraca, +marimba, xylophone, +mask, +matchstick, +maypole, +maze, labyrinth, +measuring cup, +medicine chest, medicine cabinet, +megalith, megalithic structure, +microphone, mike, +microwave, microwave oven, +military uniform, +milk can, +minibus, +miniskirt, mini, +minivan, +missile, +mitten, +mixing bowl, +mobile home, manufactured home, +Model T, +modem, +monastery, +monitor, +moped, +mortar, +mortarboard, +mosque, +mosquito net, +motor scooter, scooter, +mountain bike, all-terrain bike, off-roader, +mountain tent, +mouse, computer mouse, +mousetrap, +moving van, +muzzle, +nail, +neck brace, +necklace, +nipple, +notebook, notebook computer, +obelisk, +oboe, hautboy, hautbois, +ocarina, sweet potato, +odometer, hodometer, mileometer, milometer, +oil filter, +organ, pipe organ, +oscilloscope, scope, cathode-ray oscilloscope, CRO, +overskirt, +oxcart, +oxygen mask, +packet, +paddle, boat paddle, +paddlewheel, paddle wheel, +padlock, +paintbrush, +"pajama, pyjama, pjs, jammies", +palace, +panpipe, pandean pipe, syrinx, +paper towel, +parachute, chute, +parallel bars, bars, +park bench, +parking meter, +passenger car, coach, carriage, +patio, terrace, +pay-phone, pay-station, +pedestal, plinth, footstall, +pencil box, pencil case, +pencil sharpener, +perfume, essence, +Petri dish, +photocopier, +pick, plectrum, plectron, +pickelhaube, +picket fence, paling, +pickup, pickup truck, +pier, +piggy bank, penny bank, +pill bottle, +pillow, +ping-pong ball, +pinwheel, +pirate, pirate ship, +pitcher, ewer, +"plane, carpenters plane, woodworking plane", +planetarium, +plastic bag, +plate rack, +plow, plough, +"plunger, plumbers helper", +Polaroid camera, Polaroid Land camera, +pole, +police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria, +poncho, +pool table, billiard table, snooker table, +pop bottle, soda bottle, +pot, flowerpot, +"potters wheel", +power drill, +prayer rug, prayer mat, +printer, +prison, prison house, +projectile, missile, +projector, +puck, hockey puck, +punching bag, punch bag, punching ball, punchball, +purse, +quill, quill pen, +quilt, comforter, comfort, puff, +racer, race car, racing car, +racket, racquet, +radiator, +radio, wireless, +radio telescope, radio reflector, +rain barrel, +recreational vehicle, RV, R.V., +reel, +reflex camera, +refrigerator, icebox, +remote control, remote, +restaurant, eating house, eating place, eatery, +revolver, six-gun, six-shooter, +rifle, +rocking chair, rocker, +rotisserie, +rubber eraser, rubber, pencil eraser, +rugby ball, +rule, ruler, +running shoe, +safe, +safety pin, +saltshaker, salt shaker, +sandal, +sarong, +sax, saxophone, +scabbard, +scale, weighing machine, +school bus, +schooner, +scoreboard, +screen, CRT screen, +screw, +screwdriver, +seat belt, seatbelt, +sewing machine, +shield, buckler, +shoe shop, shoe-shop, shoe store, +shoji, +shopping basket, +shopping cart, +shovel, +shower cap, +shower curtain, +ski, +ski mask, +sleeping bag, +slide rule, slipstick, +sliding door, +slot, one-armed bandit, +snorkel, +snowmobile, +snowplow, snowplough, +soap dispenser, +soccer ball, +sock, +solar dish, solar collector, solar furnace, +sombrero, +soup bowl, +space bar, +space heater, +space shuttle, +spatula, +speedboat, +"spider web, spiders web", 
+spindle, +sports car, sport car, +spotlight, spot, +stage, +steam locomotive, +steel arch bridge, +steel drum, +stethoscope, +stole, +stone wall, +stopwatch, stop watch, +stove, +strainer, +streetcar, tram, tramcar, trolley, trolley car, +stretcher, +studio couch, day bed, +stupa, tope, +submarine, pigboat, sub, U-boat, +suit, suit of clothes, +sundial, +sunglass, +sunglasses, dark glasses, shades, +sunscreen, sunblock, sun blocker, +suspension bridge, +swab, swob, mop, +sweatshirt, +swimming trunks, bathing trunks, +swing, +switch, electric switch, electrical switch, +syringe, +table lamp, +tank, army tank, armored combat vehicle, armoured combat vehicle, +tape player, +teapot, +teddy, teddy bear, +television, television system, +tennis ball, +thatch, thatched roof, +theater curtain, theatre curtain, +thimble, +thresher, thrasher, threshing machine, +throne, +tile roof, +toaster, +tobacco shop, tobacconist shop, tobacconist, +toilet seat, +torch, +totem pole, +tow truck, tow car, wrecker, +toyshop, +tractor, +trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi, +tray, +trench coat, +tricycle, trike, velocipede, +trimaran, +tripod, +triumphal arch, +trolleybus, trolley coach, trackless trolley, +trombone, +tub, vat, +turnstile, +typewriter keyboard, +umbrella, +unicycle, monocycle, +upright, upright piano, +vacuum, vacuum cleaner, +vase, +vault, +velvet, +vending machine, +vestment, +viaduct, +violin, fiddle, +volleyball, +waffle iron, +wall clock, +wallet, billfold, notecase, pocketbook, +wardrobe, closet, press, +warplane, military plane, +washbasin, handbasin, washbowl, lavabo, wash-hand basin, +washer, automatic washer, washing machine, +water bottle, +water jug, +water tower, +whiskey jug, +whistle, +wig, +window screen, +window shade, +Windsor tie, +wine bottle, +wing, +wok, +wooden spoon, +wool, woolen, woollen, +worm fence, snake fence, snake-rail fence, Virginia fence, +wreck, +yawl, +yurt, +web site, website, internet site, site, +comic book, +crossword puzzle, crossword, +street sign, +traffic light, traffic signal, stoplight, +book jacket, dust cover, dust jacket, dust wrapper, +menu, +plate, +guacamole, +consomme, +hot pot, hotpot, +trifle, +ice cream, icecream, +ice lolly, lolly, lollipop, popsicle, +French loaf, +bagel, beigel, +pretzel, +cheeseburger, +hotdog, hot dog, red hot, +mashed potato, +head cabbage, +broccoli, +cauliflower, +zucchini, courgette, +spaghetti squash, +acorn squash, +butternut squash, +cucumber, cuke, +artichoke, globe artichoke, +bell pepper, +cardoon, +mushroom, +Granny Smith, +strawberry, +orange, +lemon, +fig, +pineapple, ananas, +banana, +jackfruit, jak, jack, +custard apple, +pomegranate, +hay, +carbonara, +chocolate sauce, chocolate syrup, +dough, +meat loaf, meatloaf, +pizza, pizza pie, +potpie, +burrito, +red wine, +espresso, +cup, +eggnog, +alp, +bubble, +cliff, drop, drop-off, +coral reef, +geyser, +lakeside, lakeshore, +promontory, headland, head, foreland, +sandbar, sand bar, +seashore, coast, seacoast, sea-coast, +valley, vale, +volcano, +ballplayer, baseball player, +groom, bridegroom, +scuba diver, +rapeseed, +daisy, +"yellow ladys slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum", +corn, +acorn, +hip, rose hip, rosehip, +buckeye, horse chestnut, conker, +coral fungus, +agaric, +gyromitra, +stinkhorn, carrion fungus, +earthstar, +hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa, +bolete, +ear, spike, capitulum, +toilet tissue, toilet paper, bathroom tissue diff 
--git a/python/examples/pipeline/imagenet/pipeline_rpc_client.py b/python/examples/pipeline/imagenet/pipeline_rpc_client.py
new file mode 100644
index 0000000000000000000000000000000000000000..3220e6c20b27c92a59cd0c28050719a8790d648d
--- /dev/null
+++ b/python/examples/pipeline/imagenet/pipeline_rpc_client.py
@@ -0,0 +1,36 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+from paddle_serving_server_gpu.pipeline import PipelineClient
+import numpy as np
+import requests
+import json
+import cv2
+import base64
+import os
+
+client = PipelineClient()
+client.connect(['127.0.0.1:9999'])
+
+
+def cv2_to_base64(image):
+    return base64.b64encode(image).decode('utf8')
+
+
+with open("daisy.jpg", 'rb') as file:
+    image_data = file.read()
+image = cv2_to_base64(image_data)
+
+for i in range(1):
+    ret = client.predict(feed_dict={"image": image}, fetch=["label", "prob"])
+    print(ret)
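The client above base64-encodes the raw JPEG bytes; the server-side op in resnet50_web_service.py (next diff) reverses this before running the preprocessing sequence. A minimal round-trip sketch of that encode/decode path, runnable without a serving instance (it only assumes the same daisy.jpg used above; note the example's np.fromstring call is deprecated in favor of np.frombuffer):

```
# Round-trip sketch: encode an image the way pipeline_rpc_client.py does,
# then decode it the way ImagenetOp.preprocess does.
import base64

import cv2
import numpy as np

with open("daisy.jpg", "rb") as f:
    raw = f.read()
encoded = base64.b64encode(raw).decode('utf8')     # client side

data = base64.b64decode(encoded.encode('utf8'))    # server side
arr = np.frombuffer(data, dtype=np.uint8)          # np.fromstring is deprecated
img = cv2.imdecode(arr, cv2.IMREAD_COLOR)          # HxWx3 BGR ndarray
print(img.shape)
```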
diff --git a/python/examples/pipeline/imagenet/resnet50_web_service.py b/python/examples/pipeline/imagenet/resnet50_web_service.py
new file mode 100644
index 0000000000000000000000000000000000000000..ece3befee8d62c9af2e0e0a1c576a63e42d86245
--- /dev/null
+++ b/python/examples/pipeline/imagenet/resnet50_web_service.py
@@ -0,0 +1,71 @@
+# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import sys
+from paddle_serving_app.reader import Sequential, URL2Image, Resize, CenterCrop, RGB2BGR, Transpose, Div, Normalize, Base64ToImage
+try:
+    from paddle_serving_server_gpu.web_service import WebService, Op
+except ImportError:
+    from paddle_serving_server.web_service import WebService, Op
+import logging
+import numpy as np
+import base64, cv2
+
+
+class ImagenetOp(Op):
+    def init_op(self):
+        self.seq = Sequential([
+            Resize(256), CenterCrop(224), RGB2BGR(), Transpose((2, 0, 1)),
+            Div(255), Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225],
+                                True)
+        ])
+        self.label_dict = {}
+        label_idx = 0
+        with open("imagenet.label") as fin:
+            for line in fin:
+                self.label_dict[label_idx] = line.strip()
+                label_idx += 1
+
+    def preprocess(self, input_dicts, data_id, log_id):
+        (_, input_dict), = input_dicts.items()
+        data = base64.b64decode(input_dict["image"].encode('utf8'))
+        data = np.fromstring(data, np.uint8)
+        # Note: class variables(self.var) can only be used in process op mode
+        im = cv2.imdecode(data, cv2.IMREAD_COLOR)
+        img = self.seq(im)
+        return {"image": img[np.newaxis, :].copy()}, False, None, ""
+
+    def postprocess(self, input_dicts, fetch_dict, log_id):
+        print(fetch_dict)
+        score_list = fetch_dict["score"]
+        result = {"label": [], "prob": []}
+        for score in score_list:
+            score = score.tolist()
+            max_score = max(score)
+            result["label"].append(self.label_dict[score.index(max_score)]
+                                   .strip().replace(",", ""))
+            result["prob"].append(max_score)
+        result["label"] = str(result["label"])
+        result["prob"] = str(result["prob"])
+        return result, None, ""
+
+
+class ImageService(WebService):
+    def get_pipeline_response(self, read_op):
+        image_op = ImagenetOp(name="imagenet", input_ops=[read_op])
+        return image_op
+
+
+uci_service = ImageService(name="imagenet")
+uci_service.prepare_pipeline_config("config.yml")
+uci_service.run_service()
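For readers new to the pipeline Op interface, a minimal skeleton of the contract used by ImagenetOp above may help (a sketch, not part of this diff; EchoOp is a hypothetical pass-through): preprocess returns (feed_dict, skip_process_flag, error_channeldata, log_info), and postprocess returns (result_dict, error_channeldata, log_info), matching the signatures in resnet50_web_service.py.

```
# Hypothetical pass-through op illustrating the return signatures used above.
try:
    from paddle_serving_server_gpu.web_service import Op
except ImportError:
    from paddle_serving_server.web_service import Op


class EchoOp(Op):
    def init_op(self):
        # one-time setup: load label files, build preprocessing pipelines, ...
        pass

    def preprocess(self, input_dicts, data_id, log_id):
        # a single upstream op feeds this op, so unpack its output dict
        (_, input_dict), = input_dicts.items()
        # (feed_dict, skip_process_flag, error_channeldata, log_info)
        return input_dict, False, None, ""

    def postprocess(self, input_dicts, fetch_dict, log_id):
        # (result_dict, error_channeldata, log_info)
        return {k: str(v) for k, v in fetch_dict.items()}, None, ""
```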
diff --git a/python/examples/pipeline/imdb_model_ensemble/README.md b/python/examples/pipeline/imdb_model_ensemble/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..f90024f6af04b7d50fe90eb6e04b71dd703300e3
--- /dev/null
+++ b/python/examples/pipeline/imdb_model_ensemble/README.md
@@ -0,0 +1,19 @@
+# IMDB model ensemble examples
+
+## Get models
+```
+sh get_data.sh
+```
+
+## Start servers
+
+```
+python -m paddle_serving_server.serve --model imdb_cnn_model --port 9292 &> cnn.log &
+python -m paddle_serving_server.serve --model imdb_bow_model --port 9393 &> bow.log &
+python test_pipeline_server.py &>pipeline.log &
+```
+
+## Start clients
+```
+python test_pipeline_client.py
+```
diff --git a/python/examples/pipeline/imdb_model_ensemble/README_CN.md b/python/examples/pipeline/imdb_model_ensemble/README_CN.md
index 88eeab70c470268775ad22fd65a6d1b999a6b167..fd4785292c3bfa731f76666b7d4e12e4e285fbda 100644
--- a/python/examples/pipeline/imdb_model_ensemble/README_CN.md
+++ b/python/examples/pipeline/imdb_model_ensemble/README_CN.md
@@ -8,8 +8,8 @@ sh get_data.sh
 ## 启动服务
 
 ```
-python -m paddle_serving_server_gpu.serve --model imdb_cnn_model --port 9292 &> cnn.log &
-python -m paddle_serving_server_gpu.serve --model imdb_bow_model --port 9393 &> bow.log &
+python -m paddle_serving_server.serve --model imdb_cnn_model --port 9292 &> cnn.log &
+python -m paddle_serving_server.serve --model imdb_bow_model --port 9393 &> bow.log &
 python test_pipeline_server.py &>pipeline.log &
 ```
@@ -17,8 +17,3 @@ python test_pipeline_server.py &>pipeline.log &
 ```
 python test_pipeline_client.py
 ```
-
-## HTTP 测试
-```
-curl -X POST -k http://localhost:9999/prediction -d '{"key": ["words"], "value": ["i am very sad | 0"]}'
-```
diff --git a/python/examples/pipeline/imdb_model_ensemble/config.yml b/python/examples/pipeline/imdb_model_ensemble/config.yml
index 0853033fdccc643c459e19e2e0a573c3091ba9a9..2f25fa861f3ec50d15d5d5795e5e25dbf801e861 100644
--- a/python/examples/pipeline/imdb_model_ensemble/config.yml
+++ b/python/examples/pipeline/imdb_model_ensemble/config.yml
@@ -1,22 +1,100 @@
-rpc_port: 18080
+#RPC port. rpc_port and http_port must not both be empty; when rpc_port is empty and http_port is not, rpc_port is set to http_port+1 automatically.
+rpc_port: 18070
+
+#HTTP port. rpc_port and http_port must not both be empty; when rpc_port is available and http_port is empty, no http_port is generated automatically.
+http_port: 18071
+
+#worker_num: maximum concurrency. When build_dag_each_worker=True, the framework creates worker_num processes, each building a gRPC server and a DAG.
+#When build_dag_each_worker=False, the framework sets max_workers=worker_num for the main thread's gRPC thread pool.
 worker_num: 4
-build_dag_each_worker: false
+
+#build_dag_each_worker: False builds one DAG inside the process; True builds an independent DAG per worker process.
+build_dag_each_worker: False
+
 dag:
-    is_thread_op: true
+    #Op resource type: True for the thread model, False for the process model.
+    is_thread_op: True
+
+    #Number of retries.
    retry: 1
-    use_profile: false
+
+    #Profiling: True generates Timeline performance data, at some cost to performance; False disables it.
+    use_profile: False
+
+    #Maximum channel length, 0 by default.
+    channel_size: 0
+
+    #tracer: tracks framework throughput and the activity of every OP and channel; no data is generated without it.
+    tracer:
+        #Interval between traces, in seconds.
+        interval_s: 10
 op:
     bow:
-        concurrency: 2
-        remote_service_conf:
-            client_type: brpc
-            model_config: imdb_bow_model
-            devices: ""
-            rpc_port : 9393
+        #Concurrency: threads when is_thread_op=True, processes otherwise.
+        concurrency: 1
+
+        #Client connection type, brpc.
+        client_type: brpc
+
+        #Number of retries when calling Serving; no retry by default.
+        retry: 1
+
+        #Timeout for calling Serving, in ms.
+        timeout: 3000
+
+        #Serving IPs.
+        server_endpoints: ["127.0.0.1:9393"]
+
+        #Client-side configuration of the bow model.
+        client_config: "imdb_bow_client_conf/serving_client_conf.prototxt"
+
+        #Fetch list, keyed by the alias_name of fetch_var in client_config.
+        fetch_list: ["prediction"]
+
+        #Batch size per Serving query, 1 by default. batch_size>1 requires auto_batching_timeout, otherwise requests block until batch_size is reached.
+        batch_size: 1
+
+        #Batching timeout, used together with batch_size.
+        auto_batching_timeout: 2000
     cnn:
-        concurrency: 2
-        remote_service_conf:
-            client_type: brpc
-            model_config: imdb_cnn_model
-            devices: ""
-            rpc_port : 9292
+        #Concurrency: threads when is_thread_op=True, processes otherwise.
+        concurrency: 1
+
+        #Client connection type, brpc.
+        client_type: brpc
+
+        #Number of retries when calling Serving; no retry by default.
+        retry: 1
+
+        #Timeout, in ms.
+        timeout: 3000
+
+        #Serving IPs.
+        server_endpoints: ["127.0.0.1:9292"]
+
+        #Client-side configuration of the cnn model.
+        client_config: "imdb_cnn_client_conf/serving_client_conf.prototxt"
+
+        #Fetch list, keyed by the alias_name of fetch_var in client_config.
+        fetch_list: ["prediction"]
+
+        #Batch size per Serving query, 1 by default. batch_size>1 requires auto_batching_timeout, otherwise requests block until batch_size is reached.
+        batch_size: 1
+
+        #Batching timeout, used together with batch_size.
+        auto_batching_timeout: 2000
+    combine:
+        #Concurrency: threads when is_thread_op=True, processes otherwise.
+        concurrency: 1
+
+        #Number of retries when calling Serving; no retry by default.
+        retry: 1
+
+        #Timeout, in ms.
+        timeout: 3000
+
+        #Batch size per Serving query, 1 by default. batch_size>1 requires auto_batching_timeout, otherwise requests block until batch_size is reached.
+        batch_size: 1
+
+        #Batching timeout, used together with batch_size.
+        auto_batching_timeout: 2000
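For orientation, a short sketch of how the two ports above are consumed once this server is running. The RPC endpoint matches the client change below; the HTTP payload format mirrors the curl example removed from README_CN.md, though the exact route on http_port is an assumption here:

```
# RPC path: PipelineClient connects to rpc_port (18070 above).
from paddle_serving_server.pipeline import PipelineClient

client = PipelineClient()
client.connect(['127.0.0.1:18070'])

# HTTP path: a plain POST to http_port (18071 above); route assumed.
import json
import requests

data = {"key": ["words"], "value": ["i am very sad | 0"]}
print(requests.post("http://127.0.0.1:18071/prediction",
                    data=json.dumps(data)).json())
```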
diff --git a/python/examples/pipeline/imdb_model_ensemble/test_pipeline_client.py b/python/examples/pipeline/imdb_model_ensemble/test_pipeline_client.py
index 765ab7fd5a02a4af59b0773135bc59c802464b42..1737f8f782a25025547f68be6619c237975f5172 100644
--- a/python/examples/pipeline/imdb_model_ensemble/test_pipeline_client.py
+++ b/python/examples/pipeline/imdb_model_ensemble/test_pipeline_client.py
@@ -15,21 +15,22 @@ from paddle_serving_server.pipeline import PipelineClient
 import numpy as np
 
 client = PipelineClient()
-client.connect(['127.0.0.1:18080'])
+client.connect(['127.0.0.1:18070'])
 
 words = 'i am very sad | 0'
 
 futures = []
-for i in range(4):
+for i in range(100):
     futures.append(
         client.predict(
-            feed_dict={"words": words},
+            feed_dict={"words": words,
+                       "logid": 10000 + i},
             fetch=["prediction"],
             asyn=True,
             profile=False))
 
 for f in futures:
     res = f.result()
-    if res["ecode"] != 0:
+    if res.err_no != 0:
         print("predict failed: {}".format(res))
     print(res)
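Since the loop now issues 100 asynchronous requests, wrapping it in a timer gives a rough end-to-end throughput estimate. A self-contained sketch, assuming the pipeline server from this example is running on 18070:

```
# Time the asynchronous requests to estimate throughput.
import time

from paddle_serving_server.pipeline import PipelineClient

client = PipelineClient()
client.connect(['127.0.0.1:18070'])
words = 'i am very sad | 0'

start = time.time()
futures = [
    client.predict(feed_dict={"words": words, "logid": 10000 + i},
                   fetch=["prediction"], asyn=True, profile=False)
    for i in range(100)
]
results = [f.result() for f in futures]
elapsed = time.time() - start

failed = sum(1 for r in results if r.err_no != 0)
print("{} requests, {} failed, {:.2f}s ({:.1f} qps)".format(
    len(results), failed, elapsed, len(results) / elapsed))
```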
diff --git a/python/examples/pipeline/imdb_model_ensemble/test_pipeline_server.py b/python/examples/pipeline/imdb_model_ensemble/test_pipeline_server.py
index 92a15379c0b6ae1ad0cdc1401a01556e41c7eed7..35171a3910baf0af3ac6c83e521744906f49c948 100644
--- a/python/examples/pipeline/imdb_model_ensemble/test_pipeline_server.py
+++ b/python/examples/pipeline/imdb_model_ensemble/test_pipeline_server.py
@@ -15,10 +15,14 @@ from paddle_serving_server.pipeline import Op, RequestOp, ResponseOp
 from paddle_serving_server.pipeline import PipelineServer
 from paddle_serving_server.pipeline.proto import pipeline_service_pb2
-from paddle_serving_server.pipeline.channel import ChannelDataEcode
+from paddle_serving_server.pipeline.channel import ChannelDataErrcode
 import numpy as np
-from paddle_serving_app.reader import IMDBDataset
+from paddle_serving_app.reader.imdb_reader import IMDBDataset
 import logging
+try:
+    from paddle_serving_server.web_service import WebService
+except ImportError:
+    from paddle_serving_server_gpu.web_service import WebService
 
 _LOGGER = logging.getLogger()
 user_handler = logging.StreamHandler()
@@ -41,74 +45,68 @@ class ImdbRequestOp(RequestOp):
                 continue
             words = request.value[idx]
             word_ids, _ = self.imdb_dataset.get_words_and_label(words)
-            dictdata[key] = np.array(word_ids)
-        return dictdata
+            word_len = len(word_ids)
+            dictdata[key] = np.array(word_ids).reshape(word_len, 1)
+            dictdata["{}.lod".format(key)] = np.array([0, word_len])
+
+        log_id = None
+        if request.logid is not None:
+            log_id = request.logid
+        return dictdata, log_id, None, ""
 
 
 class CombineOp(Op):
-    def preprocess(self, input_data):
+    def preprocess(self, input_data, data_id, log_id):
+        #_LOGGER.info("Enter CombineOp::preprocess")
         combined_prediction = 0
         for op_name, data in input_data.items():
             _LOGGER.info("{}: {}".format(op_name, data["prediction"]))
             combined_prediction += data["prediction"]
         data = {"prediction": combined_prediction / 2}
-        return data
+        return data, False, None, ""
 
 
 class ImdbResponseOp(ResponseOp):
     # Here ImdbResponseOp is consistent with the default ResponseOp implementation
     def pack_response_package(self, channeldata):
         resp = pipeline_service_pb2.Response()
-        resp.ecode = channeldata.ecode
-        if resp.ecode == ChannelDataEcode.OK.value:
+        resp.err_no = channeldata.error_code
+        if resp.err_no == ChannelDataErrcode.OK.value:
             feed = channeldata.parse()
             # ndarray to string
             for name, var in feed.items():
                 resp.value.append(var.__repr__())
                 resp.key.append(name)
         else:
-            resp.error_info = channeldata.error_info
+            resp.err_msg = channeldata.error_info
         return resp
 
 
 read_op = ImdbRequestOp()
-bow_op = Op(name="bow",
-            input_ops=[read_op],
-            server_endpoints=["127.0.0.1:9393"],
-            fetch_list=["prediction"],
-            client_config="imdb_bow_client_conf/serving_client_conf.prototxt",
-            concurrency=1,
-            timeout=-1,
-            retry=1,
-            batch_size=3,
-            auto_batching_timeout=1000)
-cnn_op = Op(name="cnn",
-            input_ops=[read_op],
-            server_endpoints=["127.0.0.1:9292"],
-            fetch_list=["prediction"],
-            client_config="imdb_cnn_client_conf/serving_client_conf.prototxt",
-            concurrency=1,
-            timeout=-1,
-            retry=1,
-            batch_size=1,
-            auto_batching_timeout=None)
-combine_op = CombineOp(
-    name="combine",
-    input_ops=[bow_op, cnn_op],
-    concurrency=1,
-    timeout=-1,
-    retry=1,
-    batch_size=2,
-    auto_batching_timeout=None)
+
+
+class BowOp(Op):
+    def init_op(self):
+        pass
+
+
+class CnnOp(Op):
+    def init_op(self):
+        pass
+
+
+bow_op = BowOp("bow", input_ops=[read_op])
+cnn_op = CnnOp("cnn", input_ops=[read_op])
+combine_op = CombineOp("combine", input_ops=[bow_op, cnn_op])
 
 # fetch output of bow_op
-# response_op = ImdbResponseOp(input_ops=[bow_op])
+#response_op = ImdbResponseOp(input_ops=[bow_op])
 
 # fetch output of combine_op
 response_op = ImdbResponseOp(input_ops=[combine_op])
 
 # use default ResponseOp implementation
-# response_op = ResponseOp(input_ops=[combine_op])
+#response_op = ResponseOp(input_ops=[combine_op])
 
 server = PipelineServer()
 server.set_response_op(response_op)
diff --git a/python/examples/pipeline/ocr/README.md b/python/examples/pipeline/ocr/README.md
index f51789fc5e419d715141ba59dc49011d4f306e56..de7bcaa2ece7f9fa7ba56de533e8e4dd023ad1f3 100644
--- a/python/examples/pipeline/ocr/README.md
+++ b/python/examples/pipeline/ocr/README.md
@@ -28,31 +28,9 @@ python web_service.py &>log.txt &
 python pipeline_http_client.py
 ```
-
-
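The OCR README above keeps the `python pipeline_http_client.py` test step. A sketch of what such a client typically looks like; the port, route, and image path are assumptions here (they depend on the OCR example's config.yml and service name, which are not part of this diff):

```
# Hypothetical pipeline HTTP client for the OCR example.
import base64
import json

import requests

with open("imgs/1.jpg", "rb") as f:  # assumed test image path
    image = base64.b64encode(f.read()).decode('utf8')

data = {"key": ["image"], "value": [image]}
resp = requests.post("http://127.0.0.1:9999/ocr/prediction",
                     data=json.dumps(data))
print(resp.json())
```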