diff --git a/doc/GRPC_IMPL_CN.md b/doc/GRPC_IMPL_CN.md index 7b10907caec98ae5754126a7ec54096cc4cd48af..9e7ecd268fe0900c1085479c1f96fa083629758c 100644 --- a/doc/GRPC_IMPL_CN.md +++ b/doc/GRPC_IMPL_CN.md @@ -1,52 +1,137 @@ -# gRPC Interface +# How to Use the gRPC Interface + + - [1. Comparison with the bRPC interface](#1-comparison-with-the-brpc-interface) + - [1.1 Server side](#11-server-side) + - [1.2 Client side](#12-client-side) + - [1.3 Other differences](#13-other-differences) + - [2. Example: linear regression prediction service](#2-example-linear-regression-prediction-service) + - [Get the data](#get-the-data) + - [Start the gRPC server](#start-the-grpc-server) + - [Client-side prediction](#client-side-prediction) + - [Synchronous prediction](#synchronous-prediction) + - [Asynchronous prediction](#asynchronous-prediction) + - [Batch prediction](#batch-prediction) + - [General pb prediction](#general-pb-prediction) + - [Prediction timeout](#prediction-timeout) + - [List input](#list-input) + - [3. More examples](#3-more-examples) + +With the gRPC interface, clients can run on Windows/Linux/macOS and be written in different languages. The gRPC interface is structured as follows: + +![](https://github.com/PaddlePaddle/Serving/blob/develop/doc/grpc_impl.png) + +## 1. Comparison with the bRPC interface + +#### 1.1 Server side + +* The gRPC server's `load_model_config` function adds a `client_config_path` parameter: -The gRPC interface is implemented in a form similar to a Web Service: - -![](grpc_impl.png) - -## Comparison with the bRPC interface - -1. The gRPC server's `load_model_config` function adds a `client_config_path` parameter: - - ```python + ``` def load_model_config(self, server_config_paths, client_config_path=None) ``` + In some examples the bRPC server and bRPC client configuration files differ (e.g. in the cube local example, the client data is first handed to cube, and only after cube processes it is it passed to the inference library), so in that case the gRPC server needs the gRPC client configuration `client_config_path` to be set manually. + **`client_config_path` defaults to `<server_config_path>/serving_server_conf.prototxt`.** - In some examples the bRPC server and bRPC client configuration files may differ (e.g. in the cube local example, the client data is first handed to cube, and only after cube processes it is it passed to the inference library), so the gRPC server needs the gRPC client configuration; and to drop the manual configuration-loading step on the gRPC client, the gRPC server is designed to load both configuration files. `client_config_path` defaults to `<server_config_path>/serving_server_conf.prototxt`. +#### 1.2 Client side -2. The gRPC client drops the `load_client_config` step: +* The gRPC client drops the `load_client_config` step: The corresponding prototxt is fetched over RPC during `connect` (fetching it from any single endpoint is enough). -3. The gRPC client sets the timeout over RPC (the call signature stays consistent with the bRPC client) +* The gRPC client sets the timeout over RPC (the call signature stays consistent with the bRPC client) Because the bRPC client cannot change its timeout after `connect`, when the gRPC server receives a request to change the timeout it recreates the bRPC client instance with the new timeout, and the gRPC client sets the gRPC deadline accordingly. **Note: the timeout-setting interface and the inference interface must not be called at the same time (they are not thread safe); for performance reasons no lock is taken for now.** -4. The gRPC client's `predict` function adds `asyn` and `is_python` parameters: +* The gRPC client's `predict` function adds `asyn` and `is_python` parameters: - ```python + ``` def predict(self, feed, fetch, need_variant_tag=False, asyn=False, is_python=True) ``` - Here `asyn` selects asynchronous calling. With `asyn=True` the call is asynchronous and returns a `MultiLangPredictFuture` object; call `MultiLangPredictFuture.result()` to block until the prediction is available. With `asyn=False` the call is synchronous. +1. `asyn` selects asynchronous calling. With `asyn=True` the call is asynchronous and returns a `MultiLangPredictFuture` object; call `MultiLangPredictFuture.result()` to block until the prediction is available. With `asyn=False` the call is synchronous. + +2. 
`is_python` selects the proto format. With `is_python=True`, data is transferred as numpy bytes, which currently applies only to Python; with `is_python=False`, data is transferred in the plain format, which is more general. Transfer with numpy bytes takes far less time than the plain format (see [#654](https://github.com/PaddlePaddle/Serving/pull/654)). + +#### 1.3 Other differences + +* Exception handling: when the bRPC client inside the gRPC server fails to predict (returns `None`), the gRPC client also returns `None`. Other gRPC exceptions are caught inside the client, and a "status_code" field is added to the returned fetch_map to indicate whether the prediction succeeded (see the timeout example). + +* Because gRPC only supports the pick_first and round_robin load-balancing policies, the ABTEST feature is not yet on par with bRPC. + +* System compatibility: + * [x] CentOS + * [x] macOS + * [x] Windows + +* Client languages already supported: + + - Python + - Java + - Go + + +## 2. Example: linear regression prediction service + +The following is a linear regression prediction example implemented over gRPC; the full code is at this [link](https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/grpc_impl_example/fit_a_line) +#### Get the data + +```shell +sh get_data.sh +``` + +#### Start the gRPC server + +``` shell +python test_server.py uci_housing_model/ +``` + +You can also start the default gRPC service with the following one-liner: + +```shell +python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9393 --use_multilang +``` +Note: the --use_multilang flag enables the multi-language client. + +### Client-side prediction + +#### Synchronous prediction + +``` shell +python test_sync_client.py +``` + +#### Asynchronous prediction + +``` shell +python test_asyn_client.py +``` + +#### Batch prediction + +``` shell +python test_batch_client.py +``` - `is_python` selects the proto format. With `is_python=True`, data is transferred as numpy bytes, which currently applies only to Python; with `is_python=False`, data is transferred in the plain format, which is more general. Transfer with numpy bytes takes far less time than the plain format (see [#654](https://github.com/PaddlePaddle/Serving/pull/654)). +#### General pb prediction -5. Exception handling: when the bRPC client inside the gRPC server fails to predict (returns `None`), the gRPC client also returns `None`. Other gRPC exceptions are caught inside the client, and a "status_code" field is added to the returned fetch_map to indicate whether the prediction succeeded (see the timeout example). +``` shell +python test_general_pb_client.py +``` -6. Because gRPC only supports the pick_first and round_robin load-balancing policies, the ABTEST feature is not yet on par with bRPC. +#### Prediction timeout -7. The gRPC version has been verified to work on Windows and macOS. +``` shell +python test_timeout_client.py +``` -8. Client languages planned for support: +#### List input - - [x] Python - - [ ] Java - - [ ] Go - - [ ] JavaScript +``` shell +python test_list_input_client.py +``` -## Some Python-side examples +## 3. More examples -See the example files under `python/examples/grpc_impl_example`. +See the example files under [`python/examples/grpc_impl_example`](https://github.com/PaddlePaddle/Serving/tree/develop/python/examples/grpc_impl_example). diff --git a/doc/INFERENCE_TO_SERVING.md b/doc/INFERENCE_TO_SERVING.md index e10ee976fb455c8cc49a0d5fa44ed4cc1f300ba9..719aa63c0a9b408d6bff628e7be4f35dfb49c5c8 100644 --- a/doc/INFERENCE_TO_SERVING.md +++ b/doc/INFERENCE_TO_SERVING.md @@ -24,13 +24,13 @@ inference_model_dir = "your_inference_model" serving_client_dir = "serving_client_dir" serving_server_dir = "serving_server_dir" feed_var_names, fetch_var_names = inference_model_to_serving( - inference_model_dir, serving_client_dir, serving_server_dir) + inference_model_dir, serving_server_dir, serving_client_dir) ``` If your model file and params file are saved as separate standalone files, please use the following API. 
``` feed_var_names, fetch_var_names = inference_model_to_serving( - inference_model_dir, serving_client_dir, serving_server_dir, + inference_model_dir, serving_server_dir, serving_client_dir, model_filename="model", params_filename="params") ``` diff --git a/doc/INFERENCE_TO_SERVING_CN.md b/doc/INFERENCE_TO_SERVING_CN.md index e7e909ac04be3b1a0885b3390d99a153dfbd170e..5d783f25a3f367baa94d471e50f227d9e6f733d1 100644 --- a/doc/INFERENCE_TO_SERVING_CN.md +++ b/doc/INFERENCE_TO_SERVING_CN.md @@ -23,11 +23,11 @@ inference_model_dir = "your_inference_model" serving_client_dir = "serving_client_dir" serving_server_dir = "serving_server_dir" feed_var_names, fetch_var_names = inference_model_to_serving( - inference_model_dir, serving_client_dir, serving_server_dir) + inference_model_dir, serving_server_dir, serving_client_dir) ``` If the model comes with a model description file `model_filename` and a model parameters file `params_filename`, then use ``` feed_var_names, fetch_var_names = inference_model_to_serving( - inference_model_dir, serving_client_dir, serving_server_dir, + inference_model_dir, serving_server_dir, serving_client_dir, model_filename="model", params_filename="params") ``` diff --git a/java/examples/pom.xml b/java/examples/pom.xml index b6c8bc424f5d528d74a4a45828fd9b5e7e5d008e..745e8d4f0f3d47e488f99bd7fe73ed6a9f887373 100644 --- a/java/examples/pom.xml +++ b/java/examples/pom.xml @@ -75,7 +75,7 @@ <dependency> <groupId>junit</groupId> <artifactId>junit</artifactId> - <version>4.11</version> + <version>4.13.1</version> <scope>test</scope> </dependency> <dependency> diff --git a/python/examples/bert/bert_client.py b/python/examples/bert/bert_client.py index b378f9f791bce4abfe79b068c1875d9b66f1791c..4111589b3ddfde980e415fbac1a5f38f4abafada 100644 --- a/python/examples/bert/bert_client.py +++ b/python/examples/bert/bert_client.py @@ -33,5 +33,5 @@ for line in sys.stdin: for key in feed_dict.keys(): feed_dict[key] = np.array(feed_dict[key]).reshape((128, 1)) #print(feed_dict) - result = client.predict(feed=feed_dict, fetch=fetch, batch=True) + result = client.predict(feed=feed_dict, fetch=fetch, batch=False) print(result) diff --git a/python/examples/bert/bert_web_service.py b/python/examples/bert/bert_web_service.py index e1260dd1c2942fc806f6fd6b2199feb9467a8c2b..7cd34fb99e0ecebbf2f6bec47e9c9d163ac3a44c 100644 --- a/python/examples/bert/bert_web_service.py +++ b/python/examples/bert/bert_web_service.py @@ -29,7 +29,7 @@ class BertService(WebService): def preprocess(self, feed=[], fetch=[]): feed_res = [] - is_batch = True + is_batch = False for ins in feed: feed_dict = self.reader.process(ins["words"].encode("utf-8")) for key in feed_dict.keys(): diff --git a/python/examples/faster_rcnn_model/benchmark.py b/python/examples/faster_rcnn_model/benchmark.py new file mode 100755 index 0000000000000000000000000000000000000000..1930312341c0dac55e43b36c946c6e174a472b65 --- /dev/null +++ b/python/examples/faster_rcnn_model/benchmark.py @@ -0,0 +1,125 @@ +# -*- coding: utf-8 -*- +# +# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+# See the License for the specific language governing permissions and +# limitations under the License. +# pylint: disable=doc-string-missing + +from __future__ import unicode_literals, absolute_import +import os +import sys +import time +import json +import requests +from paddle_serving_client import Client +from paddle_serving_client.utils import MultiThreadRunner +from paddle_serving_client.utils import benchmark_args, show_latency +from paddle_serving_app.reader import ChineseBertReader + +from paddle_serving_app.reader import * +import numpy as np + + + +args = benchmark_args() + + +def single_func(idx, resource): + img="./000000570688.jpg" + profile_flags = False + latency_flags = False + if os.getenv("FLAGS_profile_client"): + profile_flags = True + if os.getenv("FLAGS_serving_latency"): + latency_flags = True + latency_list = [] + + if args.request == "rpc": + preprocess = Sequential([ + File2Image(), BGR2RGB(), Div(255.0), + Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225], False), + Resize(640, 640), Transpose((2, 0, 1)) + ]) + + postprocess = RCNNPostprocess("label_list.txt", "output") + client = Client() + + client.load_client_config(args.model) + client.connect([resource["endpoint"][idx % len(resource["endpoint"])]]) + + turns = resource["turns"] + start = time.time() + for i in range(turns): + if args.batch_size >= 1: + l_start = time.time() + feed_batch = [] + b_start = time.time() + im = preprocess(img) + for bi in range(args.batch_size): + feed_batch.append({"image": im, + "im_info": np.array(list(im.shape[1:]) + [1.0]), + "im_shape": np.array(list(im.shape[1:]) + [1.0])}) + # im = preprocess(img) + b_end = time.time() + + if profile_flags: + sys.stderr.write( + "PROFILE\tpid:{}\tbert_pre_0:{} bert_pre_1:{}\n".format( + os.getpid(), + int(round(b_start * 1000000)), + int(round(b_end * 1000000)))) + #result = client.predict(feed=feed_batch, fetch=fetch) + fetch_map = client.predict( + feed=feed_batch, + fetch=["multiclass_nms"]) + fetch_map["image"] = img + postprocess(fetch_map) + + l_end = time.time() + if latency_flags: + latency_list.append(l_end * 1000 - l_start * 1000) + else: + print("unsupport batch size {}".format(args.batch_size)) + else: + raise ValueError("not implemented {} request".format(args.request)) + end = time.time() + if latency_flags: + return [[end - start], latency_list] + else: + return [[end - start]] + + +if __name__ == '__main__': + multi_thread_runner = MultiThreadRunner() + endpoint_list = [ + "127.0.0.1:7777" + ] + turns = 10 + start = time.time() + result = multi_thread_runner.run( + single_func, args.thread, {"endpoint": endpoint_list,"turns": turns}) + end = time.time() + total_cost = end - start + + avg_cost = 0 + for i in range(args.thread): + avg_cost += result[0][i] + avg_cost = avg_cost / args.thread + + print("total cost: {}s".format(total_cost)) + print("each thread cost: {}s. 
".format(avg_cost)) + print("qps: {}samples/s".format(args.batch_size * args.thread * turns / + total_cost)) + if os.getenv("FLAGS_serving_latency"): + show_latency(result[1]) diff --git a/python/examples/faster_rcnn_model/benchmark.sh b/python/examples/faster_rcnn_model/benchmark.sh new file mode 100755 index 0000000000000000000000000000000000000000..5706fd03c7a0e266bcac18b0544c64f327cbbe9b --- /dev/null +++ b/python/examples/faster_rcnn_model/benchmark.sh @@ -0,0 +1,52 @@ +rm profile_log* +export CUDA_VISIBLE_DEVICES=0 +export FLAGS_profile_server=1 +export FLAGS_profile_client=1 +export FLAGS_serving_latency=1 + +gpu_id=0 +#save cpu and gpu utilization log +if [ -d utilization ];then + rm -rf utilization +else + mkdir utilization +fi +#start server +$PYTHONROOT/bin/python3 -m paddle_serving_server_gpu.serve --model $1 --port 7777 --thread 4 --gpu_ids 0 --ir_optim > elog 2>&1 & +sleep 5 + +#warm up +$PYTHONROOT/bin/python3 benchmark.py --thread 4 --batch_size 1 --model $2/serving_client_conf.prototxt --request rpc > profile 2>&1 +echo -e "import psutil\ncpu_utilization=psutil.cpu_percent(1,False)\nprint('CPU_UTILIZATION:', cpu_utilization)\n" > cpu_utilization.py +for thread_num in 1 4 8 16 +do +for batch_size in 1 +do + job_bt=`date '+%Y%m%d%H%M%S'` + nvidia-smi --id=0 --query-compute-apps=used_memory --format=csv -lms 100 > gpu_use.log 2>&1 & + nvidia-smi --id=0 --query-gpu=utilization.gpu --format=csv -lms 100 > gpu_utilization.log 2>&1 & + gpu_memory_pid=$! + $PYTHONROOT/bin/python3 benchmark.py --thread $thread_num --batch_size $batch_size --model $2/serving_client_conf.prototxt --request rpc > profile 2>&1 + kill ${gpu_memory_pid} + kill `ps -ef|grep used_memory|awk '{print $2}'` + echo "model_name:" $1 + echo "thread_num:" $thread_num + echo "batch_size:" $batch_size + echo "=================Done====================" + echo "model_name:$1" >> profile_log_$1 + echo "batch_size:$batch_size" >> profile_log_$1 + $PYTHONROOT/bin/python3 cpu_utilization.py >> profile_log_$1 + job_et=`date '+%Y%m%d%H%M%S'` + awk 'BEGIN {max = 0} {if(NR>1){if ($1 > max) max=$1}} END {print "MAX_GPU_MEMORY:", max}' gpu_use.log >> profile_log_$1 + awk 'BEGIN {max = 0} {if(NR>1){if ($1 > max) max=$1}} END {print "GPU_UTILIZATION:", max}' gpu_utilization.log >> profile_log_$1 + rm -rf gpu_use.log gpu_utilization.log + $PYTHONROOT/bin/python3 ../util/show_profile.py profile $thread_num >> profile_log_$1 + tail -n 8 profile >> profile_log_$1 + echo "" >> profile_log_$1 +done +done + +#Divided log +awk 'BEGIN{RS="\n\n"}{i++}{print > "bert_log_"i}' profile_log_$1 +mkdir bert_log && mv bert_log_* bert_log +ps -ef|grep 'serving'|grep -v grep|cut -c 9-15 | xargs kill -9 diff --git a/python/examples/imagenet/resnet50_rpc_client.py b/python/examples/imagenet/resnet50_rpc_client.py index 7888ab6302b483672ec1d7270f7db0c551f1778d..b23f99175b97a011c3b1c72d3b7358b646c54e68 100644 --- a/python/examples/imagenet/resnet50_rpc_client.py +++ b/python/examples/imagenet/resnet50_rpc_client.py @@ -38,7 +38,8 @@ start = time.time() image_file = "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg" for i in range(10): img = seq(image_file) - fetch_map = client.predict(feed={"image": img}, fetch=["score"]) + fetch_map = client.predict( + feed={"image": img}, fetch=["score"], batch=False) prob = max(fetch_map["score"][0]) label = label_dict[fetch_map["score"][0].tolist().index(prob)].strip( ).replace(",", "") diff --git a/python/examples/imagenet/resnet50_web_service.py b/python/examples/imagenet/resnet50_web_service.py 
index 4c9822757ce233498ef9ec2baf5f3fcac7bc1ccb..7033103717621807ecd74093bf5eba8d31a9b877 100644 --- a/python/examples/imagenet/resnet50_web_service.py +++ b/python/examples/imagenet/resnet50_web_service.py @@ -13,7 +13,7 @@ # limitations under the License. import sys from paddle_serving_client import Client - +import numpy as np from paddle_serving_app.reader import Sequential, URL2Image, Resize, CenterCrop, RGB2BGR, Transpose, Div, Normalize, Base64ToImage if len(sys.argv) != 4: @@ -44,12 +44,13 @@ class ImageService(WebService): def preprocess(self, feed=[], fetch=[]): feed_batch = [] + is_batch = True for ins in feed: if "image" not in ins: raise ("feed data error!") img = self.seq(ins["image"]) feed_batch.append({"image": img[np.newaxis, :]}) - return feed_batch, fetch + return feed_batch, fetch, is_batch def postprocess(self, feed=[], fetch=[], fetch_map={}): score_list = fetch_map["score"] diff --git a/python/examples/imdb/text_classify_service.py b/python/examples/imdb/text_classify_service.py index 1d292194f963466d3e53859dc9e4c6da1789ea20..ca1e26002baf0284f282add235706080f7902c33 100755 --- a/python/examples/imdb/text_classify_service.py +++ b/python/examples/imdb/text_classify_service.py @@ -29,13 +29,14 @@ class IMDBService(WebService): def preprocess(self, feed={}, fetch=[]): feed_batch = [] words_lod = [0] + is_batch = True for ins in feed: words = self.dataset.get_words_only(ins["words"]) words = np.array(words).reshape(len(words), 1) words_lod.append(words_lod[-1] + len(words)) feed_batch.append(words) feed = {"words": np.concatenate(feed_batch), "words.lod": words_lod} - return feed, fetch + return feed, fetch, is_batch imdb_service = IMDBService(name="imdb") diff --git a/python/examples/lac/lac_web_service.py b/python/examples/lac/lac_web_service.py index bed89f54b626c0cce55767f8edacc3dd33f0104c..cf37f66294bd154324f2c7cacd1a35571b6c6350 100644 --- a/python/examples/lac/lac_web_service.py +++ b/python/examples/lac/lac_web_service.py @@ -15,6 +15,7 @@ from paddle_serving_server.web_service import WebService import sys from paddle_serving_app.reader import LACReader +import numpy as np class LACService(WebService): @@ -23,13 +24,21 @@ class LACService(WebService): def preprocess(self, feed={}, fetch=[]): feed_batch = [] + fetch = ["crf_decode"] + lod_info = [0] + is_batch = True for ins in feed: if "words" not in ins: raise ("feed data error!") feed_data = self.reader.process(ins["words"]) - feed_batch.append({"words": feed_data}) - fetch = ["crf_decode"] - return feed_batch, fetch + feed_batch.append(np.array(feed_data).reshape(len(feed_data), 1)) + lod_info.append(lod_info[-1] + len(feed_data)) + feed_dict = { + "words": np.concatenate( + feed_batch, axis=0), + "words.lod": lod_info + } + return feed_dict, fetch, is_batch def postprocess(self, feed={}, fetch=[], fetch_map={}): batch_ret = [] diff --git a/python/examples/pipeline/imagenet/README.md b/python/examples/pipeline/imagenet/README.md new file mode 100644 index 0000000000000000000000000000000000000000..d0fa99e6d72f10d3d2b5907285528b68685128e0 --- /dev/null +++ b/python/examples/pipeline/imagenet/README.md @@ -0,0 +1,19 @@ +# Imagenet Pipeline WebService + +This document takes the Imagenet service as an example to introduce how to use Pipeline WebService; a minimal client sketch follows below. 
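+
+For quick reference, the RPC test at the end of this README boils down to the following minimal sketch. It is condensed from the `pipeline_rpc_client.py` shipped with this example, and it assumes the `rpc_port: 9999` set in the provided `config.yml`:
+
+```
+import base64
+from paddle_serving_server_gpu.pipeline import PipelineClient
+
+# connect to the pipeline RPC port configured in config.yml
+client = PipelineClient()
+client.connect(['127.0.0.1:9999'])
+
+# the service expects a base64-encoded image under the "image" key
+with open("daisy.jpg", 'rb') as f:
+    image = base64.b64encode(f.read()).decode('utf8')
+
+ret = client.predict(feed_dict={"image": image}, fetch=["label", "prob"])
+print(ret)
+```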
+ +## Get model +``` +sh get_model.sh +``` + +## Start server + +``` +python resnet50_web_service.py &>log.txt & +``` + +## RPC test +``` +python pipeline_rpc_client.py +``` diff --git a/python/examples/pipeline/imagenet/README_CN.md b/python/examples/pipeline/imagenet/README_CN.md new file mode 100644 index 0000000000000000000000000000000000000000..325a64e7a01da169978da7fc07b9252c4896f327 --- /dev/null +++ b/python/examples/pipeline/imagenet/README_CN.md @@ -0,0 +1,19 @@ +# Imagenet Pipeline WebService + +This document takes the Imagenet service as an example to introduce how to use Pipeline WebService. + +## Get model +``` +sh get_model.sh +``` + +## Start server + +``` +python resnet50_web_service.py &>log.txt & +``` + +## Test +``` +python pipeline_rpc_client.py +``` diff --git a/python/examples/pipeline/imagenet/config.yml b/python/examples/pipeline/imagenet/config.yml new file mode 100644 index 0000000000000000000000000000000000000000..52ddab6f3194efe7c884411bfbcd381f76ea075e --- /dev/null +++ b/python/examples/pipeline/imagenet/config.yml @@ -0,0 +1,30 @@ +#worker_num: maximum concurrency. When build_dag_each_worker=True, the framework creates worker_num processes, each building its own grpc server and DAG +##When build_dag_each_worker=False, the framework sets max_workers=worker_num for the grpc thread pool of the main thread +worker_num: 1 + +#http port; rpc_port and http_port must not both be empty. When rpc_port is valid and http_port is empty, no http_port is generated automatically +http_port: 18082 +rpc_port: 9999 + +dag: + #op resource type: True for the thread model, False for the process model + is_thread_op: False +op: + imagenet: + #when the op config has no server_endpoints, the local service config is read from local_service_conf + local_service_conf: + + #concurrency: thread-level when is_thread_op=True, otherwise process-level + concurrency: 2 + + #model path + model_config: ResNet50_vd_model + + #compute device IDs: "" or unset means CPU inference; "0" or "0,1,2" means GPU inference on the listed cards + devices: "0" # "0,1" + + #client type: one of brpc, grpc and local_predictor. local_predictor does not start a Serving service; inference runs in-process + client_type: local_predictor + + #fetch list, using the alias_name of fetch_var in the client_config + fetch_list: ["score"] diff --git a/python/examples/pipeline/imagenet/daisy.jpg b/python/examples/pipeline/imagenet/daisy.jpg new file mode 100644 index 0000000000000000000000000000000000000000..7edeca63e5f32e68550ef720d81f59df58a8eabc Binary files /dev/null and b/python/examples/pipeline/imagenet/daisy.jpg differ diff --git a/python/examples/pipeline/imagenet/get_model.sh b/python/examples/pipeline/imagenet/get_model.sh new file mode 100644 index 0000000000000000000000000000000000000000..1964c79a2e13dbbe373636ca1a06d2967fad7b79 --- /dev/null +++ b/python/examples/pipeline/imagenet/get_model.sh @@ -0,0 +1,5 @@ +wget --no-check-certificate https://paddle-serving.bj.bcebos.com/imagenet-example/ResNet50_vd.tar.gz +tar -xzvf ResNet50_vd.tar.gz + +wget --no-check-certificate https://paddle-serving.bj.bcebos.com/imagenet-example/image_data.tar.gz +tar -xzvf image_data.tar.gz diff --git a/python/examples/pipeline/imagenet/imagenet.label b/python/examples/pipeline/imagenet/imagenet.label new file mode 100644 index 0000000000000000000000000000000000000000..d7146735146ea1894173d6d0e20fb90af36be849 --- /dev/null +++ b/python/examples/pipeline/imagenet/imagenet.label @@ -0,0 +1,1000 @@ +tench, Tinca tinca, +goldfish, Carassius auratus, +great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias, +tiger shark, Galeocerdo cuvieri, +hammerhead, hammerhead shark, +electric ray, crampfish, numbfish, torpedo, +stingray, 
+cock, +hen, +ostrich, Struthio camelus, +brambling, Fringilla montifringilla, +goldfinch, Carduelis carduelis, +house finch, linnet, Carpodacus mexicanus, +junco, snowbird, +indigo bunting, indigo finch, indigo bird, Passerina cyanea, +robin, American robin, Turdus migratorius, +bulbul, +jay, +magpie, +chickadee, +water ouzel, dipper, +kite, +bald eagle, American eagle, Haliaeetus leucocephalus, +vulture, +great grey owl, great gray owl, Strix nebulosa, +European fire salamander, Salamandra salamandra, +common newt, Triturus vulgaris, +eft, +spotted salamander, Ambystoma maculatum, +axolotl, mud puppy, Ambystoma mexicanum, +bullfrog, Rana catesbeiana, +tree frog, tree-frog, +tailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui, +loggerhead, loggerhead turtle, Caretta caretta, +leatherback turtle, leatherback, leathery turtle, Dermochelys coriacea, +mud turtle, +terrapin, +box turtle, box tortoise, +banded gecko, +common iguana, iguana, Iguana iguana, +American chameleon, anole, Anolis carolinensis, +whiptail, whiptail lizard, +agama, +frilled lizard, Chlamydosaurus kingi, +alligator lizard, +Gila monster, Heloderma suspectum, +green lizard, Lacerta viridis, +African chameleon, Chamaeleo chamaeleon, +Komodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis, +African crocodile, Nile crocodile, Crocodylus niloticus, +American alligator, Alligator mississipiensis, +triceratops, +thunder snake, worm snake, Carphophis amoenus, +ringneck snake, ring-necked snake, ring snake, +hognose snake, puff adder, sand viper, +green snake, grass snake, +king snake, kingsnake, +garter snake, grass snake, +water snake, +vine snake, +night snake, Hypsiglena torquata, +boa constrictor, Constrictor constrictor, +rock python, rock snake, Python sebae, +Indian cobra, Naja naja, +green mamba, +sea snake, +horned viper, cerastes, sand viper, horned asp, Cerastes cornutus, +diamondback, diamondback rattlesnake, Crotalus adamanteus, +sidewinder, horned rattlesnake, Crotalus cerastes, +trilobite, +harvestman, daddy longlegs, Phalangium opilio, +scorpion, +black and gold garden spider, Argiope aurantia, +barn spider, Araneus cavaticus, +garden spider, Aranea diademata, +black widow, Latrodectus mactans, +tarantula, +wolf spider, hunting spider, +tick, +centipede, +black grouse, +ptarmigan, +ruffed grouse, partridge, Bonasa umbellus, +prairie chicken, prairie grouse, prairie fowl, +peacock, +quail, +partridge, +African grey, African gray, Psittacus erithacus, +macaw, +sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita, +lorikeet, +coucal, +bee eater, +hornbill, +hummingbird, +jacamar, +toucan, +drake, +red-breasted merganser, Mergus serrator, +goose, +black swan, Cygnus atratus, +tusker, +echidna, spiny anteater, anteater, +platypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus, +wallaby, brush kangaroo, +koala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus, +wombat, +jellyfish, +sea anemone, anemone, +brain coral, +flatworm, platyhelminth, +nematode, nematode worm, roundworm, +conch, +snail, +slug, +sea slug, nudibranch, +chiton, coat-of-mail shell, sea cradle, polyplacophore, +chambered nautilus, pearly nautilus, nautilus, +Dungeness crab, Cancer magister, +rock crab, Cancer irroratus, +fiddler crab, +king crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica, +American lobster, Northern lobster, Maine lobster, Homarus americanus, +spiny lobster, langouste, rock lobster, crawfish, crayfish, sea 
crawfish, +crayfish, crawfish, crawdad, crawdaddy, +hermit crab, +isopod, +white stork, Ciconia ciconia, +black stork, Ciconia nigra, +spoonbill, +flamingo, +little blue heron, Egretta caerulea, +American egret, great white heron, Egretta albus, +bittern, +crane, +limpkin, Aramus pictus, +European gallinule, Porphyrio porphyrio, +American coot, marsh hen, mud hen, water hen, Fulica americana, +bustard, +ruddy turnstone, Arenaria interpres, +red-backed sandpiper, dunlin, Erolia alpina, +redshank, Tringa totanus, +dowitcher, +oystercatcher, oyster catcher, +pelican, +king penguin, Aptenodytes patagonica, +albatross, mollymawk, +grey whale, gray whale, devilfish, Eschrichtius gibbosus, Eschrichtius robustus, +killer whale, killer, orca, grampus, sea wolf, Orcinus orca, +dugong, Dugong dugon, +sea lion, +Chihuahua, +Japanese spaniel, +Maltese dog, Maltese terrier, Maltese, +Pekinese, Pekingese, Peke, +Shih-Tzu, +Blenheim spaniel, +papillon, +toy terrier, +Rhodesian ridgeback, +Afghan hound, Afghan, +basset, basset hound, +beagle, +bloodhound, sleuthhound, +bluetick, +black-and-tan coonhound, +Walker hound, Walker foxhound, +English foxhound, +redbone, +borzoi, Russian wolfhound, +Irish wolfhound, +Italian greyhound, +whippet, +Ibizan hound, Ibizan Podenco, +Norwegian elkhound, elkhound, +otterhound, otter hound, +Saluki, gazelle hound, +Scottish deerhound, deerhound, +Weimaraner, +Staffordshire bullterrier, Staffordshire bull terrier, +American Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier, +Bedlington terrier, +Border terrier, +Kerry blue terrier, +Irish terrier, +Norfolk terrier, +Norwich terrier, +Yorkshire terrier, +wire-haired fox terrier, +Lakeland terrier, +Sealyham terrier, Sealyham, +Airedale, Airedale terrier, +cairn, cairn terrier, +Australian terrier, +Dandie Dinmont, Dandie Dinmont terrier, +Boston bull, Boston terrier, +miniature schnauzer, +giant schnauzer, +standard schnauzer, +Scotch terrier, Scottish terrier, Scottie, +Tibetan terrier, chrysanthemum dog, +silky terrier, Sydney silky, +soft-coated wheaten terrier, +West Highland white terrier, +Lhasa, Lhasa apso, +flat-coated retriever, +curly-coated retriever, +golden retriever, +Labrador retriever, +Chesapeake Bay retriever, +German short-haired pointer, +vizsla, Hungarian pointer, +English setter, +Irish setter, red setter, +Gordon setter, +Brittany spaniel, +clumber, clumber spaniel, +English springer, English springer spaniel, +Welsh springer spaniel, +cocker spaniel, English cocker spaniel, cocker, +Sussex spaniel, +Irish water spaniel, +kuvasz, +schipperke, +groenendael, +malinois, +briard, +kelpie, +komondor, +Old English sheepdog, bobtail, +Shetland sheepdog, Shetland sheep dog, Shetland, +collie, +Border collie, +Bouvier des Flandres, Bouviers des Flandres, +Rottweiler, +German shepherd, German shepherd dog, German police dog, alsatian, +Doberman, Doberman pinscher, +miniature pinscher, +Greater Swiss Mountain dog, +Bernese mountain dog, +Appenzeller, +EntleBucher, +boxer, +bull mastiff, +Tibetan mastiff, +French bulldog, +Great Dane, +Saint Bernard, St Bernard, +Eskimo dog, husky, +malamute, malemute, Alaskan malamute, +Siberian husky, +dalmatian, coach dog, carriage dog, +affenpinscher, monkey pinscher, monkey dog, +basenji, +pug, pug-dog, +Leonberg, +Newfoundland, Newfoundland dog, +Great Pyrenees, +Samoyed, Samoyede, +Pomeranian, +chow, chow chow, +keeshond, +Brabancon griffon, +Pembroke, Pembroke Welsh corgi, +Cardigan, Cardigan Welsh corgi, +toy poodle, +miniature 
poodle, +standard poodle, +Mexican hairless, +timber wolf, grey wolf, gray wolf, Canis lupus, +white wolf, Arctic wolf, Canis lupus tundrarum, +red wolf, maned wolf, Canis rufus, Canis niger, +coyote, prairie wolf, brush wolf, Canis latrans, +dingo, warrigal, warragal, Canis dingo, +dhole, Cuon alpinus, +African hunting dog, hyena dog, Cape hunting dog, Lycaon pictus, +hyena, hyaena, +red fox, Vulpes vulpes, +kit fox, Vulpes macrotis, +Arctic fox, white fox, Alopex lagopus, +grey fox, gray fox, Urocyon cinereoargenteus, +tabby, tabby cat, +tiger cat, +Persian cat, +Siamese cat, Siamese, +Egyptian cat, +cougar, puma, catamount, mountain lion, painter, panther, Felis concolor, +lynx, catamount, +leopard, Panthera pardus, +snow leopard, ounce, Panthera uncia, +jaguar, panther, Panthera onca, Felis onca, +lion, king of beasts, Panthera leo, +tiger, Panthera tigris, +cheetah, chetah, Acinonyx jubatus, +brown bear, bruin, Ursus arctos, +American black bear, black bear, Ursus americanus, Euarctos americanus, +ice bear, polar bear, Ursus Maritimus, Thalarctos maritimus, +sloth bear, Melursus ursinus, Ursus ursinus, +mongoose, +meerkat, mierkat, +tiger beetle, +ladybug, ladybeetle, lady beetle, ladybird, ladybird beetle, +ground beetle, carabid beetle, +long-horned beetle, longicorn, longicorn beetle, +leaf beetle, chrysomelid, +dung beetle, +rhinoceros beetle, +weevil, +fly, +bee, +ant, emmet, pismire, +grasshopper, hopper, +cricket, +walking stick, walkingstick, stick insect, +cockroach, roach, +mantis, mantid, +cicada, cicala, +leafhopper, +lacewing, lacewing fly, +"dragonfly, darning needle, devils darning needle, sewing needle, snake feeder, snake doctor, mosquito hawk, skeeter hawk", +damselfly, +admiral, +ringlet, ringlet butterfly, +monarch, monarch butterfly, milkweed butterfly, Danaus plexippus, +cabbage butterfly, +sulphur butterfly, sulfur butterfly, +lycaenid, lycaenid butterfly, +starfish, sea star, +sea urchin, +sea cucumber, holothurian, +wood rabbit, cottontail, cottontail rabbit, +hare, +Angora, Angora rabbit, +hamster, +porcupine, hedgehog, +fox squirrel, eastern fox squirrel, Sciurus niger, +marmot, +beaver, +guinea pig, Cavia cobaya, +sorrel, +zebra, +hog, pig, grunter, squealer, Sus scrofa, +wild boar, boar, Sus scrofa, +warthog, +hippopotamus, hippo, river horse, Hippopotamus amphibius, +ox, +water buffalo, water ox, Asiatic buffalo, Bubalus bubalis, +bison, +ram, tup, +bighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis, +ibex, Capra ibex, +hartebeest, +impala, Aepyceros melampus, +gazelle, +Arabian camel, dromedary, Camelus dromedarius, +llama, +weasel, +mink, +polecat, fitch, foulmart, foumart, Mustela putorius, +black-footed ferret, ferret, Mustela nigripes, +otter, +skunk, polecat, wood pussy, +badger, +armadillo, +three-toed sloth, ai, Bradypus tridactylus, +orangutan, orang, orangutang, Pongo pygmaeus, +gorilla, Gorilla gorilla, +chimpanzee, chimp, Pan troglodytes, +gibbon, Hylobates lar, +siamang, Hylobates syndactylus, Symphalangus syndactylus, +guenon, guenon monkey, +patas, hussar monkey, Erythrocebus patas, +baboon, +macaque, +langur, +colobus, colobus monkey, +proboscis monkey, Nasalis larvatus, +marmoset, +capuchin, ringtail, Cebus capucinus, +howler monkey, howler, +titi, titi monkey, +spider monkey, Ateles geoffroyi, +squirrel monkey, Saimiri sciureus, +Madagascar cat, ring-tailed lemur, Lemur catta, +indri, indris, Indri indri, Indri brevicaudatus, +Indian elephant, Elephas maximus, +African elephant, Loxodonta 
africana, +lesser panda, red panda, panda, bear cat, cat bear, Ailurus fulgens, +giant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca, +barracouta, snoek, +eel, +coho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch, +rock beauty, Holocanthus tricolor, +anemone fish, +sturgeon, +gar, garfish, garpike, billfish, Lepisosteus osseus, +lionfish, +puffer, pufferfish, blowfish, globefish, +abacus, +abaya, +"academic gown, academic robe, judges robe", +accordion, piano accordion, squeeze box, +acoustic guitar, +aircraft carrier, carrier, flattop, attack aircraft carrier, +airliner, +airship, dirigible, +altar, +ambulance, +amphibian, amphibious vehicle, +analog clock, +apiary, bee house, +apron, +ashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin, +assault rifle, assault gun, +backpack, back pack, knapsack, packsack, rucksack, haversack, +bakery, bakeshop, bakehouse, +balance beam, beam, +balloon, +ballpoint, ballpoint pen, ballpen, Biro, +Band Aid, +banjo, +bannister, banister, balustrade, balusters, handrail, +barbell, +barber chair, +barbershop, +barn, +barometer, +barrel, cask, +barrow, garden cart, lawn cart, wheelbarrow, +baseball, +basketball, +bassinet, +bassoon, +bathing cap, swimming cap, +bath towel, +bathtub, bathing tub, bath, tub, +beach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon, +beacon, lighthouse, beacon light, pharos, +beaker, +bearskin, busby, shako, +beer bottle, +beer glass, +bell cote, bell cot, +bib, +bicycle-built-for-two, tandem bicycle, tandem, +bikini, two-piece, +binder, ring-binder, +binoculars, field glasses, opera glasses, +birdhouse, +boathouse, +bobsled, bobsleigh, bob, +bolo tie, bolo, bola tie, bola, +bonnet, poke bonnet, +bookcase, +bookshop, bookstore, bookstall, +bottlecap, +bow, +bow tie, bow-tie, bowtie, +brass, memorial tablet, plaque, +brassiere, bra, bandeau, +breakwater, groin, groyne, mole, bulwark, seawall, jetty, +breastplate, aegis, egis, +broom, +bucket, pail, +buckle, +bulletproof vest, +bullet train, bullet, +butcher shop, meat market, +cab, hack, taxi, taxicab, +caldron, cauldron, +candle, taper, wax light, +cannon, +canoe, +can opener, tin opener, +cardigan, +car mirror, +carousel, carrousel, merry-go-round, roundabout, whirligig, +"carpenters kit, tool kit", +carton, +car wheel, +cash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM, +cassette, +cassette player, +castle, +catamaran, +CD player, +cello, violoncello, +cellular telephone, cellular phone, cellphone, cell, mobile phone, +chain, +chainlink fence, +chain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour, +chain saw, chainsaw, +chest, +chiffonier, commode, +chime, bell, gong, +china cabinet, china closet, +Christmas stocking, +church, church building, +cinema, movie theater, movie theatre, movie house, picture palace, +cleaver, meat cleaver, chopper, +cliff dwelling, +cloak, +clog, geta, patten, sabot, +cocktail shaker, +coffee mug, +coffeepot, +coil, spiral, volute, whorl, helix, +combination lock, +computer keyboard, keypad, +confectionery, confectionary, candy store, +container ship, containership, container vessel, +convertible, +corkscrew, bottle screw, +cornet, horn, trumpet, trump, +cowboy boot, +cowboy hat, ten-gallon hat, +cradle, +crane, +crash helmet, +crate, +crib, cot, +Crock Pot, +croquet ball, +crutch, +cuirass, +dam, dike, dyke, +desk, +desktop computer, +dial 
telephone, dial phone, +diaper, nappy, napkin, +digital clock, +digital watch, +dining table, board, +dishrag, dishcloth, +dishwasher, dish washer, dishwashing machine, +disk brake, disc brake, +dock, dockage, docking facility, +dogsled, dog sled, dog sleigh, +dome, +doormat, welcome mat, +drilling platform, offshore rig, +drum, membranophone, tympan, +drumstick, +dumbbell, +Dutch oven, +electric fan, blower, +electric guitar, +electric locomotive, +entertainment center, +envelope, +espresso maker, +face powder, +feather boa, boa, +file, file cabinet, filing cabinet, +fireboat, +fire engine, fire truck, +fire screen, fireguard, +flagpole, flagstaff, +flute, transverse flute, +folding chair, +football helmet, +forklift, +fountain, +fountain pen, +four-poster, +freight car, +French horn, horn, +frying pan, frypan, skillet, +fur coat, +garbage truck, dustcart, +gasmask, respirator, gas helmet, +gas pump, gasoline pump, petrol pump, island dispenser, +goblet, +go-kart, +golf ball, +golfcart, golf cart, +gondola, +gong, tam-tam, +gown, +grand piano, grand, +greenhouse, nursery, glasshouse, +grille, radiator grille, +grocery store, grocery, food market, market, +guillotine, +hair slide, +hair spray, +half track, +hammer, +hamper, +hand blower, blow dryer, blow drier, hair dryer, hair drier, +hand-held computer, hand-held microcomputer, +handkerchief, hankie, hanky, hankey, +hard disc, hard disk, fixed disk, +harmonica, mouth organ, harp, mouth harp, +harp, +harvester, reaper, +hatchet, +holster, +home theater, home theatre, +honeycomb, +hook, claw, +hoopskirt, crinoline, +horizontal bar, high bar, +horse cart, horse-cart, +hourglass, +iPod, +iron, smoothing iron, +"jack-o-lantern", +jean, blue jean, denim, +jeep, landrover, +jersey, T-shirt, tee shirt, +jigsaw puzzle, +jinrikisha, ricksha, rickshaw, +joystick, +kimono, +knee pad, +knot, +lab coat, laboratory coat, +ladle, +lampshade, lamp shade, +laptop, laptop computer, +lawn mower, mower, +lens cap, lens cover, +letter opener, paper knife, paperknife, +library, +lifeboat, +lighter, light, igniter, ignitor, +limousine, limo, +liner, ocean liner, +lipstick, lip rouge, +Loafer, +lotion, +loudspeaker, speaker, speaker unit, loudspeaker system, speaker system, +"loupe, jewelers loupe", +lumbermill, sawmill, +magnetic compass, +mailbag, postbag, +mailbox, letter box, +maillot, +maillot, tank suit, +manhole cover, +maraca, +marimba, xylophone, +mask, +matchstick, +maypole, +maze, labyrinth, +measuring cup, +medicine chest, medicine cabinet, +megalith, megalithic structure, +microphone, mike, +microwave, microwave oven, +military uniform, +milk can, +minibus, +miniskirt, mini, +minivan, +missile, +mitten, +mixing bowl, +mobile home, manufactured home, +Model T, +modem, +monastery, +monitor, +moped, +mortar, +mortarboard, +mosque, +mosquito net, +motor scooter, scooter, +mountain bike, all-terrain bike, off-roader, +mountain tent, +mouse, computer mouse, +mousetrap, +moving van, +muzzle, +nail, +neck brace, +necklace, +nipple, +notebook, notebook computer, +obelisk, +oboe, hautboy, hautbois, +ocarina, sweet potato, +odometer, hodometer, mileometer, milometer, +oil filter, +organ, pipe organ, +oscilloscope, scope, cathode-ray oscilloscope, CRO, +overskirt, +oxcart, +oxygen mask, +packet, +paddle, boat paddle, +paddlewheel, paddle wheel, +padlock, +paintbrush, +"pajama, pyjama, pjs, jammies", +palace, +panpipe, pandean pipe, syrinx, +paper towel, +parachute, chute, +parallel bars, bars, +park bench, +parking meter, +passenger car, coach, carriage, 
+patio, terrace, +pay-phone, pay-station, +pedestal, plinth, footstall, +pencil box, pencil case, +pencil sharpener, +perfume, essence, +Petri dish, +photocopier, +pick, plectrum, plectron, +pickelhaube, +picket fence, paling, +pickup, pickup truck, +pier, +piggy bank, penny bank, +pill bottle, +pillow, +ping-pong ball, +pinwheel, +pirate, pirate ship, +pitcher, ewer, +"plane, carpenters plane, woodworking plane", +planetarium, +plastic bag, +plate rack, +plow, plough, +"plunger, plumbers helper", +Polaroid camera, Polaroid Land camera, +pole, +police van, police wagon, paddy wagon, patrol wagon, wagon, black Maria, +poncho, +pool table, billiard table, snooker table, +pop bottle, soda bottle, +pot, flowerpot, +"potters wheel", +power drill, +prayer rug, prayer mat, +printer, +prison, prison house, +projectile, missile, +projector, +puck, hockey puck, +punching bag, punch bag, punching ball, punchball, +purse, +quill, quill pen, +quilt, comforter, comfort, puff, +racer, race car, racing car, +racket, racquet, +radiator, +radio, wireless, +radio telescope, radio reflector, +rain barrel, +recreational vehicle, RV, R.V., +reel, +reflex camera, +refrigerator, icebox, +remote control, remote, +restaurant, eating house, eating place, eatery, +revolver, six-gun, six-shooter, +rifle, +rocking chair, rocker, +rotisserie, +rubber eraser, rubber, pencil eraser, +rugby ball, +rule, ruler, +running shoe, +safe, +safety pin, +saltshaker, salt shaker, +sandal, +sarong, +sax, saxophone, +scabbard, +scale, weighing machine, +school bus, +schooner, +scoreboard, +screen, CRT screen, +screw, +screwdriver, +seat belt, seatbelt, +sewing machine, +shield, buckler, +shoe shop, shoe-shop, shoe store, +shoji, +shopping basket, +shopping cart, +shovel, +shower cap, +shower curtain, +ski, +ski mask, +sleeping bag, +slide rule, slipstick, +sliding door, +slot, one-armed bandit, +snorkel, +snowmobile, +snowplow, snowplough, +soap dispenser, +soccer ball, +sock, +solar dish, solar collector, solar furnace, +sombrero, +soup bowl, +space bar, +space heater, +space shuttle, +spatula, +speedboat, +"spider web, spiders web", +spindle, +sports car, sport car, +spotlight, spot, +stage, +steam locomotive, +steel arch bridge, +steel drum, +stethoscope, +stole, +stone wall, +stopwatch, stop watch, +stove, +strainer, +streetcar, tram, tramcar, trolley, trolley car, +stretcher, +studio couch, day bed, +stupa, tope, +submarine, pigboat, sub, U-boat, +suit, suit of clothes, +sundial, +sunglass, +sunglasses, dark glasses, shades, +sunscreen, sunblock, sun blocker, +suspension bridge, +swab, swob, mop, +sweatshirt, +swimming trunks, bathing trunks, +swing, +switch, electric switch, electrical switch, +syringe, +table lamp, +tank, army tank, armored combat vehicle, armoured combat vehicle, +tape player, +teapot, +teddy, teddy bear, +television, television system, +tennis ball, +thatch, thatched roof, +theater curtain, theatre curtain, +thimble, +thresher, thrasher, threshing machine, +throne, +tile roof, +toaster, +tobacco shop, tobacconist shop, tobacconist, +toilet seat, +torch, +totem pole, +tow truck, tow car, wrecker, +toyshop, +tractor, +trailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi, +tray, +trench coat, +tricycle, trike, velocipede, +trimaran, +tripod, +triumphal arch, +trolleybus, trolley coach, trackless trolley, +trombone, +tub, vat, +turnstile, +typewriter keyboard, +umbrella, +unicycle, monocycle, +upright, upright piano, +vacuum, vacuum cleaner, +vase, +vault, +velvet, +vending machine, 
+vestment, +viaduct, +violin, fiddle, +volleyball, +waffle iron, +wall clock, +wallet, billfold, notecase, pocketbook, +wardrobe, closet, press, +warplane, military plane, +washbasin, handbasin, washbowl, lavabo, wash-hand basin, +washer, automatic washer, washing machine, +water bottle, +water jug, +water tower, +whiskey jug, +whistle, +wig, +window screen, +window shade, +Windsor tie, +wine bottle, +wing, +wok, +wooden spoon, +wool, woolen, woollen, +worm fence, snake fence, snake-rail fence, Virginia fence, +wreck, +yawl, +yurt, +web site, website, internet site, site, +comic book, +crossword puzzle, crossword, +street sign, +traffic light, traffic signal, stoplight, +book jacket, dust cover, dust jacket, dust wrapper, +menu, +plate, +guacamole, +consomme, +hot pot, hotpot, +trifle, +ice cream, icecream, +ice lolly, lolly, lollipop, popsicle, +French loaf, +bagel, beigel, +pretzel, +cheeseburger, +hotdog, hot dog, red hot, +mashed potato, +head cabbage, +broccoli, +cauliflower, +zucchini, courgette, +spaghetti squash, +acorn squash, +butternut squash, +cucumber, cuke, +artichoke, globe artichoke, +bell pepper, +cardoon, +mushroom, +Granny Smith, +strawberry, +orange, +lemon, +fig, +pineapple, ananas, +banana, +jackfruit, jak, jack, +custard apple, +pomegranate, +hay, +carbonara, +chocolate sauce, chocolate syrup, +dough, +meat loaf, meatloaf, +pizza, pizza pie, +potpie, +burrito, +red wine, +espresso, +cup, +eggnog, +alp, +bubble, +cliff, drop, drop-off, +coral reef, +geyser, +lakeside, lakeshore, +promontory, headland, head, foreland, +sandbar, sand bar, +seashore, coast, seacoast, sea-coast, +valley, vale, +volcano, +ballplayer, baseball player, +groom, bridegroom, +scuba diver, +rapeseed, +daisy, +"yellow ladys slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum", +corn, +acorn, +hip, rose hip, rosehip, +buckeye, horse chestnut, conker, +coral fungus, +agaric, +gyromitra, +stinkhorn, carrion fungus, +earthstar, +hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa, +bolete, +ear, spike, capitulum, +toilet tissue, toilet paper, bathroom tissue diff --git a/python/examples/pipeline/imagenet/pipeline_rpc_client.py b/python/examples/pipeline/imagenet/pipeline_rpc_client.py new file mode 100644 index 0000000000000000000000000000000000000000..3220e6c20b27c92a59cd0c28050719a8790d648d --- /dev/null +++ b/python/examples/pipeline/imagenet/pipeline_rpc_client.py @@ -0,0 +1,36 @@ +# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+from paddle_serving_server_gpu.pipeline import PipelineClient +import numpy as np +import requests +import json +import cv2 +import base64 +import os + +client = PipelineClient() +client.connect(['127.0.0.1:9999']) + + +def cv2_to_base64(image): + return base64.b64encode(image).decode('utf8') + + +with open("daisy.jpg", 'rb') as file: + image_data = file.read() +image = cv2_to_base64(image_data) + +for i in range(1): + ret = client.predict(feed_dict={"image": image}, fetch=["label", "prob"]) + print(ret) diff --git a/python/examples/pipeline/imagenet/resnet50_web_service.py b/python/examples/pipeline/imagenet/resnet50_web_service.py new file mode 100644 index 0000000000000000000000000000000000000000..ece3befee8d62c9af2e0e0a1c576a63e42d86245 --- /dev/null +++ b/python/examples/pipeline/imagenet/resnet50_web_service.py @@ -0,0 +1,71 @@ +# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +import sys +from paddle_serving_app.reader import Sequential, URL2Image, Resize, CenterCrop, RGB2BGR, Transpose, Div, Normalize, Base64ToImage +try: + from paddle_serving_server_gpu.web_service import WebService, Op +except ImportError: + from paddle_serving_server.web_service import WebService, Op +import logging +import numpy as np +import base64, cv2 + + +class ImagenetOp(Op): + def init_op(self): + self.seq = Sequential([ + Resize(256), CenterCrop(224), RGB2BGR(), Transpose((2, 0, 1)), + Div(255), Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225], + True) + ]) + self.label_dict = {} + label_idx = 0 + with open("imagenet.label") as fin: + for line in fin: + self.label_dict[label_idx] = line.strip() + label_idx += 1 + + def preprocess(self, input_dicts, data_id, log_id): + (_, input_dict), = input_dicts.items() + data = base64.b64decode(input_dict["image"].encode('utf8')) + data = np.fromstring(data, np.uint8) + # Note: class variables(self.var) can only be used in process op mode + im = cv2.imdecode(data, cv2.IMREAD_COLOR) + img = self.seq(im) + return {"image": img[np.newaxis, :].copy()}, False, None, "" + + def postprocess(self, input_dicts, fetch_dict, log_id): + print(fetch_dict) + score_list = fetch_dict["score"] + result = {"label": [], "prob": []} + for score in score_list: + score = score.tolist() + max_score = max(score) + result["label"].append(self.label_dict[score.index(max_score)] + .strip().replace(",", "")) + result["prob"].append(max_score) + result["label"] = str(result["label"]) + result["prob"] = str(result["prob"]) + return result, None, "" + + +class ImageService(WebService): + def get_pipeline_response(self, read_op): + image_op = ImagenetOp(name="imagenet", input_ops=[read_op]) + return image_op + + +uci_service = ImageService(name="imagenet") +uci_service.prepare_pipeline_config("config.yml") +uci_service.run_service() diff --git a/python/examples/senta/senta_web_service.py b/python/examples/senta/senta_web_service.py index 477064f3988a1c8152f77ce7fe068eb0a2181198..6a1009412d6a4192bacce0ef7bce0685119713b1 100644 --- 
a/python/examples/senta/senta_web_service.py +++ b/python/examples/senta/senta_web_service.py @@ -37,6 +37,7 @@ class SentaService(WebService): #Preprocessing for the senta prediction service; call order: lac reader -> lac model prediction -> postprocess the lac result -> senta reader def preprocess(self, feed=[], fetch=[]): feed_batch = [] + is_batch = True words_lod = [0] for ins in feed: if "words" not in ins: @@ -64,14 +65,13 @@ class SentaService(WebService): return { "words": np.concatenate(feed_batch), "words.lod": words_lod - }, fetch + }, fetch, is_batch senta_service = SentaService(name="senta") senta_service.load_model_config("senta_bilstm_model") senta_service.prepare_server(workdir="workdir") senta_service.init_lac_client( - lac_port=9300, - lac_client_config="lac/lac_model/serving_server_conf.prototxt") + lac_port=9300, lac_client_config="lac_model/serving_server_conf.prototxt") senta_service.run_rpc_service() senta_service.run_web_service() diff --git a/python/examples/unet_for_image_seg/unet_benchmark/README.md b/python/examples/unet_for_image_seg/unet_benchmark/README.md new file mode 100644 index 0000000000000000000000000000000000000000..edb2af5864db746dc3368423dd7414575ed7b675 --- /dev/null +++ b/python/examples/unet_for_image_seg/unet_benchmark/README.md @@ -0,0 +1,8 @@ +# UNET_BENCHMARK Usage +## Features +* benchmark testing +## Notes +* Put the sample images (one or more) under the img_data directory; jpg and jpeg are supported +* The number of images should be greater than or equal to the concurrency +## TODO +* http benchmark diff --git a/python/examples/unet_for_image_seg/unet_benchmark/img_data/N0060.jpg b/python/examples/unet_for_image_seg/unet_benchmark/img_data/N0060.jpg new file mode 100644 index 0000000000000000000000000000000000000000..feac2837eaa5ae5db414d9769a0c5a830dde268d Binary files /dev/null and b/python/examples/unet_for_image_seg/unet_benchmark/img_data/N0060.jpg differ diff --git a/python/examples/unet_for_image_seg/unet_benchmark/launch_benckmark.sh b/python/examples/unet_for_image_seg/unet_benchmark/launch_benckmark.sh new file mode 100644 index 0000000000000000000000000000000000000000..59c2293e34b11dd2efd088c97a3c8de0dc62cf6f --- /dev/null +++ b/python/examples/unet_for_image_seg/unet_benchmark/launch_benckmark.sh @@ -0,0 +1,3 @@ +#!/bin/bash +python unet_benchmark.py --thread 1 --batch_size 1 --model ../unet_client/serving_client_conf.prototxt +# thread/batch can be modified as you wish diff --git a/python/examples/unet_for_image_seg/unet_benchmark/unet_benchmark.py b/python/examples/unet_for_image_seg/unet_benchmark/unet_benchmark.py new file mode 100644 index 0000000000000000000000000000000000000000..172643e364c5462aeed59ebe5e7b45bee7abf8ef --- /dev/null +++ b/python/examples/unet_for_image_seg/unet_benchmark/unet_benchmark.py @@ -0,0 +1,159 @@ +# -*- coding: utf-8 -*- +# +# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
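+# Usage sketch (mirrors launch_benckmark.sh in this directory; the --model path
+# is the client config that script assumes):
+#   python unet_benchmark.py --thread 1 --batch_size 1 --model ../unet_client/serving_client_conf.prototxt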
+""" + unet bench mark script + 20201130 first edition by cg82616424 +""" +from __future__ import unicode_literals, absolute_import +import os +import time +import json +import requests +from paddle_serving_client import Client +from paddle_serving_client.utils import MultiThreadRunner +from paddle_serving_client.utils import benchmark_args, show_latency +from paddle_serving_app.reader import Sequential, File2Image, Resize, Transpose, BGR2RGB, SegPostprocess +args = benchmark_args() + + +def get_img_names(path): + """ + Brief: + get img files(jpg) under this path + if any exception happened return None + Args: + path (string): image file path + Returns: + list: images names under this folder + """ + if not os.path.exists(path): + return None + if not os.path.isdir(path): + return None + list_name = [] + for f_handler in os.listdir(path): + file_path = os.path.join(path, f_handler) + if os.path.isdir(file_path): + continue + else: + if not file_path.endswith(".jpeg") and not file_path.endswith( + ".jpg"): + continue + list_name.append(file_path) + return list_name + + +def preprocess_img(img_list): + """ + Brief: + prepare img data for benchmark + Args: + img_list(list): list for img file path + Returns: + image content binary list after preprocess + """ + preprocess = Sequential([File2Image(), Resize((512, 512))]) + result_list = [] + for img in img_list: + img_tmp = preprocess(img) + result_list.append(img_tmp) + return result_list + + +def benckmark_worker(idx, resource): + """ + Brief: + benchmark single worker for unet + Args: + idx(int): worker idx ,use idx to select backend unet service + resource(dict): unet serving endpoint dict + Returns: + latency + TODO: + http benckmarks + """ + profile_flags = False + latency_flags = False + postprocess = SegPostprocess(2) + if os.getenv("FLAGS_profile_client"): + profile_flags = True + if os.getenv("FLAGS_serving_latency"): + latency_flags = True + latency_list = [] + client_handler = Client() + client_handler.load_client_config(args.model) + client_handler.connect( + [resource["endpoint"][idx % len(resource["endpoint"])]]) + start = time.time() + turns = resource["turns"] + img_list = resource["img_list"] + for i in range(turns): + if args.batch_size >= 1: + l_start = time.time() + feed_batch = [] + b_start = time.time() + for bi in range(args.batch_size): + feed_batch.append({"image": img_list[bi]}) + b_end = time.time() + if profile_flags: + sys.stderr.write( + "PROFILE\tpid:{}\tunt_pre_0:{} unet_pre_1:{}\n".format( + os.getpid(), + int(round(b_start * 1000000)), + int(round(b_end * 1000000)))) + result = client_handler.predict( + feed={"image": img_list[bi]}, fetch=["output"]) + #result["filename"] = "./img_data/N0060.jpg" % (os.getpid(), idx, time.time()) + #postprocess(result) # if you want to measure post process time, you have to uncomment this line + l_end = time.time() + if latency_flags: + latency_list.append(l_end * 1000 - l_start * 1000) + else: + print("unsupport batch size {}".format(args.batch_size)) + end = time.time() + if latency_flags: + return [[end - start], latency_list] + else: + return [[end - start]] + + +if __name__ == '__main__': + """ + usage: + """ + img_file_list = get_img_names("./img_data") + img_content_list = preprocess_img(img_file_list) + multi_thread_runner = MultiThreadRunner() + endpoint_list = ["127.0.0.1:9494"] + turns = 1 + start = time.time() + result = multi_thread_runner.run(benckmark_worker, args.thread, { + "endpoint": endpoint_list, + "turns": turns, + "img_list": img_content_list + }) + end = 
+    total_cost = end - start
+    avg_cost = 0
+    for i in range(args.thread):
+        avg_cost += result[0][i]
+    avg_cost = avg_cost / args.thread
+    print("total cost: {}s".format(total_cost))
+    print("each thread cost: {}s".format(avg_cost))
+    print("qps: {} samples/s".format(args.batch_size * args.thread * turns /
+                                      total_cost))
+    if os.getenv("FLAGS_serving_latency"):
+        show_latency(result[1])
diff --git a/python/paddle_serving_server/__init__.py b/python/paddle_serving_server/__init__.py
index ad64e9787857f7b05054007055113824abb1e471..30f4583a3b785dfe8824a5c14014c5e816fbc27e 100644
--- a/python/paddle_serving_server/__init__.py
+++ b/python/paddle_serving_server/__init__.py
@@ -23,13 +23,13 @@ import paddle_serving_server as paddle_serving_server
 from .version import serving_server_version
 from contextlib import closing
 import collections
-import fcntl
-
 import shutil
 import numpy as np
 import grpc
 from .proto import multi_lang_general_model_service_pb2
 import sys
+if not sys.platform.startswith('win'):
+    import fcntl
 sys.path.append(
     os.path.join(os.path.abspath(os.path.dirname(__file__)), 'proto'))
 from .proto import multi_lang_general_model_service_pb2_grpc
diff --git a/python/paddle_serving_server/web_service.py b/python/paddle_serving_server/web_service.py
index 0327af03647377e1fc10c4b42fc6aca67e366d8a..18e6664edbfd486bb0156ecc58232795f16d74bb 100644
--- a/python/paddle_serving_server/web_service.py
+++ b/python/paddle_serving_server/web_service.py
@@ -52,6 +52,20 @@ class WebService(object):
     def load_model_config(self, model_config):
         print("This API will be deprecated later. Please do not use it")
         self.model_config = model_config
+        import os
+        from .proto import general_model_config_pb2 as m_config
+        import google.protobuf.text_format
+        if os.path.isdir(model_config):
+            client_config = "{}/serving_server_conf.prototxt".format(
+                model_config)
+        elif os.path.isfile(model_config):
+            client_config = model_config
+        model_conf = m_config.GeneralModelConfig()
+        with open(client_config, 'r') as f:
+            model_conf = google.protobuf.text_format.Merge(
+                str(f.read()), model_conf)
+        self.feed_names = [var.alias_name for var in model_conf.feed_var]
+        self.fetch_names = [var.alias_name for var in model_conf.fetch_var]
 
     def _launch_rpc_service(self):
         op_maker = OpMaker()
@@ -179,10 +193,7 @@ class WebService(object):
 
     def run_web_service(self):
         print("This API will be deprecated later. Please do not use it")
-        self.app_instance.run(host="0.0.0.0",
-                              port=self.port,
-                              threaded=False,
-                              processes=1)
+        self.app_instance.run(host="0.0.0.0", port=self.port, threaded=True)
 
     def get_app_instance(self):
         return self.app_instance
diff --git a/python/paddle_serving_server_gpu/web_service.py b/python/paddle_serving_server_gpu/web_service.py
index 560ffa83d067de4bb296a3e6b479c0b00f595a8c..47bf38bf94d6b6444377e3e3967b196bb3edd6a7 100644
--- a/python/paddle_serving_server_gpu/web_service.py
+++ b/python/paddle_serving_server_gpu/web_service.py
@@ -58,6 +58,20 @@ class WebService(object):
     def load_model_config(self, model_config):
         print("This API will be deprecated later. Please do not use it")
         self.model_config = model_config
+        import os
+        from .proto import general_model_config_pb2 as m_config
+        import google.protobuf.text_format
+        if os.path.isdir(model_config):
+            client_config = "{}/serving_server_conf.prototxt".format(
+                model_config)
+        elif os.path.isfile(model_config):
+            client_config = model_config
+        model_conf = m_config.GeneralModelConfig()
+        with open(client_config, 'r') as f:
+            model_conf = google.protobuf.text_format.Merge(
+                str(f.read()), model_conf)
+        self.feed_names = [var.alias_name for var in model_conf.feed_var]
+        self.fetch_names = [var.alias_name for var in model_conf.fetch_var]
 
     def set_gpus(self, gpus):
         print("This API will be deprecated later. Please do not use it")
@@ -240,10 +254,7 @@ class WebService(object):
 
     def run_web_service(self):
         print("This API will be deprecated later. Please do not use it")
-        self.app_instance.run(host="0.0.0.0",
-                              port=self.port,
-                              threaded=False,
-                              processes=4)
+        self.app_instance.run(host="0.0.0.0", port=self.port, threaded=True)
 
     def get_app_instance(self):
         return self.app_instance
diff --git a/python/pipeline/operator.py b/python/pipeline/operator.py
index 2cf8a10c576461a93590ffdf4187790337432a1d..92e0c0c6e0bb2e415f48729d25c2153d2026b6b2 100644
--- a/python/pipeline/operator.py
+++ b/python/pipeline/operator.py
@@ -1343,7 +1343,7 @@ class ResponseOp(Op):
                                 type(var)))
                         _LOGGER.error("(logid={}) Failed to pack RPC "
                                       "response package: {}".format(
-                                          channeldata.id, resp.error_info))
+                                          channeldata.id, resp.err_msg))
                         break
                     resp.value.append(var)
                     resp.key.append(name)
diff --git a/python/pipeline/pipeline_client.py b/python/pipeline/pipeline_client.py
index 971f3da71373443de8973ca446be2bc11e9f5672..265f88c444e2484e7e50705b507bf00bbe0db0e1 100644
--- a/python/pipeline/pipeline_client.py
+++ b/python/pipeline/pipeline_client.py
@@ -23,7 +23,7 @@ import socket
 from .channel import ChannelDataErrcode
 from .proto import pipeline_service_pb2
 from .proto import pipeline_service_pb2_grpc
-
+import six
 _LOGGER = logging.getLogger(__name__)
 
@@ -53,7 +53,10 @@ class PipelineClient(object):
         if logid is None:
             req.logid = 0
         else:
-            req.logid = long(logid)
+            if six.PY2:
+                req.logid = long(logid)
+            elif six.PY3:
+                req.logid = int(logid)
             feed_dict.pop("logid")
 
         clientip = feed_dict.get("clientip")
diff --git a/python/setup.py.app.in b/python/setup.py.app.in
index 1a06b0d352c1da4cdd09f74cb900853d4016afa8..8480ed8471e60c7e7eb8f14bf11a1cc2d23204cf 100644
--- a/python/setup.py.app.in
+++ b/python/setup.py.app.in
@@ -32,8 +32,8 @@ if '${PACK}' == 'ON':
 
 REQUIRED_PACKAGES = [
-    'six >= 1.10.0', 'sentencepiece', 'opencv-python<=4.2.0.32', 'pillow',
-    'shapely<=1.6.1', 'pyclipper'
+    'six >= 1.10.0', 'sentencepiece<=0.1.92', 'opencv-python<=4.2.0.32', 'pillow',
+    'pyclipper'
 ]
 
 packages=['paddle_serving_app',
diff --git a/requirements.txt b/requirements.txt
new file mode 100644
index 0000000000000000000000000000000000000000..a1eb26e5cec23a8f76a50be48608f8a4532c6993
--- /dev/null
+++ b/requirements.txt
@@ -0,0 +1,5 @@
+sphinx==2.1.0
+mistune
+sphinx_rtd_theme
+paddlepaddle>=1.8.4
+shapely<=1.6.1
diff --git a/doc/requirements.txt b/requirements_win.txt
similarity index 58%
rename from doc/requirements.txt
rename to requirements_win.txt
index 1560ebc5f9d74fbae773ac5bc45c5b42b044287a..a202642af70fec0c57642cc53b8ead82a5a7c7f1 100644
--- a/doc/requirements.txt
+++ b/requirements_win.txt
@@ -1,4 +1,5 @@
 sphinx==2.1.0
 mistune
 sphinx_rtd_theme
-paddlepaddle>=1.6
+paddlepaddle>=1.8.4
+shapely
diff --git a/tools/Dockerfile.cuda10.1-cudnn7-trt6.devel
b/tools/Dockerfile.cuda10.1-cudnn7-trt6.devel new file mode 100644 index 0000000000000000000000000000000000000000..c6e1c1e050505e631493efe21732a98abd1bd52e --- /dev/null +++ b/tools/Dockerfile.cuda10.1-cudnn7-trt6.devel @@ -0,0 +1,60 @@ +FROM nvidia/cuda:10.1-cudnn7-devel-centos7 + +RUN export http_proxy="http://172.19.56.199:3128" \ + && export https_proxy="http://172.19.56.199:3128" \ + && yum -y install wget >/dev/null \ + && yum -y install gcc gcc-c++ make glibc-static which \ + && yum -y install git openssl-devel curl-devel bzip2-devel python-devel \ + && yum -y install libSM-1.2.2-2.el7.x86_64 --setopt=protected_multilib=false \ + && yum -y install libXrender-0.9.10-1.el7.x86_64 --setopt=protected_multilib=false \ + && yum -y install libXext-1.3.3-3.el7.x86_64 --setopt=protected_multilib=false + +RUN export http_proxy="http://172.19.56.199:3128" \ + && export https_proxy="http://172.19.56.199:3128" && \ + wget https://github.com/protocolbuffers/protobuf/releases/download/v3.11.2/protobuf-all-3.11.2.tar.gz && \ + tar zxf protobuf-all-3.11.2.tar.gz && \ + cd protobuf-3.11.2 && \ + ./configure && make -j4 && make install && \ + make clean && \ + cd .. && rm -rf protobuf-* + +RUN export http_proxy="http://172.19.56.199:3128" \ + && export https_proxy="http://172.19.56.199:3128" && \ + wget https://cmake.org/files/v3.2/cmake-3.2.0-Linux-x86_64.tar.gz >/dev/null \ + && tar xzf cmake-3.2.0-Linux-x86_64.tar.gz \ + && mv cmake-3.2.0-Linux-x86_64 /usr/local/cmake3.2.0 \ + && echo 'export PATH=/usr/local/cmake3.2.0/bin:$PATH' >> /root/.bashrc \ + && rm cmake-3.2.0-Linux-x86_64.tar.gz + + +RUN export http_proxy="http://172.19.56.199:3128" \ + && export https_proxy="http://172.19.56.199:3128" && \ + wget https://dl.google.com/go/go1.14.linux-amd64.tar.gz >/dev/null \ + && tar xzf go1.14.linux-amd64.tar.gz \ + && mv go /usr/local/go \ + && echo 'export GOROOT=/usr/local/go' >> /root/.bashrc \ + && echo 'export PATH=/usr/local/go/bin:$PATH' >> /root/.bashrc \ + && rm go1.14.linux-amd64.tar.gz + +RUN export http_proxy="http://172.19.56.199:3128" \ + && export https_proxy="http://172.19.56.199:3128" && \ + yum -y install python-devel sqlite-devel \ + && curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py >/dev/null \ + && python get-pip.py >/dev/null \ + && rm get-pip.py + +RUN export http_proxy="http://172.19.56.199:3128" \ + && export https_proxy="http://172.19.56.199:3128" && \ + yum install -y python3 python3-devel \ + && yum -y install epel-release && yum -y install patchelf libXext libSM libXrender\ + && yum clean all + +RUN localedef -c -i en_US -f UTF-8 en_US.UTF-8 \ + && echo "export LANG=en_US.utf8" >> /root/.bashrc \ + && echo "export LANGUAGE=en_US.utf8" >> /root/.bashrc + +RUN wget https://paddle-serving.bj.bcebos.com/tools/TensorRT-6.0.1.5.CentOS-7.6.x86_64-gnu.cuda-10.1.cudnn7.6.tar.gz \ + && tar -xzf TensorRT-6.0.1.5.CentOS-7.6.x86_64-gnu.cuda-10.1.cudnn7.6.tar.gz \ + && mv TensorRT-6.0.1.5 /usr/local/ \ + && rm TensorRT-6.0.1.5.CentOS-7.6.x86_64-gnu.cuda-10.1.cudnn7.6.tar.gz \ + && echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/TensorRT-6.0.1.5/lib/' >> /root/.bashrc
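
For reference, below is a minimal usage sketch for the unet benchmark script added in this patch. It is only a sketch under assumptions: the script itself hard-codes the endpoint `127.0.0.1:9494` and reads `*.jpg`/`*.jpeg` files from `./img_data`, but the file name `benchmark.py`, the client-config path passed to `--model`, and the way the server is brought up are hypothetical; `--model`, `--thread` and `--batch_size` are assumed to be the flags exposed by `benchmark_args()` in `paddle_serving_client.utils`.

```shell
# Assumption: a unet segmentation model is already being served on port 9494
# (how the server is started is outside the scope of this patch).
# The benchmark script (saved here as benchmark.py, hypothetical name)
# hard-codes the endpoint 127.0.0.1:9494 and reads *.jpg / *.jpeg from ./img_data.
mkdir -p img_data && cp /path/to/test_images/*.jpg img_data/

# FLAGS_serving_latency=1 enables the per-request latency report (show_latency);
# FLAGS_profile_client=1 would additionally print preprocess timestamps to stderr.
FLAGS_serving_latency=1 python benchmark.py \
    --model serving_client_conf/serving_client_conf.prototxt \
    --thread 4 \
    --batch_size 1
```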