Commit 8335de4a authored by: barrierye

Merge branch 'develop' of https://github.com/PaddlePaddle/Serving into develop

<p align="center">
<br>
<img src='https://paddle-serving.bj.bcebos.com/imdb-demo%2FLogoMakr-3Bd2NM-300dpi.png' width = "600" height = "130">
<img src='doc/serving_logo.png' width = "600" height = "130">
<br>
<p>
......@@ -166,7 +166,7 @@ python image_classification_service_demo.py resnet50_serving_model
<p>
``` shell
curl -H "Content-Type:application/json" -X POST -d '{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg", "fetch": ["score"]}' http://127.0.0.1:9292/image/prediction
curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg"}], "fetch": ["score"]}' http://127.0.0.1:9292/image/prediction
```
- **Request result**:
``` shell
......
......@@ -171,7 +171,7 @@ python image_classification_service_demo.py resnet50_serving_model
<p>
``` shell
curl -H "Content-Type:application/json" -X POST -d '{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg", "fetch": ["score"]}' http://127.0.0.1:9292/image/prediction
curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg"}], "fetch": ["score"]}' http://127.0.0.1:9292/image/prediction
```
- **Request result example**:
``` shell
......
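Both hunks above switch the request body to the new `feed` list format. For reference, here is a minimal Python sketch of the same call, assuming the image classification service from these READMEs is running on 127.0.0.1:9292; the `requests` library is used purely for illustration and is not part of the original demo:

```python
import requests

# Same request as the curl examples above: the image URL is wrapped in a
# "feed" list and "score" is requested as the fetch variable.
payload = {
    "feed": [{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg"}],
    "fetch": ["score"],
}
resp = requests.post("http://127.0.0.1:9292/image/prediction", json=payload)
print(resp.json())
```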
# Chinese sentence sentiment classification
([简体中文](./README_CN.md)|English)
## Get model files and sample data
```
sh get_data.sh
......@@ -12,5 +12,5 @@ In the Chinese sentiment classification task, the Chinese word segmentation need
In this demo, the LAC task is placed in the preprocessing stage of the HTTP prediction service for sentiment classification: the LAC prediction service is deployed on CPU and the sentiment classification service on GPU, a split that can be adjusted to the actual deployment environment.
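For orientation, below is a condensed sketch of how this chain looks inside the web service's `preprocess` hook. The reader and client names follow the `senta_web_service.py` changes later in this commit, but the `lac_client.predict` call and the exact wiring are assumptions made for illustration, not a drop-in implementation:

```python
# Illustrative sketch of the LAC -> Senta preprocessing chain inside
# SentaService.preprocess (assumes lac_reader, senta_reader and lac_client
# have been initialized as in senta_web_service.py).
def preprocess(self, feed=[], fetch=[]):
    # 1. Segment the raw Chinese sentence with the CPU-side LAC service.
    lac_feed = self.lac_reader.process(feed[0]["words"])
    lac_result = self.lac_client.predict(feed={"words": lac_feed},
                                         fetch=["crf_decode"])
    # 2. Turn the CRF decoding into word segments.
    segs = self.lac_reader.parse_result(feed[0]["words"],
                                        lac_result["crf_decode"])
    # 3. Convert the segments into ids for the GPU-side Senta model.
    feed_data = self.senta_reader.process(segs)
    return {"words": feed_data}, fetch
```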
## Client prediction
```
curl -H "Content-Type:application/json" -X POST -d '{"words": "天气不错 | 0", "fetch":["sentence_feature"]}' http://127.0.0.1:9292/senta/prediction
curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"words": "天气不错"}], "fetch":["class_probs"]}' http://127.0.0.1:9292/senta/prediction
```
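A minimal Python equivalent of the curl request above, assuming the Senta service is listening locally on port 9292; the `requests` dependency is an assumption for illustration only:

```python
import requests

# One Chinese sentence in the new "feed" list format, fetching class_probs.
payload = {"feed": [{"words": "天气不错"}], "fetch": ["class_probs"]}
resp = requests.post("http://127.0.0.1:9292/senta/prediction", json=payload)
print(resp.json())
```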
# Chinese sentence sentiment classification
(简体中文|[English](./README.md))
## Get model files and sample data
```
sh get_data.sh
......@@ -13,5 +13,5 @@ python senta_web_service.py senta_bilstm_model/ workdir 9292
## Client prediction
```
curl -H "Content-Type:application/json" -X POST -d '{"words": "天气不错 | 0", "fetch":["sentence_feature"]}' http://127.0.0.1:9292/senta/prediction
curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"words": "天气不错"}], "fetch":["class_probs"]}' http://127.0.0.1:9292/senta/prediction
```
#wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/SentimentAnalysis/senta_bilstm.tar.gz --no-check-certificate
#tar -xzvf senta_bilstm.tar.gz
wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/SentimentAnalysis/senta_bilstm.tar.gz --no-check-certificate
tar -xzvf senta_bilstm.tar.gz
wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/LexicalAnalysis/lac_model.tar.gz --no-check-certificate
tar -xzvf lac_model.tar.gz
wget https://paddle-serving.bj.bcebos.com/reader/lac/lac_dict.tar.gz --no-check-certificate
......
......@@ -69,11 +69,8 @@ class SentaService(WebService):
    def init_senta_reader(self):
        self.senta_reader = SentaReader(vocab_path=self.senta_dict_path)
    def preprocess(self, feed={}, fetch={}):
        if "words" not in feed:
            raise ("feed data error!")
        feed_data = self.lac_reader.process(feed["words"])
        fetch = ["crf_decode"]
    def preprocess(self, feed=[], fetch=[]):
        feed_data = self.lac_reader.process(feed[0]["words"])
        if self.show:
            print("---- lac reader ----")
            print(feed_data)
......@@ -81,7 +78,7 @@ class SentaService(WebService):
        if self.show:
            print("---- lac out ----")
            print(lac_result)
        segs = self.lac_reader.parse_result(feed["words"],
        segs = self.lac_reader.parse_result(feed[0]["words"],
                                            lac_result["crf_decode"])
        if self.show:
            print("---- lac parse ----")
......@@ -90,7 +87,6 @@ class SentaService(WebService):
        if self.show:
            print("---- senta reader ----")
            print("feed_data", feed_data)
        fetch = ["class_probs"]
        return {"words": feed_data}, fetch
......@@ -107,31 +103,4 @@ senta_service.init_lac_reader()
senta_service.init_senta_reader()
senta_service.init_lac_service()
senta_service.run_server()
#senta_service.run_flask()
from flask import Flask, request
app_instance = Flask(__name__)
@app_instance.before_first_request
def init():
    global uci_service
    senta_service._launch_web_service()
service_name = "/" + senta_service.name + "/prediction"
@app_instance.route(service_name, methods=["POST"])
def run():
print("---- run ----")
print(request.json)
return senta_service.get_prediction(request)
if __name__ == "__main__":
    app_instance.run(host="0.0.0.0",
                     port=senta_service.port,
                     threaded=False,
                     processes=4)
senta_service.run_flask()