Unverified · Commit 62a386be authored by MRXLT, committed by GitHub

Merge pull request #507 from MRXLT/senta-fix

fix senta demo
@@ -166,7 +166,7 @@ python image_classification_service_demo.py resnet50_serving_model
<p>
``` shell
-curl -H "Content-Type:application/json" -X POST -d '{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg", "fetch": ["score"]}' http://127.0.0.1:9292/image/prediction
+curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg"}], "fetch": ["score"]}' http://127.0.0.1:9292/image/prediction
```
- **Request result**:
``` shell
...
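The updated curl command above wraps each input sample in a `feed` list instead of placing `url` at the top level. As a sketch, the same request body can be built in Python (only the payload construction is shown; posting it to the running demo service is left as a comment):

```python
import json

# Request body in the new format used by this change: each input sample
# is a dict inside a "feed" list, rather than a top-level "url" key.
payload = {
    "feed": [{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg"}],
    "fetch": ["score"],
}
body = json.dumps(payload)

# Equivalent of the curl command, assuming the demo service is running:
#   requests.post("http://127.0.0.1:9292/image/prediction", data=body,
#                 headers={"Content-Type": "application/json"})
print(body)
```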
@@ -171,7 +171,7 @@ python image_classification_service_demo.py resnet50_serving_model
<p>
``` shell
-curl -H "Content-Type:application/json" -X POST -d '{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg", "fetch": ["score"]}' http://127.0.0.1:9292/image/prediction
+curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"url": "https://paddle-serving.bj.bcebos.com/imagenet-example/daisy.jpg"}], "fetch": ["score"]}' http://127.0.0.1:9292/image/prediction
```
- **Sample response**:
``` shell
...
# Chinese sentence sentiment classification
([简体中文](./README_CN.md)|English)
## Get model files and sample data
```
sh get_data.sh
@@ -12,5 +12,5 @@ In the Chinese sentiment classification task, the Chinese word segmentation need
In this demo, the LAC task is placed in the preprocessing part of the HTTP prediction service of the sentiment classification task. The LAC prediction service is deployed on the CPU, and the sentiment classification task is deployed on the GPU, which can be changed according to the actual situation.
## Client prediction
```
-curl -H "Content-Type:application/json" -X POST -d '{"words": "天气不错 | 0", "fetch":["sentence_feature"]}' http://127.0.0.1:9292/senta/prediction
+curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"words": "天气不错"}], "fetch":["class_probs"]}' http://127.0.0.1:9292/senta/prediction
```
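The client-side change here mirrors the image demo: samples move into a `feed` list, the `| 0` label suffix is no longer sent, and `class_probs` is fetched instead of `sentence_feature`. A minimal Python sketch of the new payload (posting it to the service is shown only as a comment, assuming the demo is running locally):

```python
import json

# New-format request body for the senta service: one dict per sample in
# the "feed" list; the fetch target is now "class_probs".
payload = {"feed": [{"words": "天气不错"}], "fetch": ["class_probs"]}

# Equivalent of the curl command above, assuming the service is up:
#   requests.post("http://127.0.0.1:9292/senta/prediction", json=payload)
print(json.dumps(payload, ensure_ascii=False))
```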
# Chinese sentence sentiment classification
(简体中文|[English](./README.md))
## Get model files and sample data
```
sh get_data.sh
@@ -13,5 +13,5 @@ python senta_web_service.py senta_bilstm_model/ workdir 9292
## Client prediction
```
-curl -H "Content-Type:application/json" -X POST -d '{"words": "天气不错 | 0", "fetch":["sentence_feature"]}' http://127.0.0.1:9292/senta/prediction
+curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"words": "天气不错"}], "fetch":["class_probs"]}' http://127.0.0.1:9292/senta/prediction
```
-#wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/SentimentAnalysis/senta_bilstm.tar.gz --no-check-certificate
-#tar -xzvf senta_bilstm.tar.gz
+wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/SentimentAnalysis/senta_bilstm.tar.gz --no-check-certificate
+tar -xzvf senta_bilstm.tar.gz
wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/LexicalAnalysis/lac_model.tar.gz --no-check-certificate
tar -xzvf lac_model.tar.gz
wget https://paddle-serving.bj.bcebos.com/reader/lac/lac_dict.tar.gz --no-check-certificate
...
@@ -69,11 +69,8 @@ class SentaService(WebService):
     def init_senta_reader(self):
         self.senta_reader = SentaReader(vocab_path=self.senta_dict_path)
 
-    def preprocess(self, feed={}, fetch={}):
-        if "words" not in feed:
-            raise ("feed data error!")
-        feed_data = self.lac_reader.process(feed["words"])
-        fetch = ["crf_decode"]
+    def preprocess(self, feed=[], fetch=[]):
+        feed_data = self.lac_reader.process(feed[0]["words"])
         if self.show:
             print("---- lac reader ----")
             print(feed_data)
@@ -81,7 +78,7 @@ class SentaService(WebService):
         if self.show:
             print("---- lac out ----")
             print(lac_result)
-        segs = self.lac_reader.parse_result(feed["words"],
+        segs = self.lac_reader.parse_result(feed[0]["words"],
                                             lac_result["crf_decode"])
         if self.show:
             print("---- lac parse ----")
@@ -90,7 +87,6 @@ class SentaService(WebService):
         if self.show:
             print("---- senta reader ----")
             print("feed_data", feed_data)
-        fetch = ["class_probs"]
         return {"words": feed_data}, fetch
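The hunks above change `preprocess` to take a list of per-sample feed dicts and to pass the caller's `fetch` list through instead of overwriting it. A minimal standalone sketch of the new flow, with `LacReaderStub` and the free function `preprocess` as hypothetical stand-ins for the demo's `self.lac_reader` and method:

```python
# LacReaderStub is a hypothetical stand-in for the demo's LAC reader,
# which converts a sentence into model input ids.
class LacReaderStub:
    def process(self, words):
        # the real reader emits vocabulary ids; here we return one
        # placeholder id per character just to show the data flow
        return [0 for _ in words]

def preprocess(lac_reader, feed=[], fetch=[]):
    # feed is now a list of per-sample dicts, so the sentence is read
    # from feed[0]["words"] rather than feed["words"], and fetch is
    # passed through unchanged
    feed_data = lac_reader.process(feed[0]["words"])
    return {"words": feed_data}, fetch

inputs, fetch = preprocess(LacReaderStub(),
                           feed=[{"words": "天气不错"}],
                           fetch=["class_probs"])
```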
@@ -107,31 +103,4 @@ senta_service.init_lac_reader()
 senta_service.init_senta_reader()
 senta_service.init_lac_service()
 senta_service.run_server()
-#senta_service.run_flask()
+senta_service.run_flask()
-
-from flask import Flask, request
-
-app_instance = Flask(__name__)
-
-@app_instance.before_first_request
-def init():
-    global uci_service
-    senta_service._launch_web_service()
-
-service_name = "/" + senta_service.name + "/prediction"
-
-@app_instance.route(service_name, methods=["POST"])
-def run():
-    print("---- run ----")
-    print(request.json)
-    return senta_service.get_prediction(request)
-
-if __name__ == "__main__":
-    app_instance.run(host="0.0.0.0",
-                     port=senta_service.port,
-                     threaded=False,
-                     processes=4)