Commit f1364e0e, authored by MRXLT

refine senta demo

Parent 1f9f0839
README.md:

````diff
-# Chinese sentence sentiment classification
+# Chinese Sentence Sentiment Classification
 ([简体中文](./README_CN.md)|English)
 
-## Get model files and sample data
-```
-sh get_data.sh
-```
-
-## Install preprocess module
+## Get Model
 ```
-pip install paddle_serving_app
+python -m paddle_serving_app.package --get_model senta_bilstm
+python -m paddle_serving_app.package --get_model lac
 ```
 
-## Start http service
+## Start HTTP Service
 ```
-python senta_web_service.py senta_bilstm_model/ workdir 9292
+python -m paddle_serving_server.serve --model lac_model --port 9300
+python senta_web_service.py
 ```
-In the Chinese sentiment classification task, the Chinese word segmentation needs to be done through the [LAC task](../lac). Set model path by `lac_model_path` and dictionary path by `lac_dict_path`.
-In this demo, the LAC task is placed in the preprocessing part of the HTTP prediction service of the sentiment classification task. The LAC prediction service is deployed on the CPU, and the sentiment classification task is deployed on the GPU, which can be changed according to the actual situation.
+In the Chinese sentiment classification task, the Chinese word segmentation needs to be done through the [LAC task](../lac).
+In this demo, the LAC task is placed in the preprocessing part of the HTTP prediction service of the sentiment classification task.
 
 ## Client prediction
 ```
 curl -H "Content-Type:application/json" -X POST -d '{"feed":[{"words": "天气不错"}], "fetch":["class_probs"]}' http://127.0.0.1:9292/senta/prediction
 ```
...
````
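For reference, a minimal Python equivalent of the curl request above might look like the sketch below. It is not part of the commit; it assumes only the third-party `requests` package and the `127.0.0.1:9292` address and `/senta/prediction` endpoint shown in the curl example.

```python
# Sketch of an HTTP client for the senta web service (not from the commit).
# Assumes the service started by senta_web_service.py listens on 127.0.0.1:9292.
import requests

payload = {
    "feed": [{"words": "天气不错"}],  # sentence to classify
    "fetch": ["class_probs"],         # request class probabilities
}
resp = requests.post(
    "http://127.0.0.1:9292/senta/prediction", json=payload, timeout=10)
print(resp.json())  # exact response layout depends on the serving version
```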
README_CN.md (the Chinese version of the README) receives the same changes: the get-data and install-preprocess steps are replaced by the two `--get_model` commands, the start command changes to launching the LAC server plus `senta_web_service.py`, and the sentences about `lac_model_path`/`lac_dict_path` and the CPU/GPU deployment split are dropped:

````diff
 # 中文语句情感分类
 (简体中文|[English](./README.md))
 
-## 获取模型文件和样例数据
-```
-sh get_data.sh
-```
-
-## 安装数据预处理模块
+## 获取模型文件
 ```
-pip install paddle_serving_app
+python -m paddle_serving_app.package --get_model senta_bilstm
+python -m paddle_serving_app.package --get_model lac
 ```
 
 ## 启动HTTP服务
 ```
-python senta_web_service.py senta_bilstm_model/ workdir 9292
+python -m paddle_serving_server.serve --model lac_model --port 9300
+python senta_web_service.py
 ```
-中文情感分类任务中需要先通过[LAC任务](../lac)进行中文分词,在脚本中通过`lac_model_path`参数配置LAC任务的模型文件路径,`lac_dict_path`参数配置LAC任务词典路径
-示例中将LAC任务放在情感分类任务的HTTP预测服务的预处理部分,LAC预测服务部署在CPU上,情感分类任务部署在GPU上,可以根据实际情况进行更改
+中文情感分类任务中需要先通过[LAC任务](../lac)进行中文分词。
+示例中将LAC任务放在情感分类任务的HTTP预测服务的预处理部分。
 
 ## 客户端预测
 ```
...
````
senta_web_service.py:

````diff
+#encoding=utf-8
 # Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -12,56 +13,28 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from paddle_serving_server_gpu.web_service import WebService
+from paddle_serving_server.web_service import WebService
 from paddle_serving_client import Client
 from paddle_serving_app.reader import LACReader, SentaReader
 import os
 import sys
-from multiprocessing import Process
 
+#senta_web_service.py
+from paddle_serving_server.web_service import WebService
+from paddle_serving_client import Client
+from paddle_serving_app.reader import LACReader, SentaReader
+
 
-class SentaService(WebService):
-    def set_config(
-            self,
-            lac_model_path,
-            lac_dict_path,
-            senta_dict_path, ):
-        self.lac_model_path = lac_model_path
-        self.lac_client_config_path = lac_model_path + "/serving_server_conf.prototxt"
-        self.lac_dict_path = lac_dict_path
-        self.senta_dict_path = senta_dict_path
-
-    def start_lac_service(self):
-        if not os.path.exists('./lac_serving'):
-            os.mkdir("./lac_serving")
-        os.chdir('./lac_serving')
-        self.lac_port = self.port + 100
-        r = os.popen(
-            "python -m paddle_serving_server.serve --model {} --port {} &".
-            format("../" + self.lac_model_path, self.lac_port))
-        os.chdir('..')
-
-    def init_lac_service(self):
-        ps = Process(target=self.start_lac_service())
-        ps.start()
-        self.init_lac_client()
-
-    def lac_predict(self, feed_data):
-        lac_result = self.lac_client.predict(
-            feed={"words": feed_data}, fetch=["crf_decode"])
-        return lac_result
-
-    def init_lac_client(self):
-        self.lac_client = Client()
-        self.lac_client.load_client_config(self.lac_client_config_path)
-        self.lac_client.connect(["127.0.0.1:{}".format(self.lac_port)])
-
-    def init_lac_reader(self):
+class SentaService(WebService):
+    #初始化lac模型预测服务
+    def init_lac_client(self, lac_port, lac_client_config):
         self.lac_reader = LACReader()
-
-    def init_senta_reader(self):
         self.senta_reader = SentaReader()
+        self.lac_client = Client()
+        self.lac_client.load_client_config(lac_client_config)
+        self.lac_client.connect(["127.0.0.1:{}".format(lac_port)])
 
+    #定义senta模型预测服务的预处理,调用顺序:lac reader->lac模型预测->预测结果后处理->senta reader
     def preprocess(self, feed=[], fetch=[]):
         feed_data = [{
             "words": self.lac_reader.process(x["words"])
         } for x in feed]
@@ -80,15 +53,9 @@
 
 
 senta_service = SentaService(name="senta")
-senta_service.set_config(
-    lac_model_path="./lac_model",
-    lac_dict_path="./lac_dict",
-    senta_dict_path="./vocab.txt")
-senta_service.load_model_config(sys.argv[1])
-senta_service.prepare_server(
-    workdir=sys.argv[2], port=int(sys.argv[3]), device="cpu")
-senta_service.init_lac_reader()
-senta_service.init_senta_reader()
-senta_service.init_lac_service()
+senta_service.load_model_config("senta_bilstm_model")
+senta_service.prepare_server(workdir="workdir")
+senta_service.init_lac_client(
+    lac_port=9300, lac_client_config="lac_model/serving_server_conf.prototxt")
 senta_service.run_rpc_service()
 senta_service.run_web_service()
````
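The body of `preprocess` is collapsed between the two hunks; the Chinese comment above it states the call order as LAC reader -> LAC prediction -> post-processing of the LAC result -> Senta reader. The sketch below restates that flow as a standalone function. It is illustrative rather than the committed code: the `parse_result` helper on `LACReader` and the `"crf_decode.lod"` result key are assumptions about the `paddle_serving_app`/`paddle_serving_client` API of that era.

```python
# Illustrative restatement of the preprocess pipeline described by the comment:
# LAC reader -> LAC prediction -> post-process the result -> Senta reader.
def senta_preprocess(lac_reader, senta_reader, lac_client, feed, fetch):
    # 1. LAC reader: turn each raw sentence into LAC model input.
    feed_data = [{"words": lac_reader.process(x["words"])} for x in feed]
    # 2. LAC prediction: call the standalone LAC server for word segmentation.
    lac_result = lac_client.predict(feed=feed_data, fetch=["crf_decode"])
    # 3. Post-process: recover the segmented words for each sentence
    #    ("crf_decode.lod" as the LoD offsets is an assumption).
    feed_batch = []
    lod = lac_result["crf_decode.lod"]
    for i, x in enumerate(feed):
        segs = lac_reader.parse_result(  # assumed LACReader helper
            x["words"], lac_result["crf_decode"][lod[i]:lod[i + 1]])
        # 4. Senta reader: convert the segmented words into Senta model input.
        feed_batch.append({"words": senta_reader.process(segs)})
    return feed_batch, fetch
```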