Commit 8f820382 authored by M MRXLT

add readme

Parent bc8d6abb
......@@ -8,7 +8,8 @@ sh get_data.sh
```
python senta_web_service.py senta_bilstm_model/ workdir 9292
```
In the Chinese sentiment classification task, Chinese word segmentation must first be performed through the [LAC task](../lac). Set the model path with ```lac_model_path``` and the dictionary path with ```lac_dict_path```. Please refer to the [LAC task](../lac) documentation to obtain these files.
In this demo, the LAC task is placed in the preprocessing step of the sentiment classification HTTP prediction service. The LAC prediction service is deployed on CPU and the sentiment classification task on GPU; both can be changed to suit the actual environment.
## Client prediction
```
curl -H "Content-Type:application/json" -X POST -d '{"words": "天气不错 | 0", "fetch":["sentence_feature"]}' http://127.0.0.1:9292/senta/prediction
......
# Chinese Sentence Sentiment Classification
## Get the model files and sample data
```
sh get_data.sh
```
## Start the HTTP service
```
python senta_web_service.py senta_bilstm_model/ workdir 9292
```
In the Chinese sentiment classification task, Chinese word segmentation must first be performed through the [LAC task](../lac). In the script, the ```lac_model_path``` parameter sets the path to the LAC model files and the ```lac_dict_path``` parameter sets the path to the LAC dictionary. Please refer to the [LAC task](../lac) documentation to obtain these files.
In this demo, the LAC task is placed in the preprocessing step of the sentiment classification HTTP prediction service; the LAC prediction service is deployed on CPU and the sentiment classification task on GPU, both of which can be changed to suit the actual environment.
## Client prediction
```
curl -H "Content-Type:application/json" -X POST -d '{"words": "天气不错 | 0", "fetch":["sentence_feature"]}' http://127.0.0.1:9292/senta/prediction
```
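The curl call above can equally be issued from Python. A minimal sketch, assuming the service from the previous step is running on port 9292; the actual HTTP send (using the third-party `requests` package) is left commented out:

```python
import json

# Build the JSON body that /senta/prediction expects: the raw sentence
# under "words" and the output variables to fetch under "fetch".
payload = json.dumps(
    {"words": "天气不错 | 0", "fetch": ["sentence_feature"]},
    ensure_ascii=False)

print(payload)

# To actually send it (requires a running service and `pip install requests`):
#   import requests
#   r = requests.post("http://127.0.0.1:9292/senta/prediction",
#                     data=payload.encode("utf-8"),
#                     headers={"Content-Type": "application/json"})
#   print(r.json())
```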
wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/SentimentAnalysis/senta_bilstm.tar.gz --no-check-certificate
tar -xzvf senta_bilstm.tar.gz
wget https://baidu-nlp.bj.bcebos.com/sentiment_classification-dataset-1.0.0.tar.gz
wget https://baidu-nlp.bj.bcebos.com/sentiment_classification-dataset-1.0.0.tar.gz --no-check-certificate
tar -zxvf sentiment_classification-dataset-1.0.0.tar.gz
......@@ -24,13 +24,23 @@ from multiprocessing import Process, Queue
class SentaService(WebService):
def __init__(
self,
lac_model_path,
lac_dict_path,
senta_dict_path, ):
self.lac_model_path = lac_model_path
self.lac_client_config_path = lac_model_path + "/serving_server_conf.prototxt"
self.lac_dict_path = lac_dict_path
self.senta_dict_path = senta_dict_path
def start_lac_service(self):
print(" ---- start lac service ---- ")
os.chdir('./lac_serving')
self.lac_port = self.port + 100
r = os.popen(
"python -m paddle_serving_server.serve --model ../../lac/jieba_server_model/ --port {} &".
format(self.lac_port))
"python -m paddle_serving_server_gpu.serve --model {} --port {} &".
format(self.lac_model_path, self.lac_port))
os.chdir('..')
def init_lac_service(self):
......@@ -47,15 +57,14 @@ class SentaService(WebService):
def init_lac_client(self):
self.lac_client = Client()
self.lac_client.load_client_config(
"../lac/jieba_client_conf/serving_client_conf.prototxt")
self.lac_client.load_client_config(self.lac_client_config_path)
self.lac_client.connect(["127.0.0.1:{}".format(self.lac_port)])
def init_lac_reader(self):
self.lac_reader = LACReader("../lac/lac_dict")
self.lac_reader = LACReader(self.lac_dict_path)
def init_senta_reader(self):
self.senta_reader = SentaReader(vocab_path="./senta_data/word_dict.txt")
self.senta_reader = SentaReader(vocab_path=self.senta_dict_path)
def preprocess(self, feed={}, fetch={}):
print("---- preprocess ----")
......@@ -79,7 +88,12 @@ class SentaService(WebService):
return {"words": feed_data}, fetch
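The preprocessing flow wired up above (LAC segmentation feeding the Senta reader) can be sketched with stand-in objects. The stub classes and their methods below are illustrative assumptions, not the real `LACReader`/`SentaReader`/`Client` interfaces:

```python
# Stubs standing in for the LAC prediction client and the Senta vocabulary
# reader; the real classes come from the Paddle Serving LAC and Senta demos.
class StubLACClient:
    def predict(self, feed, fetch):
        # Pretend the LAC service segmented the sentence into two words.
        return {"word_seg": ["天气", "不错"]}

class StubSentaReader:
    vocab = {"天气": 1, "不错": 2}

    def process(self, words):
        # Map each segmented word to its vocabulary id (0 = OOV).
        return [self.vocab.get(w, 0) for w in words]

def preprocess(lac_client, senta_reader, feed):
    # Step 1: ask the LAC service for a word segmentation.
    seg = lac_client.predict(feed={"words": feed["words"]},
                             fetch=["word_seg"])["word_seg"]
    # Step 2: convert the segmented words into Senta input ids.
    return {"words": senta_reader.process(seg)}

out = preprocess(StubLACClient(), StubSentaReader(), {"words": "天气不错"})
print(out)  # {'words': [1, 2]}
```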
senta_service = SentaService(name="senta")
senta_service = SentaService(
    name="senta",
    lac_model_path="../../lac/jieba_server_model/",
    lac_dict_path="../lac/lac_dict",
    senta_dict_path="./senta_data/word_dict.txt")
senta_service.load_model_config(sys.argv[1])
senta_service.prepare_server(
workdir=sys.argv[2], port=int(sys.argv[3]), device="cpu")
......