Commit e4705d52 authored by MRXLT

refine lac demo

Parent 4533bce1
@@ -2,28 +2,27 @@
([简体中文](./README_CN.md)|English)
-### Get model files and sample data
+### Get Model
```
-sh get_data.sh
+python -m paddle_serving_app.package --get_model lac
+tar -xzvf lac.tar.gz
```
+The downloaded package contains the LAC model config along with the LAC dictionary.
#### Start RPC inference service
```
-python -m paddle_serving_server.serve --model jieba_server_model/ --port 9292
+python -m paddle_serving_server.serve --model lac_model/ --port 9292
```
### RPC Infer
```
-echo "我爱北京天安门" | python lac_client.py jieba_client_conf/serving_client_conf.prototxt lac_dict/
+echo "我爱北京天安门" | python lac_client.py lac_client/serving_client_conf.prototxt
```
-it will get the segmentation result
+It will get the segmentation result.
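The shape of the printed result can be sketched as follows. The join logic mirrors the `print` added to `lac_client.py` in this commit; the segment list itself is illustrative (the real one comes from `LACReader.parse_result` on the model's `crf_decode` output):

```python
# Sketch of the client's output formatting, mirroring lac_client.py.
# The segmentation below is the expected split of the sample sentence,
# shown here as a hard-coded list for illustration only.
segs = ["我", "爱", "北京", "天安门"]
result = {"word_seg": "|".join(segs)}
print(result)
```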
### Start HTTP inference service
```
-python lac_web_service.py jieba_server_model/ lac_workdir 9292
+python lac_web_service.py lac_model/ lac_workdir 9292
```
### HTTP Infer
......
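The HTTP example is elided here, but the CI script later in this commit sends the request with curl. A minimal Python equivalent of that request body (assuming the web service above is running on port 9292) might look like:

```python
import json

# Build the same JSON body the demo's curl command posts to
# http://127.0.0.1:9292/lac/prediction (the service must be running
# for the actual request; only the payload is constructed here).
payload = {"feed": [{"words": "我爱北京天安门"}], "fetch": ["word_seg"]}
body = json.dumps(payload, ensure_ascii=False)

# import requests  # uncomment to actually send the request
# resp = requests.post("http://127.0.0.1:9292/lac/prediction",
#                      headers={"Content-Type": "application/json"},
#                      data=body.encode("utf-8"))
print(body)
```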
@@ -2,28 +2,27 @@
(简体中文|[English](./README.md))
-### 获取模型和字典文件
+### 获取模型
```
-sh get_data.sh
+python -m paddle_serving_app.package --get_model lac
+tar -xzvf lac.tar.gz
```
+下载包里包含了lac模型和lac模型预测需要的字典文件
#### 开启RPC预测服务
```
-python -m paddle_serving_server.serve --model jieba_server_model/ --port 9292
+python -m paddle_serving_server.serve --model lac_model/ --port 9292
```
### 执行RPC预测
```
-echo "我爱北京天安门" | python lac_client.py jieba_client_conf/serving_client_conf.prototxt lac_dict/
+echo "我爱北京天安门" | python lac_client.py lac_client/serving_client_conf.prototxt
```
我们就能得到分词结果
### 开启HTTP预测服务
```
-python lac_web_service.py jieba_server_model/ lac_workdir 9292
+python lac_web_service.py lac_model/ lac_workdir 9292
```
### 执行HTTP预测
......
@@ -16,7 +16,7 @@
import sys
import time
import requests
-from lac_reader import LACReader
+from paddle_serving_app.reader import LACReader
from paddle_serving_client import Client
from paddle_serving_client.utils import MultiThreadRunner
from paddle_serving_client.utils import benchmark_args
@@ -25,7 +25,7 @@ args = benchmark_args()
def single_func(idx, resource):
-    reader = LACReader("lac_dict")
+    reader = LACReader()
    start = time.time()
    if args.request == "rpc":
        client = Client()
......
wget --no-check-certificate https://paddle-serving.bj.bcebos.com/lac/lac_model_jieba_web.tar.gz
tar -zxvf lac_model_jieba_web.tar.gz
@@ -15,7 +15,7 @@
# pylint: disable=doc-string-missing
from paddle_serving_client import Client
-from lac_reader import LACReader
+from paddle_serving_app.reader import LACReader
import sys
import os
import io
@@ -24,7 +24,7 @@ client = Client()
client.load_client_config(sys.argv[1])
client.connect(["127.0.0.1:9292"])
-reader = LACReader(sys.argv[2])
+reader = LACReader()
for line in sys.stdin:
    if len(line) <= 0:
        continue
@@ -32,4 +32,8 @@ for line in sys.stdin:
    if len(feed_data) <= 0:
        continue
    fetch_map = client.predict(feed={"words": feed_data}, fetch=["crf_decode"])
-    print(fetch_map)
+    begin = fetch_map['crf_decode.lod'][0]
+    end = fetch_map['crf_decode.lod'][1]
+    segs = reader.parse_result(line, fetch_map["crf_decode"][begin:end])
+    print({"word_seg": "|".join(segs)})
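The new client code slices the CRF output using LoD (level-of-detail) offsets: `crf_decode.lod` stores row offsets, and rows `lod[i]:lod[i+1]` belong to the i-th input sentence. A self-contained sketch of that slicing, with made-up tag ids, is:

```python
# Illustration of the LoD slicing added to lac_client.py. All values
# below are made up; a real fetch_map comes from client.predict().
fetch_map = {
    "crf_decode": [4, 5, 0, 1, 2, 3, 3],  # hypothetical tag ids
    "crf_decode.lod": [0, 7],             # one sentence spans rows 0..7
}
begin = fetch_map["crf_decode.lod"][0]
end = fetch_map["crf_decode.lod"][1]
sentence_tags = fetch_map["crf_decode"][begin:end]
print(len(sentence_tags))  # 7
```

With a batch of several sentences the lod list simply holds more offsets, and each sentence is recovered the same way from consecutive offset pairs.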
@@ -14,7 +14,7 @@
from paddle_serving_server.web_service import WebService
import sys
-from lac_reader import LACReader
+from paddle_serving_app.reader import LACReader
class LACService(WebService):
......
@@ -86,7 +86,7 @@ class WebService(object):
            for key in fetch_map:
                fetch_map[key] = fetch_map[key].tolist()
            fetch_map = self.postprocess(
-                feed=feed, fetch=fetch, fetch_map=fetch_map)
+                feed=request.json["feed"], fetch=fetch, fetch_map=fetch_map)
            result = {"result": fetch_map}
        except ValueError:
            result = {"result": "Request Value Error"}
......
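The change above hands `postprocess` the raw request feed rather than the preprocessed tensors, so a subclass can see the original text when decoding model output. A standalone sketch (not the actual `paddle_serving_server.WebService` class; the decode step is a placeholder) of why that matters:

```python
# Standalone sketch, NOT the real WebService: after this commit,
# postprocess receives the raw request feed, so the original sentence
# is available when mapping model output back to words.
class FakeLACService:
    def postprocess(self, feed, fetch, fetch_map):
        words = feed[0]["words"]       # raw text, e.g. "我爱北京天安门"
        # Placeholder decode: a real service would call something like
        # LACReader.parse_result(words, fetch_map["crf_decode"]).
        fetch_map["word_seg"] = words
        return fetch_map

service = FakeLACService()
out = service.postprocess(
    feed=[{"words": "我爱北京天安门"}],
    fetch=["word_seg"],
    fetch_map={"crf_decode": [1, 2, 3]})
print(out["word_seg"])
```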
#!/usr/bin/env bash
set -x
function unsetproxy() {
    HTTP_PROXY_TEMP=$http_proxy
    HTTPS_PROXY_TEMP=$https_proxy
@@ -455,15 +455,16 @@ function python_test_lac() {
    cd lac # pwd: /Serving/python/examples/lac
    case $TYPE in
        CPU)
-            sh get_data.sh
-            check_cmd "python -m paddle_serving_server.serve --model jieba_server_model/ --port 9292 &"
+            python -m paddle_serving_app.package --get_model lac
+            tar -xzvf lac.tar.gz
+            check_cmd "python -m paddle_serving_server.serve --model lac_model/ --port 9292 &"
            sleep 5
-            check_cmd "echo \"我爱北京天安门\" | python lac_client.py jieba_client_conf/serving_client_conf.prototxt lac_dict/"
+            check_cmd "echo \"我爱北京天安门\" | python lac_client.py lac_client/serving_client_conf.prototxt"
            echo "lac CPU RPC inference pass"
            kill_server_process
            unsetproxy # the proxy may be set on iPipe, which makes the web test fail
-            check_cmd "python lac_web_service.py jieba_server_model/ lac_workdir 9292 &"
+            check_cmd "python lac_web_service.py lac_model/ lac_workdir 9292 &"
            sleep 5
            check_cmd "curl -H \"Content-Type:application/json\" -X POST -d '{\"feed\":[{\"words\": \"我爱北京天安门\"}], \"fetch\":[\"word_seg\"]}' http://127.0.0.1:9292/lac/prediction"
            # check http code
......