Commit bd42658a (PaddlePaddle/Serving)

Merge pull request #604 from MRXLT/0.2.2-web-service

fix app && web service && update version

Authored by MRXLT on May 26, 2020; committed via GitHub on May 26, 2020.
Parents: 6dc83a9f, d55a0062

Showing 29 changed files with 81 additions and 82 deletions (+81 -82)
Changed files:

doc/DESIGN_DOC_CN.md  +1 -1
python/examples/bert/benchmark.py  +2 -2
python/examples/bert/bert_client.py  +1 -1
python/examples/bert/bert_web_service.py  +4 -4
python/examples/imagenet/resnet50_web_service.py  +2 -2
python/examples/imdb/benchmark.py  +1 -1
python/examples/imdb/test_client.py  +1 -1
python/examples/imdb/text_classify_service.py  +3 -3
python/examples/lac/lac_web_service.py  +2 -2
python/examples/resnet_v2_50/resnet50_debug.py  +1 -1
python/examples/senta/get_data.sh  +1 -1
python/examples/senta/senta_web_service.py  +20 -32
python/paddle_serving_app/README.md  +1 -1
python/paddle_serving_app/README_CN.md  +1 -1
python/paddle_serving_app/__init__.py  +0 -6
python/paddle_serving_app/models/model_list.py  +1 -1
python/paddle_serving_app/reader/__init__.py  +4 -0
python/paddle_serving_app/reader/lac_reader.py  +7 -1
python/paddle_serving_app/reader/senta_reader.py  +9 -1
python/paddle_serving_app/version.py  +1 -1
python/paddle_serving_client/__init__.py  +3 -2
python/paddle_serving_client/io/__init__.py  +0 -1
python/paddle_serving_client/version.py  +3 -3
python/paddle_serving_server/serve.py  +1 -1
python/paddle_serving_server/version.py  +3 -3
python/paddle_serving_server/web_service.py  +2 -2
python/paddle_serving_server_gpu/serve.py  +1 -1
python/paddle_serving_server_gpu/version.py  +3 -3
python/paddle_serving_server_gpu/web_service.py  +2 -3

doc/DESIGN_DOC_CN.md

@@ -26,7 +26,7 @@ serving_io.save_model("serving_model", "client_conf",
                       {"words": data}, {"prediction": prediction},
                       fluid.default_main_program())
 ```
-In this code example, `{"words": data}` and `{"prediction": prediction}` specify the model input and output; `"words"` and `"prediction"` are aliases of the output and output variables. Aliases exist so that developers can remember which fields correspond to the inputs and outputs of the model they trained. `data` and `prediction` are `[Variable](https://www.paddlepaddle.org.cn/documentation/docs/zh/api_cn/fluid_cn/Variable_cn.html#variable)`s from Paddle training, usually representing a tensor ([Tensor](https://www.paddlepaddle.org.cn/documentation/docs/zh/api_cn/fluid_cn/Tensor_cn.html#tensor)) or a variable-length tensor ([LodTensor](https://www.paddlepaddle.org.cn/documentation/docs/zh/beginners_guide/basic_concept/lod_tensor.html#lodtensor)). After the save command is called, two directories are generated according to the user-specified `"serving_model"` and `"client_conf"`, with the following contents:
+In this code example, `{"words": data}` and `{"prediction": prediction}` specify the model input and output; `"words"` and `"prediction"` are aliases of the input and output variables. Aliases exist so that developers can remember which fields correspond to the inputs and outputs of the model they trained. `data` and `prediction` are `[Variable](https://www.paddlepaddle.org.cn/documentation/docs/zh/api_cn/fluid_cn/Variable_cn.html#variable)`s from Paddle training, usually representing a tensor ([Tensor](https://www.paddlepaddle.org.cn/documentation/docs/zh/api_cn/fluid_cn/Tensor_cn.html#tensor)) or a variable-length tensor ([LodTensor](https://www.paddlepaddle.org.cn/documentation/docs/zh/beginners_guide/basic_concept/lod_tensor.html#lodtensor)). After the save command is called, two directories are generated according to the user-specified `"serving_model"` and `"client_conf"`, with the following contents:
 ``` shell
 .
 ├── client_conf
 ...

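The alias mechanism above is easiest to see with a concrete `save_model` call. Below is a minimal, hedged sketch: the one-layer network and the variable names are placeholders, not part of this commit, and only the call shape mirrors the hunk header above.

```python
import paddle.fluid as fluid
import paddle_serving_client.io as serving_io

# Placeholder network: "words" / "prediction" are the aliases the client will
# use in its feed/fetch dicts; data / prediction are the underlying Variables.
data = fluid.data(name="words", shape=[None, 13], dtype="float32")
prediction = fluid.layers.fc(input=data, size=1)

# Initialize parameters so the save step has concrete values to persist.
exe = fluid.Executor(fluid.CPUPlace())
exe.run(fluid.default_startup_program())

serving_io.save_model("serving_model", "client_conf",
                      {"words": data}, {"prediction": prediction},
                      fluid.default_main_program())
```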
python/examples/bert/benchmark.py

@@ -26,7 +26,7 @@ from batching import pad_batch_data
 import tokenization
 import requests
 import json
-from bert_reader import BertReader
+from paddle_serving_app.reader import ChineseBertReader

 args = benchmark_args()
@@ -37,7 +37,7 @@ def single_func(idx, resource):
         for line in fin:
             dataset.append(line.strip())
     if args.request == "rpc":
-        reader = BertReader(vocab_file="vocab.txt", max_seq_len=20)
+        reader = ChineseBertReader(vocab_file="vocab.txt", max_seq_len=20)
         fetch = ["pooled_output"]
         client = Client()
         client.load_client_config(args.model)

python/examples/bert/bert_client.py

@@ -25,7 +25,7 @@ from paddlehub.common.logger import logger
 import socket
 from paddle_serving_client import Client
 from paddle_serving_client.utils import benchmark_args
-from paddle_serving_app import ChineseBertReader
+from paddle_serving_app.reader import ChineseBertReader

 args = benchmark_args()

python/examples/bert/bert_web_service.py

@@ -14,14 +14,14 @@
 # limitations under the License.
 # pylint: disable=doc-string-missing
 from paddle_serving_server_gpu.web_service import WebService
-from bert_reader import BertReader
+from paddle_serving_app.reader import ChineseBertReader
 import sys
 import os

 class BertService(WebService):
     def load(self):
-        self.reader = BertReader(vocab_file="vocab.txt", max_seq_len=128)
+        self.reader = ChineseBertReader(vocab_file="vocab.txt", max_seq_len=128)

     def preprocess(self, feed=[], fetch=[]):
         feed_res = [
@@ -37,5 +37,5 @@ gpu_ids = os.environ["CUDA_VISIBLE_DEVICES"]
 bert_service.set_gpus(gpu_ids)
 bert_service.prepare_server(
     workdir="workdir", port=int(sys.argv[2]), device="gpu")
-bert_service.run_server()
-bert_service.run_flask()
+bert_service.run_rpc_service()
+bert_service.run_web_service()

python/examples/imagenet/resnet50_web_service.py

@@ -68,5 +68,5 @@ if device == "gpu":
     image_service.set_gpus("0,1")
 image_service.prepare_server(
     workdir="workdir", port=int(sys.argv[3]), device=device)
-image_service.run_server()
-image_service.run_flask()
+image_service.run_rpc_service()
+image_service.run_web_service()

python/examples/imdb/benchmark.py

@@ -16,7 +16,7 @@
 import sys
 import time
 import requests
-from paddle_serving_app import IMDBDataset
+from paddle_serving_app.reader import IMDBDataset
 from paddle_serving_client import Client
 from paddle_serving_client.utils import MultiThreadRunner
 from paddle_serving_client.utils import benchmark_args

python/examples/imdb/test_client.py

@@ -13,7 +13,7 @@
 # limitations under the License.
 # pylint: disable=doc-string-missing
 from paddle_serving_client import Client
-from paddle_serving_app import IMDBDataset
+from paddle_serving_app.reader import IMDBDataset
 import sys

 client = Client()

python/examples/imdb/text_classify_service.py

@@ -14,7 +14,7 @@
 # pylint: disable=doc-string-missing
 from paddle_serving_server.web_service import WebService
-from paddle_serving_app import IMDBDataset
+from paddle_serving_app.reader import IMDBDataset
 import sys
@@ -37,5 +37,5 @@ imdb_service.load_model_config(sys.argv[1])
 imdb_service.prepare_server(
     workdir=sys.argv[2], port=int(sys.argv[3]), device="cpu")
 imdb_service.prepare_dict({"dict_file_path": sys.argv[4]})
-imdb_service.run_server()
-imdb_service.run_flask()
+imdb_service.run_rpc_service()
+imdb_service.run_web_service()

python/examples/lac/lac_web_service.py

@@ -47,5 +47,5 @@ lac_service.load_model_config(sys.argv[1])
 lac_service.load_reader()
 lac_service.prepare_server(
     workdir=sys.argv[2], port=int(sys.argv[3]), device="cpu")
-lac_service.run_server()
-lac_service.run_flask()
+lac_service.run_rpc_service()
+lac_service.run_web_service()

python/examples/resnet_v2_50/resnet50_debug.py

@@ -14,7 +14,7 @@
 from paddle_serving_app.reader import Sequential, File2Image, Resize, CenterCrop
 from paddle_serving_app.reader import RGB2BGR, Transpose, Div, Normalize
-from paddle_serving_app import Debugger
+from paddle_serving_app.local_predict import Debugger
 import sys

 debugger = Debugger()

python/examples/senta/get_data.sh

 wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/SentimentAnalysis/senta_bilstm.tar.gz --no-check-certificate
 tar -xzvf senta_bilstm.tar.gz
-wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/LexicalAnalysis/lac_model.tar.gz --no-check-certificate
+wget https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/LexicalAnalysis/lac.tar.gz --no-check-certificate
 tar -xzvf lac_model.tar.gz
 wget https://paddle-serving.bj.bcebos.com/reader/lac/lac_dict.tar.gz --no-check-certificate
 tar -xzvf lac_dict.tar.gz

python/examples/senta/senta_web_service.py

@@ -14,13 +14,10 @@
 from paddle_serving_server_gpu.web_service import WebService
 from paddle_serving_client import Client
-from paddle_serving_app import LACReader, SentaReader
-import numpy as np
+from paddle_serving_app.reader import LACReader, SentaReader
 import os
-import io
 import sys
-import subprocess
-from multiprocessing import Process
+from multiprocessing import Process, Queue

 class SentaService(WebService):
@@ -33,10 +30,6 @@ class SentaService(WebService):
         self.lac_client_config_path = lac_model_path + "/serving_server_conf.prototxt"
         self.lac_dict_path = lac_dict_path
         self.senta_dict_path = senta_dict_path
-        self.show = False
-
-    def show_detail(self, show=False):
-        self.show = show

     def start_lac_service(self):
         if not os.path.exists('./lac_serving'):
@@ -64,34 +57,29 @@ class SentaService(WebService):
         self.lac_client.connect(["127.0.0.1:{}".format(self.lac_port)])

     def init_lac_reader(self):
-        self.lac_reader = LACReader(self.lac_dict_path)
+        self.lac_reader = LACReader()

     def init_senta_reader(self):
-        self.senta_reader = SentaReader(vocab_path=self.senta_dict_path)
+        self.senta_reader = SentaReader()

     def preprocess(self, feed=[], fetch=[]):
-        feed_data = self.lac_reader.process(feed[0]["words"])
-        if self.show:
-            print("---- lac reader ----")
-            print(feed_data)
-        lac_result = self.lac_predict(feed_data)
-        if self.show:
-            print("---- lac out ----")
-            print(lac_result)
-        segs = self.lac_reader.parse_result(feed[0]["words"],
-                                            lac_result["crf_decode"])
-        if self.show:
-            print("---- lac parse ----")
-            print(segs)
-        feed_data = self.senta_reader.process(segs)
-        if self.show:
-            print("---- senta reader ----")
-            print("feed_data", feed_data)
-        return [{"words": feed_data}], fetch
+        feed_data = [{
+            "words": self.lac_reader.process(x["words"])
+        } for x in feed]
+        lac_result = self.lac_client.predict(
+            feed=feed_data, fetch=["crf_decode"])
+        feed_batch = []
+        result_lod = lac_result["crf_decode.lod"]
+        for i in range(len(feed)):
+            segs = self.lac_reader.parse_result(
+                feed[i]["words"],
+                lac_result["crf_decode"][result_lod[i]:result_lod[i + 1]])
+            feed_data = self.senta_reader.process(segs)
+            feed_batch.append({"words": feed_data})
+        return feed_batch, fetch

 senta_service = SentaService(name="senta")
-#senta_service.show_detail(True)
 senta_service.set_config(
     lac_model_path="./lac_model",
     lac_dict_path="./lac_dict",
@@ -102,5 +90,5 @@ senta_service.prepare_server(
 senta_service.init_lac_reader()
 senta_service.init_senta_reader()
 senta_service.init_lac_service()
-senta_service.run_server()
-senta_service.run_flask()
+senta_service.run_rpc_service()
+senta_service.run_web_service()

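The rewritten `preprocess` sends the whole batch to the LAC service in one call and then uses the `crf_decode.lod` offsets to carve the flattened result back into per-input segments. A tiny self-contained illustration of that slicing, with made-up numbers (the real shapes and tag ids come from the LAC model):

```python
# Hypothetical flattened LAC output for a batch of two sentences.
# lod offsets [0, 3, 7] say rows 0:3 belong to input 0 and rows 3:7 to input 1.
lac_result = {
    "crf_decode": [[2], [4], [1], [9], [3], [3], [5]],  # toy tag ids
    "crf_decode.lod": [0, 3, 7],
}

result_lod = lac_result["crf_decode.lod"]
for i in range(len(result_lod) - 1):
    per_input = lac_result["crf_decode"][result_lod[i]:result_lod[i + 1]]
    print(i, per_input)  # 0 -> first 3 rows, 1 -> remaining 4 rows
```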
python/paddle_serving_app/README.md

@@ -158,7 +158,7 @@ Therefore, a local prediction tool is built into the paddle_serving_app, which i
 Taking [fit_a_line prediction service](../examples/fit_a_line) as an example, the following code can be used to run local prediction.

 ```python
-from paddle_serving_app import Debugger
+from paddle_serving_app.local_predict import Debugger
 import numpy as np

 debugger = Debugger()

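For context, a hedged sketch of how the relocated `Debugger` is typically driven for the fit_a_line example referenced above. The model directory name, the `gpu` flag, and the `"x"`/`"price"` aliases are assumptions taken from the fit_a_line example, not from this diff:

```python
from paddle_serving_app.local_predict import Debugger
import numpy as np

debugger = Debugger()
# Assumed: "uci_housing_model" is the saved serving-model directory of fit_a_line.
debugger.load_model_config("uci_housing_model", gpu=False)

data = np.array([0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0501, -0.0784,
                 -0.0295, 0.0618, -0.1446, 0.0863, 0.0145, -0.0529])
fetch_map = debugger.predict(feed={"x": data}, fetch=["price"])
print(fetch_map)
```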
python/paddle_serving_app/README_CN.md

@@ -147,7 +147,7 @@ The server-side prediction op of Paddle Serving uses the Paddle inference framework; when deploying
 Taking the [fit_a_line prediction service](../examples/fit_a_line) as an example, local prediction can be run with the following code.

 ```python
-from paddle_serving_app import Debugger
+from paddle_serving_app.local_predict import Debugger
 import numpy as np

 debugger = Debugger()

python/paddle_serving_app/__init__.py

@@ -11,10 +11,4 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from .reader.chinese_bert_reader import ChineseBertReader
-from .reader.image_reader import ImageReader, File2Image, URL2Image, Sequential, Normalize, CenterCrop, Resize, PadStride
-from .reader.lac_reader import LACReader
-from .reader.senta_reader import SentaReader
-from .reader.imdb_reader import IMDBDataset
 from .models import ServingModels
-from .local_predict import Debugger

python/paddle_serving_app/models/model_list.py

@@ -37,7 +37,7 @@ class ServingModels(object):
         object_detection_url = "https://paddle-serving.bj.bcebos.com/paddle_hub_models/image/ObjectDetection/"
         senta_url = "https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/SentimentAnalysis/"
         semantic_url = "https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/SemanticRepresentation/"
-        wordseg_url = "https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/ChineseWordSegmentation/"
+        wordseg_url = "https://paddle-serving.bj.bcebos.com/paddle_hub_models/text/LexicalAnalysis/"

         self.url_dict = {}

python/paddle_serving_app/reader/__init__.py

@@ -11,4 +11,8 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+from .chinese_bert_reader import ChineseBertReader
 from .image_reader import ImageReader, File2Image, URL2Image, Sequential, Normalize, CenterCrop, Resize, Transpose, Div, RGB2BGR, BGR2RGB, RCNNPostprocess, SegPostprocess, PadStride
+from .lac_reader import LACReader
+from .senta_reader import SentaReader
+from .imdb_reader import IMDBDataset

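With these re-exports in place, every preprocessing helper touched by this commit is importable from the single `paddle_serving_app.reader` namespace, which is what the updated example scripts above rely on:

```python
# Text readers and image ops now all come from paddle_serving_app.reader.
from paddle_serving_app.reader import ChineseBertReader, LACReader, SentaReader, IMDBDataset
from paddle_serving_app.reader import Sequential, File2Image, Resize, CenterCrop, RGB2BGR, Transpose, Div, Normalize
```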
python/paddle_serving_app/reader/lac_reader.py

@@ -48,10 +48,16 @@ def load_kv_dict(dict_path,
 class LACReader(object):
     """data reader"""

-    def __init__(self, dict_folder):
+    def __init__(self, dict_folder=""):
         # read dict
         #basepath = os.path.abspath(__file__)
         #folder = os.path.dirname(basepath)
+        if dict_folder == "":
+            dict_folder = "lac_dict"
+            if not os.path.exists(dict_folder):
+                r = os.system(
+                    "wget https://paddle-serving.bj.bcebos.com/reader/lac/lac_dict.tar.gz --no-check-certificate && tar -xzvf lac_dict.tar.gz"
+                )
         word_dict_path = os.path.join(dict_folder, "word.dic")
         label_dict_path = os.path.join(dict_folder, "tag.dic")
         replace_dict_path = os.path.join(dict_folder, "q2b.dic")

python/paddle_serving_app/reader/senta_reader.py

@@ -14,10 +14,11 @@
 import sys
 import io
+import os

 class SentaReader():
-    def __init__(self, vocab_path, max_seq_len=20):
+    def __init__(self, vocab_path="", max_seq_len=20):
         self.max_seq_len = max_seq_len
         self.word_dict = self.load_vocab(vocab_path)
@@ -25,6 +26,13 @@ class SentaReader():
         """
         load the given vocabulary
         """
+        if vocab_path == "":
+            vocab_path = "senta_vocab.txt"
+            if not os.path.exists(vocab_path):
+                r = os.system(
+                    " wget https://paddle-serving.bj.bcebos.com/reader/senta/senta_vocab.txt --no-check-certificate"
+                )
         vocab = {}
         with io.open(vocab_path, 'r', encoding='utf8') as f:
             for line in f:

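Together with the lac_reader.py change above, both readers can now be constructed with no arguments; a short hedged sketch (on first use they fall back to local `lac_dict` / `senta_vocab.txt` copies and download them from the BCEBOS URLs shown in the diffs if missing):

```python
from paddle_serving_app.reader import LACReader, SentaReader

lac_reader = LACReader()      # downloads and unpacks lac_dict if absent
senta_reader = SentaReader()  # downloads senta_vocab.txt if absent

# process() turns a raw sentence into LAC input ids, as in the senta
# web service example earlier in this commit.
word_ids = lac_reader.process("今天天气不错")
```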
python/paddle_serving_app/version.py

@@ -12,4 +12,4 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 """ Paddle Serving App version string """
-serving_app_version = "0.0.3"
+serving_app_version = "0.1.0"

python/paddle_serving_client/__init__.py

@@ -207,8 +207,9 @@ class Client(object):
                         key))
             if type(feed[key]).__module__ == np.__name__ and np.size(feed[
                     key]) != self.feed_tensor_len[key]:
-                raise SystemExit("The shape of feed tensor {} not match.".format(
-                    key))
+                #raise SystemExit("The shape of feed tensor {} not match.".format(
+                #    key))
+                pass

     def predict(self, feed=None, fetch=None, need_variant_tag=False):
         self.profile_.record('py_prepro_0')

python/paddle_serving_client/io/__init__.py

@@ -33,7 +33,6 @@ def save_model(server_model_folder,
     executor = Executor(place=CPUPlace())

     feed_var_names = [feed_var_dict[x].name for x in feed_var_dict]
-    #target_vars = list(fetch_var_dict.values())
     target_vars = []
     target_var_names = []
     for key in sorted(fetch_var_dict.keys()):

python/paddle_serving_client/version.py

@@ -12,6 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 """ Paddle Serving Client version string """
-serving_client_version = "0.2.2"
-serving_server_version = "0.2.2"
-module_proto_version = "0.2.2"
+serving_client_version = "0.3.0"
+serving_server_version = "0.3.0"
+module_proto_version = "0.3.0"

python/paddle_serving_server/serve.py

@@ -103,7 +103,7 @@ if __name__ == "__main__":
         service.load_model_config(args.model)
         service.prepare_server(
             workdir=args.workdir, port=args.port, device=args.device)
-        service.run_server()
+        service.run_rpc_service()

         app_instance = Flask(__name__)

python/paddle_serving_server/version.py

@@ -12,6 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 """ Paddle Serving Client version string """
-serving_client_version = "0.2.2"
-serving_server_version = "0.2.2"
-module_proto_version = "0.2.2"
+serving_client_version = "0.3.0"
+serving_server_version = "0.3.0"
+module_proto_version = "0.3.0"

python/paddle_serving_server/web_service.py

@@ -92,7 +92,7 @@ class WebService(object):
             result = {"result": "Request Value Error"}
         return result

-    def run_server(self):
+    def run_rpc_service(self):
         import socket
         localIP = socket.gethostbyname(socket.gethostname())
         print("web service address:")
@@ -115,7 +115,7 @@ class WebService(object):
         self.app_instance = app_instance

-    def run_flask(self):
+    def run_web_service(self):
         self.app_instance.run(host="0.0.0.0",
                               port=self.port,
                               threaded=False,

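These two renames are what the example scripts earlier in this commit were updated for. A minimal hedged sketch of the new launch sequence for a CPU service; the service name, port, and model path are placeholders, and the `WebService(name=...)` constructor is assumed from the examples in this commit rather than shown in this hunk:

```python
from paddle_serving_server.web_service import WebService
import sys

service = WebService(name="demo")            # placeholder service name
service.load_model_config(sys.argv[1])       # path to a saved serving model
service.prepare_server(workdir="workdir", port=9393, device="cpu")
service.run_rpc_service()   # formerly run_server(): starts the RPC backend
service.run_web_service()   # formerly run_flask(): starts the Flask HTTP front end
```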
python/paddle_serving_server_gpu/serve.py

@@ -118,7 +118,7 @@ if __name__ == "__main__":
         web_service.set_gpus(gpu_ids)
         web_service.prepare_server(
             workdir=args.workdir, port=args.port, device=args.device)
-        web_service.run_server()
+        web_service.run_rpc_service()

         app_instance = Flask(__name__)

python/paddle_serving_server_gpu/version.py

@@ -12,6 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 """ Paddle Serving Client version string """
-serving_client_version = "0.2.2"
-serving_server_version = "0.2.2"
-module_proto_version = "0.2.2"
+serving_client_version = "0.3.0"
+serving_server_version = "0.3.0"
+module_proto_version = "0.3.0"

python/paddle_serving_server_gpu/web_service.py

@@ -133,12 +133,11 @@ class WebService(object):
                 result = self.postprocess(
                     feed=feed, fetch=fetch, fetch_map=fetch_map)
                 result = {"result": result}
-                result = {"result": fetch_map}
             except ValueError:
                 result = {"result": "Request Value Error"}
             return result

-    def run_server(self):
+    def run_rpc_service(self):
         import socket
         localIP = socket.gethostbyname(socket.gethostname())
         print("web service address:")
@@ -165,7 +164,7 @@ class WebService(object):
         self.app_instance = app_instance

-    def run_flask(self):
+    def run_web_service(self):
         self.app_instance.run(host="0.0.0.0",
                               port=self.port,
                               threaded=False,