Commit 9a8b21a5 authored by B bjjwwang

Merge branch 'develop' of https://github.com/paddlepaddle/serving into develop

......@@ -22,7 +22,7 @@ cmake -DPYTHON_INCLUDE_DIR=/usr/include/python3.7m/ \
-DSERVER=ON ..
make -j10
```
-You can run `make install` to produce the target in the `./output` directory. Add `-DCMAKE_INSTALL_PREFIX=./output` to the CMake command shown above to specify the output path.
+You can run `make install` to produce the target in the `./output` directory. Add `-DCMAKE_INSTALL_PREFIX=./output` to the CMake command shown above to specify the output path. Please specify `-DWITH_MKL=ON` on Intel CPU platforms with AVX2 support.
* Compile the Serving Client
```
mkdir -p client-build-arm && cd client-build-arm
......
......@@ -70,7 +70,8 @@ def single_func(idx, resource):
os.getpid(),
int(round(b_start * 1000000)),
int(round(b_end * 1000000))))
-result = client.predict(feed=feed_batch, fetch=fetch)
+result = client.predict(
+    feed=feed_batch, fetch=fetch, batch=True)
l_end = time.time()
if latency_flags:
......
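The benchmark hunk above records `os.getpid()` together with start and end timestamps converted from `time.time()` seconds to integer microseconds via `int(round(t * 1000000))`. A minimal standalone sketch of that conversion (the helper name `to_microseconds` is mine, not from the source):

```python
import time

def to_microseconds(t: float) -> int:
    """Convert a time.time()-style float (seconds) to integer
    microseconds, as done in the benchmark code above."""
    return int(round(t * 1000000))

b_start = time.time()
# ... the timed predict call would go here ...
b_end = time.time()

start_us = to_microseconds(b_start)
end_us = to_microseconds(b_end)
print(end_us - start_us)  # elapsed time in microseconds
```

Storing integer microseconds rather than float seconds keeps the logged latencies compact and avoids float-formatting noise when post-processing the benchmark output.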
......@@ -26,7 +26,7 @@ this script will download Chinese Dictionary File vocab.txt and Chinese Sample D
### Start Service
```
-python3 bert_web_service.py serving_server 7703
+python3 -m paddle_serving_server.serve --model serving_server --port 7703 --use_lite --use_xpu --ir_optim
```
### Client Prediction
......
......@@ -31,7 +31,7 @@ client.connect(endpoint_list)
for line in sys.stdin:
feed_dict = reader.process(line)
for key in feed_dict.keys():
-feed_dict[key] = np.array(feed_dict[key]).reshape((128, 1))
+feed_dict[key] = np.array(feed_dict[key]).reshape((1, 128))
#print(feed_dict)
-result = client.predict(feed=feed_dict, fetch=fetch, batch=False)
+result = client.predict(feed=feed_dict, fetch=fetch, batch=True)
print(result)
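The hunk above changes each feed tensor from shape `(128, 1)` to `(1, 128)`, i.e. from 128 single-element rows to one batch entry of 128 values, consistent with passing `batch=True`. A numpy-only sketch of the two layouts (the `input_ids` list is a hypothetical stand-in for one entry of the `feed_dict` produced by `reader.process()`):

```python
import numpy as np

# Hypothetical tokenized input of 128 ids, standing in for one feed tensor.
input_ids = list(range(128))

# Old shape in the diff: 128 rows of 1 element each.
col = np.array(input_ids).reshape((128, 1))

# New shape: a single batch entry of 128 elements.
row = np.array(input_ids).reshape((1, 128))

print(col.shape, row.shape)  # (128, 1) (1, 128)
```

With the batch dimension leading, stacking several such rows along axis 0 yields a `(batch_size, 128)` tensor, which is the conventional layout for batched inference.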