Unverified commit 6be73cbf authored by TeslaZhao and committed by GitHub

Merge pull request #45 from PaddlePaddle/develop

Sync
......@@ -22,7 +22,7 @@ cmake -DPYTHON_INCLUDE_DIR=/usr/include/python3.7m/ \
-DSERVER=ON ..
make -j10
```
You can run `make install` to produce the build artifacts in the `./output` directory. Add `-DCMAKE_INSTALL_PREFIX=./output` to the CMake command shown above to specify the output path.
You can run `make install` to produce the build artifacts in the `./output` directory. Add `-DCMAKE_INSTALL_PREFIX=./output` to the CMake command shown above to specify the output path. Please specify `-DWITH_MKL=ON` on Intel CPU platforms with AVX2 support.
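For illustration only, a minimal sketch of the same server build with both options added; the elided `-DPYTHON_*` flags from the command above are assumed to stay unchanged:
```
cmake -DPYTHON_INCLUDE_DIR=/usr/include/python3.7m/ \
    -DCMAKE_INSTALL_PREFIX=./output \
    -DWITH_MKL=ON \
    -DSERVER=ON ..
make -j10
make install    # build artifacts are placed under ./output
```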
* Compile the Serving Client
```
mkdir -p client-build-arm && cd client-build-arm
......
......@@ -27,38 +27,6 @@ mvn compile
mvn install
```
### Start the server (non-pipeline)
Take the fit_a_line model as an example; start the server:
```
cd ../../python/examples/fit_a_line
sh get_data.sh
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9393 --use_multilang &
```
Client prediction
```
cd ../../../java/examples/target
java -cp paddle-serving-sdk-java-examples-0.0.1-jar-with-dependencies.jar PaddleServingClientExample fit_a_line
```
Take yolov4 as an example; start the server:
```
python -m paddle_serving_app.package --get_model yolov4
tar -xzvf yolov4.tar.gz
python -m paddle_serving_server_gpu.serve --model yolov4_model --port 9393 --gpu_ids 0 --use_multilang & # This must be run inside a GPU Docker container; otherwise, use the CPU startup method.
```
Client prediction
```
# in /Serving/java/examples/target
java -cp paddle-serving-sdk-java-examples-0.0.1-jar-with-dependencies.jar PaddleServingClientExample yolov4 ../../../python/examples/yolov4/000000570688.jpg
# The yolov4 example requires an image to be specified as input
```
### Start the server (pipeline)
For input data of type string, take the IMDB model ensemble as an example; start the server
......
......@@ -27,40 +27,6 @@ mvn compile
mvn install
```
### Start the server (non-pipeline)
Take the fit_a_line model as an example; start the server:
```
cd ../../python/examples/fit_a_line
sh get_data.sh
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9393 --use_multilang &
```
Client prediction
```
cd ../../../java/examples/target
java -cp paddle-serving-sdk-java-examples-0.0.1-jar-with-dependencies.jar PaddleServingClientExample fit_a_line
```
Take yolov4 as an example; start the server:
```
python -m paddle_serving_app.package --get_model yolov4
tar -xzvf yolov4.tar.gz
python -m paddle_serving_server_gpu.serve --model yolov4_model --port 9393 --gpu_ids 0 --use_multilang & # This must be run inside a GPU Docker container; otherwise, use the CPU startup method.
```
Client prediction
```
# in /Serving/java/examples/target
java -cp paddle-serving-sdk-java-examples-0.0.1-jar-with-dependencies.jar PaddleServingClientExample yolov4 ../../../python/examples/yolov4/000000570688.jpg
# The yolov4 example requires an image to be specified as input
```
### Start the server (pipeline)
For input data of type string, take the IMDB model ensemble as an example; start the server
......
......@@ -70,7 +70,8 @@ def single_func(idx, resource):
os.getpid(),
int(round(b_start * 1000000)),
int(round(b_end * 1000000))))
result = client.predict(feed=feed_batch, fetch=fetch)
result = client.predict(
feed=feed_batch, fetch=fetch, batch=True)
l_end = time.time()
if latency_flags:
......
......@@ -26,7 +26,7 @@ this script will download Chinese Dictionary File vocab.txt and Chinese Sample D
### Start Service
```
python3 bert_web_service.py serving_server 7703
python3 -m paddle_serving_server.serve --model serving_server --port 7703 --use_lite --use_xpu --ir_optim
```
### Client Prediction
......
......@@ -31,7 +31,7 @@ client.connect(endpoint_list)
for line in sys.stdin:
feed_dict = reader.process(line)
for key in feed_dict.keys():
feed_dict[key] = np.array(feed_dict[key]).reshape((128, 1))
feed_dict[key] = np.array(feed_dict[key]).reshape((1, 128))
#print(feed_dict)
result = client.predict(feed=feed_dict, fetch=fetch, batch=False)
result = client.predict(feed=feed_dict, fetch=fetch, batch=True)
print(result)
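Since the client above reads its input from stdin (`for line in sys.stdin`), a hedged sketch of how it might be driven is shown below; the script name, sample-data file, and client config path are assumptions for illustration, not taken from this diff:
```
# hypothetical file names -- substitute the actual client script, data file and client config
head data-c.txt | python3 bert_client.py --model serving_client/serving_client_conf.prototxt
```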
# An image for building Paddle binaries
# Use the cuda devel base image for both the cpu and gpu environments
# When you modify it, please be aware of the cudnn runtime version
FROM hub.baidubce.com/ctr/cuda:11.0-cudnn8-devel-ubuntu18.04
FROM nvidia/cuda:11.2.0-cudnn8-devel-ubuntu16.04
MAINTAINER PaddlePaddle Authors <paddle-dev@baidu.com>
# ENV variables
......@@ -10,40 +10,45 @@ ARG WITH_AVX
ENV WITH_GPU=${WITH_GPU:-ON}
ENV WITH_AVX=${WITH_AVX:-ON}
ENV DEBIAN_FRONTEND=noninteractive
ENV LD_LIBRARY_PATH=/usr/local/cuda-11.0/targets/x86_64-linux/lib:$LD_LIBRARY_PATH
ENV HOME /root
# Add bash enhancements
COPY tools/dockerfile/scripts/root/ /root/
COPY tools/dockerfiles/root/ /root/
# Prepare packages for Python
RUN apt-get update && \
apt-get install -y software-properties-common && add-apt-repository ppa:deadsnakes/ppa && \
apt-get update && \
apt-get install -y curl wget vim git unzip unrar tar xz-utils libssl-dev bzip2 gzip \
coreutils ntp language-pack-zh-hans python-qt4 libsm6 libxext6 libxrender-dev libgl1-mesa-glx \
bison graphviz libjpeg-dev zlib1g-dev automake locales swig net-tools libtool module-init-tools libcurl4-openssl-dev libffi-dev
apt-get install -y make build-essential libssl-dev zlib1g-dev libbz2-dev \
libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev libncursesw5-dev \
xz-utils tk-dev libffi-dev liblzma-dev
RUN ln -s /usr/lib/x86_64-linux-gnu/libssl.so.1.0.0 /usr/lib/libssl.so.10 && \
ln -s /usr/lib/x86_64-linux-gnu/libcrypto.so.1.0.0 /usr/lib/libcrypto.so.10
RUN apt-get update && \
apt-get install -y --allow-downgrades --allow-change-held-packages \
patchelf git python-pip python-dev python-opencv openssh-server bison \
wget unzip unrar tar xz-utils bzip2 gzip coreutils ntp \
curl sed grep graphviz libjpeg-dev zlib1g-dev \
python-matplotlib unzip \
automake locales clang-format swig \
liblapack-dev liblapacke-dev libcurl4-openssl-dev \
net-tools libtool module-init-tools vim && \
apt-get clean -y
RUN ln -s /usr/lib/x86_64-linux-gnu/libssl.so /usr/lib/libssl.so.10 && \
ln -s /usr/lib/x86_64-linux-gnu/libcrypto.so /usr/lib/libcrypto.so.10
RUN wget https://github.com/koalaman/shellcheck/releases/download/v0.7.1/shellcheck-v0.7.1.linux.x86_64.tar.xz -O shellcheck-v0.7.1.linux.x86_64.tar.xz && \
tar -xf shellcheck-v0.7.1.linux.x86_64.tar.xz && cp shellcheck-v0.7.1/shellcheck /usr/bin/shellcheck && \
rm -rf shellcheck-v0.7.1.linux.x86_64.tar.xz shellcheck-v0.7.1
# Downgrade gcc&&g++
WORKDIR /usr/bin
COPY tools/dockerfile/build_scripts /build_scripts
RUN bash /build_scripts/install_trt.sh
RUN bash /build_scripts/install_gcc.sh gcc82 && rm -rf /build_scripts
RUN cp gcc gcc.bak && cp g++ g++.bak && rm gcc && rm g++
RUN ln -s /usr/local/gcc-8.2/bin/gcc /usr/local/bin/gcc
RUN ln -s /usr/local/gcc-8.2/bin/g++ /usr/local/bin/g++
RUN ln -s /usr/local/gcc-8.2/bin/gcc /usr/bin/gcc
RUN ln -s /usr/local/gcc-8.2/bin/g++ /usr/bin/g++
ENV PATH=/usr/local/gcc-8.2/bin:$PATH
# install cmake
WORKDIR /home
RUN wget -q https://cmake.org/files/v3.16/cmake-3.16.0-Linux-x86_64.tar.gz && tar -zxvf cmake-3.16.0-Linux-x86_64.tar.gz && rm cmake-3.16.0-Linux-x86_64.tar.gz
ENV PATH=/home/cmake-3.16.0-Linux-x86_64/bin:$PATH
COPY tools/dockerfiles/build_scripts /build_scripts
RUN bash /build_scripts/install_gcc.sh gcc82 && rm -rf /build_scripts
RUN cp gcc gcc.bak && cp g++ g++.bak && rm gcc && rm g++
RUN ln -s /usr/local/gcc-8.2/bin/gcc /usr/local/bin/gcc
RUN ln -s /usr/local/gcc-8.2/bin/g++ /usr/local/bin/g++
RUN ln -s /usr/local/gcc-8.2/bin/gcc /usr/bin/gcc
RUN ln -s /usr/local/gcc-8.2/bin/g++ /usr/bin/g++
ENV PATH=/usr/local/gcc-8.2/bin:$PATH
# install cmake
WORKDIR /home
......@@ -53,75 +58,30 @@ ENV PATH=/home/cmake-3.16.0-Linux-x86_64/bin:$PATH
# Install Python3.6
RUN mkdir -p /root/python_build/ && wget -q https://www.sqlite.org/2018/sqlite-autoconf-3250300.tar.gz && \
tar -zxf sqlite-autoconf-3250300.tar.gz && cd sqlite-autoconf-3250300 && \
./configure -prefix=/usr/local && make -j8 && make install && cd ../ && rm sqlite-autoconf-3250300.tar.gz && \
wget -q https://www.python.org/ftp/python/3.6.0/Python-3.6.0.tgz && \
./configure -prefix=/usr/local && make -j8 && make install && cd ../ && rm sqlite-autoconf-3250300.tar.gz
RUN wget -q https://www.python.org/ftp/python/3.6.0/Python-3.6.0.tgz && \
tar -xzf Python-3.6.0.tgz && cd Python-3.6.0 && \
CFLAGS="-Wformat" ./configure --prefix=/usr/local/ --enable-shared > /dev/null && \
make -j8 > /dev/null && make altinstall > /dev/null && ldconfig
make -j8 > /dev/null && make altinstall > /dev/null && ldconfig && cd .. && rm -rf Python-3.6.0*
# Install Python3.7
RUN wget -q https://www.python.org/ftp/python/3.7.0/Python-3.7.0.tgz && \
tar -xzf Python-3.7.0.tgz && cd Python-3.7.0 && \
CFLAGS="-Wformat" ./configure --prefix=/usr/local/ --enable-shared > /dev/null && \
make -j8 > /dev/null && make altinstall > /dev/null && ldconfig
make -j8 > /dev/null && make altinstall > /dev/null && ldconfig && cd .. && rm -rf Python-3.7.0*
# Install Python3.8
RUN wget -q https://www.python.org/ftp/python/3.8.0/Python-3.8.0.tgz && \
tar -xzf Python-3.8.0.tgz && cd Python-3.8.0 && \
CFLAGS="-Wformat" ./configure --prefix=/usr/local/ --enable-shared > /dev/null && \
make -j8 > /dev/null && make altinstall > /dev/null && ldconfig
make -j8 > /dev/null && make altinstall > /dev/null && ldconfig && cd .. && rm -rf Python-3.8.0*
# Install Python3.5
RUN wget -q https://www.python.org/ftp/python/3.5.1/Python-3.5.1.tgz && \
tar -xzf Python-3.5.1.tgz && cd Python-3.5.1 && \
CFLAGS="-Wformat" ./configure --prefix=/usr/local/python3.5.1 --enable-shared > /dev/null && \
make -j8 > /dev/null && make altinstall > /dev/null && ldconfig
ENV PATH=/usr/local/include/python3.6m/:/usr/local/python3.5.1/include:${PATH}
ENV PATH=/usr/local/bin:/usr/local/python3.5.1/bin:${PATH}
ENV LD_LIBRARY_PATH=/usr/local/lib:/usr/local/python3.5.1/lib:${LD_LIBRARY_PATH}
ENV CPLUS_INCLUDE_PATH=/usr/local/python3.5.1/include/python3.5:/usr/local/include/python3.6m/:$CPLUS_INCLUDE_PATH
RUN ln -sf /usr/local/python3.5.1/bin/python3.5 /usr/local/bin/python3.5 && ln -sf /usr/local/python3.5.1/bin/python3.5 /usr/bin/python3.5 && ln -sf /usr/local/python3.5.1/bin/pip3.5 /usr/local/bin/pip3.5 && ln -sf /usr/local/python3.5.1/bin/pip3.5 /usr/bin/pip3.5 && ln -sf /usr/local/bin/python3.6 /usr/local/bin/python3 && ln -sf /usr/local/bin/python3.6 /usr/bin/python3 && ln -sf /usr/local/bin/pip3.6 /usr/local/bin/pip3 && ln -sf /usr/local/bin/pip3.6 /usr/bin/pip3
ENV LD_LIBRARY_PATH=/usr/local/lib:${LD_LIBRARY_PATH}
RUN ln -sf /usr/local/bin/python3.6 /usr/local/bin/python3 && ln -sf /usr/local/bin/python3.6 /usr/bin/python3 && ln -sf /usr/local/bin/pip3.6 /usr/local/bin/pip3 && ln -sf /usr/local/bin/pip3.6 /usr/bin/pip3
RUN rm -r /root/python_build
# Install Python2.7.15 to replace original python
WORKDIR /home
ENV version=2.7.15
RUN wget https://www.python.org/ftp/python/$version/Python-$version.tgz && tar -xvf Python-$version.tgz
WORKDIR /home/Python-$version
RUN ./configure --enable-unicode=ucs4 --enable-shared CFLAGS=-fPIC --prefix=/usr/local/python2.7.15 && make && make install
RUN echo "export PATH=/usr/local/python2.7.15/include:${PATH}" >> ~/.bashrc && echo "export PATH=/usr/local/python2.7.15/bin:${PATH}" >> ~/.bashrc && echo "export LD_LIBRARY_PATH=/usr/local/python2.7.15/lib:${LD_LIBRARY_PATH}" >> ~/.bashrc && echo "export CPLUS_INCLUDE_PATH=/usr/local/python2.7.15/include/python2.7:$CPLUS_INCLUDE_PATH" >> ~/.bashrc
ENV PATH=/usr/local/python2.7.15/include:${PATH}
ENV PATH=/usr/local/python2.7.15/bin:${PATH}
ENV LD_LIBRARY_PATH=/usr/local/python2.7.15/lib:${LD_LIBRARY_PATH}
ENV CPLUS_INCLUDE_PATH=/usr/local/python2.7.15/include/python2.7:$CPLUS_INCLUDE_PATH
RUN mv /usr/bin/python /usr/bin/python.bak && ln -s /usr/local/python2.7.15/bin/python2.7 /usr/local/bin/python && ln -s /usr/local/python2.7.15/bin/python2.7 /usr/bin/python
WORKDIR /home
RUN wget https://files.pythonhosted.org/packages/b0/d1/8acb42f391cba52e35b131e442e80deffbb8d0676b93261d761b1f0ef8fb/setuptools-40.6.2.zip && apt-get -y install unzip && unzip setuptools-40.6.2.zip
WORKDIR /home/setuptools-40.6.2
RUN python setup.py build && python setup.py install
WORKDIR /home
RUN wget https://files.pythonhosted.org/packages/69/81/52b68d0a4de760a2f1979b0931ba7889202f302072cc7a0d614211bc7579/pip-18.0.tar.gz && tar -zxvf pip-18.0.tar.gz
WORKDIR pip-18.0
RUN python setup.py install && \
python3.8 setup.py install && \
python3.7 setup.py install && \
python3.6 setup.py install
WORKDIR /home
RUN rm Python-$version.tgz setuptools-40.6.2.zip pip-18.0.tar.gz && \
rm -r Python-$version setuptools-40.6.2 pip-18.0
# Remove this once apt-get provides binutils 2.27 or higher
RUN wget -q https://ftp.gnu.org/gnu/binutils/binutils-2.33.1.tar.gz && \
tar -xzf binutils-2.33.1.tar.gz && \
cd binutils-2.33.1 && \
./configure && make -j && make install && cd .. && rm -rf binutils-2.33.1 binutils-2.33.1.tar.gz
# Install Go and glide
RUN wget -qO- https://dl.google.com/go/go1.14.linux-amd64.tar.gz | \
tar -xz -C /usr/local && \
......@@ -132,8 +92,8 @@ RUN wget -qO- https://dl.google.com/go/go1.14.linux-amd64.tar.gz | \
echo "GOPATH=/root/go" >> /root/.bashrc && \
echo "PATH=/usr/local/go/bin:/root/go/bin:$PATH" >> /root/.bashrc
ENV GOROOT=/usr/local/go GOPATH=/root/go
ENV PATH=/usr/local/go/bin:/root/go/bin:$PATH
# Must not be on the same line as the GOROOT definition; otherwise docker build cannot find GOROOT.
ENV PATH=/usr/local/go/bin:/root/go/bin:${PATH}
# Install TensorRT
# The following TensorRT.tar.gz is not the default official one; we make two minor changes:
......@@ -142,9 +102,9 @@ ENV PATH=/usr/local/go/bin:/root/go/bin:$PATH
# 2. Manually add ~IPluginFactory() to the IPluginFactory class of NvInfer.h; otherwise it cannot work in Paddle.
# See https://github.com/PaddlePaddle/Paddle/issues/10129 for details.
# Downgrade TensorRT
COPY tools/dockerfile/build_scripts /build_scripts
RUN bash /build_scripts/install_trt.sh
# Downgrade TensorRT
COPY tools/dockerfiles/build_scripts /build_scripts
RUN bash /build_scripts/install_trt.sh
RUN rm -rf /build_scripts
# git credential to skip password typing
......@@ -162,8 +122,8 @@ RUN wget -q https://paddle-ci.cdn.bcebos.com/patchelf_0.10-2_amd64.deb && \
dpkg -i patchelf_0.10-2_amd64.deb
# Configure OpenSSH server. c.f. https://docs.docker.com/engine/examples/running_ssh_service
#RUN mkdir /var/run/sshd && echo 'root:root' | chpasswd && sed -ri 's/^PermitRootLogin\s+.*/PermitRootLogin yes/' /etc/ssh/sshd_config && sed -ri 's/UsePAM yes/#UsePAM yes/g' /etc/ssh/sshd_config
#CMD source ~/.bashrc
RUN mkdir /var/run/sshd && echo 'root:root' | chpasswd && sed -ri 's/^PermitRootLogin\s+.*/PermitRootLogin yes/' /etc/ssh/sshd_config && sed -ri 's/UsePAM yes/#UsePAM yes/g' /etc/ssh/sshd_config
CMD source ~/.bashrc
# ccache 3.7.9
RUN wget https://paddle-ci.gz.bcebos.com/ccache-3.7.9.tar.gz && \
......@@ -172,11 +132,9 @@ RUN wget https://paddle-ci.gz.bcebos.com/ccache-3.7.9.tar.gz && \
make -j8 && make install && \
ln -s /usr/local/ccache-3.7.9/bin/ccache /usr/local/bin/ccache
RUN ln -s /usr/share/pyshared/lsb_release.py /usr/bin/lsb_release.py
RUN python3.8 -m pip install --upgrade pip requests && \
python3.7 -m pip install --upgrade pip requests && \
python3.6 -m pip install --upgrade pip requests && \
python2.7 -m pip install --upgrade pip requests
python3.6 -m pip install --upgrade pip requests
RUN wget https://paddle-serving.bj.bcebos.com/others/centos_ssl.tar && \
tar xf centos_ssl.tar && rm -rf centos_ssl.tar && \
......
......@@ -20,6 +20,7 @@
set -ex
if [ -f "/etc/redhat-release" ];then
lib_so_3=/usr/lib64/libgfortran.so.3
lib_so_5=/usr/lib64/libgfortran.so.5
lib_so_6=/usr/lib64/libstdc++.so.6
lib_path=/usr/lib64
......@@ -44,4 +45,20 @@ if [ "$1" == "gcc82" ]; then
ln -s /usr/local/gcc-8.2/lib64/libgfortran.so.5 ${lib_so_5} && \
ln -s /usr/local/gcc-8.2/lib64/libstdc++.so.6 ${lib_so_6} && \
cp /usr/local/gcc-8.2/lib64/libstdc++.so.6.0.25 ${lib_path}
elif [ "$1" == "gcc54" ]; then
wget -q https://paddle-ci.gz.bcebos.com/gcc-5.4.0.tar.gz
tar -xvf gcc-5.4.0.tar.gz
cd gcc-5.4.0 && \
unset LIBRARY_PATH CPATH C_INCLUDE_PATH PKG_CONFIG_PATH CPLUS_INCLUDE_PATH INCLUDE && \
./contrib/download_prerequisites && \
cd .. && mkdir temp_gcc54 && cd temp_gcc54 && \
../gcc-5.4.0/configure --prefix=/usr/local/gcc-5.4 --enable-threads=posix --disable-checking --disable-multilib && \
make -j8 && make install
cd .. && rm -rf temp_gcc54
rm -rf gcc-5.4.0 gcc-5.4.0.tar.gz
cp ${lib_so_6} ${lib_so_6}.bak && rm -f ${lib_so_6} &&
ln -s /usr/local/gcc-5.4/lib64/libgfortran.so.3 ${lib_so_3} && \
ln -s /usr/local/gcc-5.4/lib64/libstdc++.so.6 ${lib_so_6} && \
cp /usr/local/gcc-5.4/lib64/libstdc++.so.6.0.21 ${lib_path}
fi
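The new `gcc54` branch is selected the same way as the existing `gcc82` one, by passing the version tag as the first argument (a sketch; the `/build_scripts` path matches the `COPY tools/dockerfiles/build_scripts /build_scripts` step in the Dockerfile above):
```
# choose the toolchain via the first positional argument
bash /build_scripts/install_gcc.sh gcc82   # existing branch
bash /build_scripts/install_gcc.sh gcc54   # branch added in this change
```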
......@@ -37,4 +37,13 @@ elif [[ "$VERSION" == "3.7" ]];then
make -j8 > /dev/null && make altinstall > /dev/null && ldconfig
cd .. && rm -rf Python-3.7.0*
python3.7 -m pip install -U pip
elif [[ "$VERSION" == "3.8" ]];then
wget -q https://www.python.org/ftp/python/3.8.0/Python-3.8.0.tgz && \
tar -xzf Python-3.8.0.tgz && cd Python-3.8.0 && \
CFLAGS="-Wformat" ./configure --prefix=/usr/local/ --enable-shared > /dev/null && \
make -j8 > /dev/null && make altinstall > /dev/null && ldconfig
cd .. && rm -rf Python-3.8.0*
python3.8 -m pip install -U pip
fi
......@@ -18,37 +18,74 @@ SERVING_VERSION=$1
PADDLE_VERSION=$2
RUN_ENV=$3 # cpu / cuda10.1 / cuda10.2
PYTHON_VERSION=$4
serving_release=
client_release="paddle-serving-client==$SERVING_VERSION"
app_release="paddle-serving-app==0.3.1"
if [[ $PYTHON_VERSION == "3.6" ]];then
CPYTHON="36"
elif [[ $PYTHON_VERSION == "3.7" ]];then
CPYTHON="37"
elif [[ $PYTHON_VERSION == "3.8" ]];then
CPYTHON="38"
fi
if [[ $SERVING_VERSION == "0.5.0" ]]; then
if [[ "$RUN_ENV" == "cpu" ]];then
server_release="paddle-serving-server==$SERVING_VERSION"
serving_bin="https://paddle-serving.bj.bcebos.com/bin/serving-cpu-noavx-openblas-${SERVING_VERSION}.tar.gz"
elif [[ "$RUN_ENV" == "cuda10.1" ]];then
server_release="paddle-serving-server-gpu==$SERVING_VERSION.post101"
serving_bin="https://paddle-serving.bj.bcebos.com/bin/serving-gpu-101-${SERVING_VERSION}.tar.gz"
elif [[ "$RUN_ENV" == "cuda10.2" ]];then
server_release="paddle-serving-server-gpu==$SERVING_VERSION.post102"
serving_bin="https://paddle-serving.bj.bcebos.com/bin/serving-gpu-102-${SERVING_VERSION}.tar.gz"
fi
client_release="paddle-serving-client==$SERVING_VERSION"
app_release="paddle-serving-app==0.3.1"
elif [[ $SERVING_VERSION == "0.6.0" ]]; then
if [[ "$RUN_ENV" == "cpu" ]];then
server_release="https://paddle-serving.bj.bcebos.com/test-dev/whl/paddle_serving_server-$SERVING_VERSION-py3-none-any.whl"
serving_bin="https://paddle-serving.bj.bcebos.com/test-dev/bin/serving-cpu-noavx-openblas-$SERVING_VERSION.tar.gz"
elif [[ "$RUN_ENV" == "cuda10.1" ]];then
server_release="https://paddle-serving.bj.bcebos.com/test-dev/whl/paddle_serving_server_gpu-$SERVING_VERSION.post101-py3-none-any.whl"
serving_bin="https://paddle-serving.bj.bcebos.com/test-dev/bin/serving-gpu-101-$SERVING_VERSION.tar.gz"
elif [[ "$RUN_ENV" == "cuda10.2" ]];then
server_release="https://paddle-serving.bj.bcebos.com/test-dev/whl/paddle_serving_server_gpu-$SERVING_VERSION.post102-py3-none-any.whl"
serving_bin="https://paddle-serving.bj.bcebos.com/test-dev/bin/serving-gpu-102-$SERVING_VERSION.tar.gz"
fi
client_release="https://paddle-serving.bj.bcebos.com/test-dev/whl/paddle_serving_client-$SERVING_VERSION-cp$CPYTHON-none-any.whl"
app_release="https://paddle-serving.bj.bcebos.com/test-dev/whl/paddle_serving_app-$SERVING_VERSION-py3-none-any.whl"
fi
if [[ "$RUN_ENV" == "cpu" ]];then
server_release="paddle-serving-server==$SERVING_VERSION"
python$PYTHON_VERSION -m pip install $client_release $app_release $server_release
python$PYTHON_VERSION -m pip install paddlepaddle==${PADDLE_VERSION}
cd /usr/local/
wget https://paddle-serving.bj.bcebos.com/bin/serving-cpu-noavx-openblas-${SERVING_VERSION}.tar.gz
wget $serving_bin
tar xf serving-cpu-noavx-openblas-${SERVING_VERSION}.tar.gz
echo "export SERVING_BIN=$PWD/serving-cpu-noavx-openblas-${SERVING_VERSION}/serving">>/root/.bashrc
mv $PWD/serving-cpu-noavx-openblas-${SERVING_VERSION} $PWD/serving_bin
echo "export SERVING_BIN=$PWD/serving_bin/serving">>/root/.bashrc
rm -rf serving-cpu-noavx-openblas-${SERVING_VERSION}.tar.gz
cd -
elif [[ "$RUN_ENV" == "cuda10.1" ]];then
server_release="paddle-serving-server-gpu==$SERVING_VERSION.post101"
python$PYTHON_VERSION -m pip install $client_release $app_release $server_release
python$PYTHON_VERSION -m pip install paddlepaddle-gpu==${PADDLE_VERSION}
cd /usr/local/
wget https://paddle-serving.bj.bcebos.com/bin/serving-gpu-101-${SERVING_VERSION}.tar.gz
wget $serving_bin
tar xf serving-gpu-101-${SERVING_VERSION}.tar.gz
echo "export SERVING_BIN=$PWD/serving-gpu-101-${SERVING_VERSION}/serving">>/root/.bashrc
mv $PWD/serving-gpu-101-${SERVING_VERSION} $PWD/serving_bin
echo "export SERVING_BIN=$PWD/serving_bin/serving">>/root/.bashrc
rm -rf serving-gpu-101-${SERVING_VERSION}.tar.gz
cd -
elif [[ "$RUN_ENV" == "cuda10.2" ]];then
server_release="paddle-serving-server-gpu==$SERVING_VERSION.post102"
python$PYTHON_VERSION -m pip install $client_release $app_release $server_release
python$PYTHON_VERSION -m pip install paddlepaddle-gpu==${PADDLE_VERSION}
cd /usr/local/
wget https://paddle-serving.bj.bcebos.com/bin/serving-gpu-102-${SERVING_VERSION}.tar.gz
wget $serving_bin
tar xf serving-gpu-102-${SERVING_VERSION}.tar.gz
echo "export SERVING_BIN=$PWD/serving-gpu-102-${SERVING_VERSION}/serving">>/root/.bashrc
mv $PWD/serving-gpu-102-${SERVING_VERSION} $PWD/serving_bin
echo "export SERVING_BIN=$PWD/serving_bin/serving">>/root/.bashrc
rm -rf serving-gpu-102-${SERVING_VERSION}.tar.gz
cd -
fi
......
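For reference, a hedged sketch of how this installer is invoked; the script file name below is a placeholder (it is not shown in this diff), while the four positional arguments follow `SERVING_VERSION=$1`, `PADDLE_VERSION=$2`, `RUN_ENV=$3`, `PYTHON_VERSION=$4` above:
```
# placeholder script name; arguments: serving version, paddle version, run env, python version
bash install_serving_whl.sh 0.6.0 2.0.1 cuda10.2 3.8
```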
......@@ -8,7 +8,7 @@ function usage
echo "usage: sh tools/generate_runtime_docker.sh --SOME_ARG ARG_VALUE"
echo " ";
echo " --env : running env, cpu/cuda10.1/cuda10.2/cuda11";
echo " --python : python version, 2.7/3.6/3.7 ";
echo " --python : python version, 3.6/3.7/3.8 ";
echo " --serving : serving version(0.5.0)";
echo " --paddle : paddle version(2.0.1)"
echo " --image_name : image name(default serving_runtime:env-python)"
......@@ -73,8 +73,8 @@ function run
echo "named arg: paddle: $paddle"
echo "named arg: image_name: $image_name"
sed -e "s/<<base_image>>/$base_image/g" -e "s/<<python_version>>/$python/g" -e "s/<<run_env>>/$env/g" tools/Dockerfile.runtime_template > Dockerfile.tmp
docker build --build-arg ftp_proxy=http://172.19.57.45:3128 --build-arg https_proxy=http://172.19.57.45:3128 --build-arg http_proxy=http://172.19.57.45:3128 --build-arg HTTP_PROXY=http://172.19.57.45:3128 --build-arg HTTPS_PROXY=http://172.19.57.45:3128 -t $image_name -f Dockerfile.tmp .
sed -e "s/<<base_image>>/$base_image/g" -e "s/<<python_version>>/$python/g" -e "s/<<run_env>>/$env/g" -e "s/<<serving_version>>/$serving/g" -e "s/<<paddle_version>>/$paddle/g" tools/Dockerfile.runtime_template > Dockerfile.tmp
docker build --network=host --build-arg ftp_proxy=http://172.19.57.45:3128 --build-arg https_proxy=http://172.19.57.45:3128 --build-arg http_proxy=http://172.19.57.45:3128 --build-arg HTTP_PROXY=http://172.19.57.45:3128 --build-arg HTTPS_PROXY=http://172.19.57.45:3128 -t $image_name -f Dockerfile.tmp .
}
run "$@";