From 81c5f7ed8720adfe1525866a3f335b3f8e7d56fa Mon Sep 17 00:00:00 2001
From: zhouwei25 <52485244+zhouwei25@users.noreply.github.com>
Date: Mon, 28 Oct 2019 12:05:42 +0800
Subject: [PATCH] fix inference lib package on windows (#1558)

Fix the cmake command. Remove snappy and snappystream from the
third-party library listing.
---
 .../deploy/inference/windows_cpp_inference.md    | 8 +++-----
 .../deploy/inference/windows_cpp_inference_en.md | 8 +++-----
 2 files changed, 6 insertions(+), 10 deletions(-)

diff --git a/doc/fluid/advanced_usage/deploy/inference/windows_cpp_inference.md b/doc/fluid/advanced_usage/deploy/inference/windows_cpp_inference.md
index 4def40bf6..30ca11144 100755
--- a/doc/fluid/advanced_usage/deploy/inference/windows_cpp_inference.md
+++ b/doc/fluid/advanced_usage/deploy/inference/windows_cpp_inference.md
@@ -43,11 +43,11 @@ Steps to install and build the inference library on Windows (run in the Windows
 
     cd build
 
-    cmake .. -G "Visual Studio 14 2015 Win64" -DCMAKE_BUILD_TYPE=Release -DWITH_MKL=OFF -DWITH_GPU=OFF -DON_INFER=ON -DWITH_PYTHON=OFF
+    cmake .. -G "Visual Studio 14 2015" -A x64 -DCMAKE_BUILD_TYPE=Release -DWITH_MKL=OFF -DWITH_GPU=OFF -DON_INFER=ON -DWITH_PYTHON=OFF
     # -DWITH_GPU toggles GPU support and -DWITH_MKL toggles Intel MKL (Math Kernel Library); set them as needed.
 
     # Windows builds with /MT by default; to build with /MD, use the command below. If you are unsure of the difference between the two, use the command above.
-    cmake .. -G "Visual Studio 14 2015 Win64" -DCMAKE_BUILD_TYPE=Release -DWITH_MKL=OFF -DWITH_GPU=OFF -DON_INFER=ON -DWITH_PYTHON=OFF -DMSVC_STATIC_CRT=OFF
+    cmake .. -G "Visual Studio 14 2015" -A x64 -DCMAKE_BUILD_TYPE=Release -DWITH_MKL=OFF -DWITH_GPU=OFF -DON_INFER=ON -DWITH_PYTHON=OFF -DMSVC_STATIC_CRT=OFF
     ```
 
 3. Open `paddle.sln` with Blend for Visual Studio 2015, set the platform to `x64` and the configuration to `Release`, then build the inference_lib_dist project.
@@ -82,8 +82,6 @@
      │   ├── mkldnn
      │   ├── mklml
      │   ├── protobuf
-     │   ├── snappy
-     │   ├── snappystream
      │   ├── xxhash
      │   └── zlib
      └── version.txt
@@ -130,7 +128,7 @@ version.txt records the version information of this inference library, including the Git Commit ID and the
 Enter the Paddle/paddle/fluid/inference/api/demo_ci directory, create a build directory and enter it, then use cmake to generate the vs2015 solution file.
 The command is:
-`cmake .. -G "Visual Studio 14 2015 Win64" -DWITH_GPU=OFF -DWITH_MKL=ON -DWITH_STATIC_LIB=ON -DCMAKE_BUILD_TYPE=Release -DDEMO_NAME=simple_on_word2vec -DPADDLE_LIB=path_to_the_paddle_lib`
+`cmake .. -G "Visual Studio 14 2015" -A x64 -DWITH_GPU=OFF -DWITH_MKL=ON -DWITH_STATIC_LIB=ON -DCMAKE_BUILD_TYPE=Release -DDEMO_NAME=simple_on_word2vec -DPADDLE_LIB=path_to_the_paddle_lib`
 
 Note:
 
diff --git a/doc/fluid/advanced_usage/deploy/inference/windows_cpp_inference_en.md b/doc/fluid/advanced_usage/deploy/inference/windows_cpp_inference_en.md
index b47edf2b7..f6ae5fd18 100755
--- a/doc/fluid/advanced_usage/deploy/inference/windows_cpp_inference_en.md
+++ b/doc/fluid/advanced_usage/deploy/inference/windows_cpp_inference_en.md
@@ -42,13 +42,13 @@ Users can also compile C++ inference libraries from the PaddlePaddle core code b
     # change to the build directory
     cd build
 
-    cmake .. -G "Visual Studio 14 2015 Win64" -DCMAKE_BUILD_TYPE=Release -DWITH_MKL=OFF -DWITH_GPU=OFF -DON_INFER=ON -DWITH_PYTHON=OFF
+    cmake .. -G "Visual Studio 14 2015" -A x64 -DCMAKE_BUILD_TYPE=Release -DWITH_MKL=OFF -DWITH_GPU=OFF -DON_INFER=ON -DWITH_PYTHON=OFF
     # use -DWITH_GPU to choose between the CPU and the GPU build
     # use -DWITH_MKL to select the math library: Intel MKL or OpenBLAS
 
     # By default Windows builds with /MT for the C runtime library; if you want to use /MD, use the command below.
     # If you are unsure of the difference between the two, use the command above.
-    cmake .. -G "Visual Studio 14 2015 Win64" -DCMAKE_BUILD_TYPE=Release -DWITH_MKL=OFF -DWITH_GPU=OFF -DON_INFER=ON -DWITH_PYTHON=OFF -DMSVC_STATIC_CRT=OFF
+    cmake .. -G "Visual Studio 14 2015" -A x64 -DCMAKE_BUILD_TYPE=Release -DWITH_MKL=OFF -DWITH_GPU=OFF -DON_INFER=ON -DWITH_PYTHON=OFF -DMSVC_STATIC_CRT=OFF
     ```
 
 3. Open `paddle.sln` with Visual Studio 2015, choose `x64` for Solution Platforms and `Release` for Solution Configurations, then build the `inference_lib_dist` project in the Solution Explorer (right-click the project and click Build).
@@ -80,8 +80,6 @@ The inference library will be installed in `fluid_inference_install_dir`:
      │   ├── mkldnn
      │   ├── mklml
      │   ├── protobuf
-     │   ├── snappy
-     │   ├── snappystream
      │   ├── xxhash
      │   └── zlib
      └── version.txt
@@ -129,7 +127,7 @@ Decompress Paddle, Release and fluid_install_dir compressed package.
 First enter Paddle/paddle/fluid/inference/api/demo_ci, then create and enter a build directory, and finally use cmake to generate the vs2015 solution file.
 Commands are as follows:
-`cmake .. -G "Visual Studio 14 2015 Win64" -DWITH_GPU=OFF -DWITH_MKL=OFF -DWITH_STATIC_LIB=ON -DCMAKE_BUILD_TYPE=Release -DDEMO_NAME=simple_on_word2vec -DPADDLE_LIB=path_to_the_paddle\paddle_fluid.lib`
+`cmake .. -G "Visual Studio 14 2015" -A x64 -DWITH_GPU=OFF -DWITH_MKL=OFF -DWITH_STATIC_LIB=ON -DCMAKE_BUILD_TYPE=Release -DDEMO_NAME=simple_on_word2vec -DPADDLE_LIB=path_to_the_paddle\paddle_fluid.lib`
 
 Note: