diff --git a/docs/demo_guides/cpp_demo.md b/docs/demo_guides/cpp_demo.md
index 55abd3a70fe23dd0e8798d6a772ee216140c2875..5f3a2757b21cffb90ebd214ea6d9525dc3fb6dbd 100644
--- a/docs/demo_guides/cpp_demo.md
+++ b/docs/demo_guides/cpp_demo.md
@@ -32,14 +32,26 @@ tar zxf mobilenet_v1.tar.gz
 
 ![image](https://paddlelite-data.bj.bcebos.com/doc_images/cxx_demo/3inference_model.png)
 
-(2) Download the [opt tool](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt), place it in the same folder, and convert the model from a terminal:
+(2) Model conversion
 
-```shell
-wget https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt
-chmod +x opt
-./opt --model_dir=./mobilenet_v1 --optimize_out_type=naive_buffer --optimize_out=./mobilenet_v1_opt
-```
+ - Before v2.6.0
+
+   Download the [opt tool](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt), place it in the same folder, and convert the model from a terminal:
+
+   ```shell
+   wget https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt
+   chmod +x opt
+   ./opt --model_dir=./mobilenet_v1 --optimize_out_type=naive_buffer --optimize_out=./mobilenet_v1_opt
+   ```
+ - v2.6.0 and later
+
+   Install paddlelite and convert the model from a terminal:
+
+   ```shell
+   python -m pip install paddlelite
+   paddle_lite_opt --model_dir=./mobilenet_v1 --optimize_out_type=naive_buffer --optimize_out=./mobilenet_v1_opt
+   ```
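+
+   The same conversion can also be scripted through the `Opt` class (a minimal
+   Python sketch based on the converter script in the Python demo guide; it
+   assumes the class defaults match the `paddle_lite_opt` command above, `arm`
+   being opt's default target):
+
+   ```python
+   import paddlelite.lite as lite
+
+   opt = lite.Opt()
+   opt.set_model_dir("./mobilenet_v1")       # non-combined model directory
+   opt.set_optimize_out("mobilenet_v1_opt")  # writes mobilenet_v1_opt.nb
+   opt.set_valid_places("arm")
+   opt.run()
+   ```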
 
 **The result is shown in the figure below:**
 
 ![image](https://paddlelite-data.bj.bcebos.com/doc_images/cxx_demo/2opt_model.png)
diff --git a/docs/demo_guides/python_demo.md b/docs/demo_guides/python_demo.md
new file mode 100644
index 0000000000000000000000000000000000000000..4370a6fbbfb95e68b131cc4bb1a7d51877938655
--- /dev/null
+++ b/docs/demo_guides/python_demo.md
@@ -0,0 +1,111 @@
+# Python Demo
+
+## 1. Install the latest Python inference library
+
+```shell
+python -m pip install paddlelite
+```
+
+## 2. Convert the model
+
+Native PaddlePaddle models must be converted by the [opt]() tool into the naive_buffer format that Paddle-Lite supports.
+
+Taking the `mobilenet_v1` model as an example:
+
+(1) Download the [mobilenet_v1 model](http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz) and extract it:
+
+```shell
+wget http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz
+tar zxf mobilenet_v1.tar.gz
+```
+
+(2) Use the opt tool:
+
+   When loading a model from disk, the model and parameter paths take one of two forms, depending on how the model and parameter files are stored.
+
+- Linux
+  - Non-combined form: the model directory model_dir holds one model file and multiple parameter files; pass the directory path. The model file name defaults to __model__.
+
+    ```shell
+    paddle_lite_opt --model_dir=./mobilenet_v1 \
+                    --optimize_out=mobilenet_v1_opt \
+                    --optimize_out_type=naive_buffer \
+                    --valid_targets=x86
+    ```
+  - Combined form: the model directory model_dir holds exactly one model file __model__ and one parameter file __params__; pass both file paths.
+
+    ```shell
+    paddle_lite_opt --model_file=./mobilenet_v1/__model__ \
+                    --param_file=./mobilenet_v1/__params__ \
+                    --optimize_out=mobilenet_v1_opt \
+                    --optimize_out_type=naive_buffer \
+                    --valid_targets=x86
+    ```
+
+- Windows
+
+Windows does not yet support running the model converter directly from the command line; write a Python script instead:
+
+```python
+import paddlelite.lite as lite
+
+a = lite.Opt()
+# non-combined form
+a.set_model_dir("D:\\YOU_MODEL_PATH\\mobilenet_v1")
+
+# combined form
+# a.set_model_file("D:\\YOU_MODEL_PATH\\mobilenet_v1\\__model__")
+# a.set_param_file("D:\\YOU_MODEL_PATH\\mobilenet_v1\\__params__")
+
+a.set_optimize_out("mobilenet_v1_opt")
+a.set_valid_places("x86")
+
+a.run()
+```
+
+- macOS
+
+The opt tool is used the same way as on Linux. (Python inference is not yet supported on macOS; this will be fixed in the next release.)
+
+## 3. Write the inference program
+
+With the inference library and the model ready, we can write a program to run inference. Python demos covering scenarios such as image classification and object detection are provided for reference. Create a file mobilenetV1_light_api.py;
+the complete Python demo code is at [demo/python](https://github.com/PaddlePaddle/Paddle-Lite/blob/develop/lite/demo/python/mobilenetv1_light_api.py).
+
+(1) Set up the config
+```python
+from paddlelite.lite import *
+
+config = MobileConfig()
+config.set_model_dir("/YOU_MODEL_PATH/mobilenet_v1_opt.nb")
+```
+
+(2) Create the predictor
+
+```python
+predictor = create_paddle_predictor(config)
+```
+
+(3) Set the input data
+```python
+input_tensor = predictor.get_input(0)
+input_tensor.resize([1, 3, 224, 224])
+input_tensor.set_float_data([1.] * 3 * 224 * 224)
+```
+
+(4) Run inference
+```python
+predictor.run()
+```
+
+(5) Fetch the output data
+```python
+output_tensor = predictor.get_output(0)
+print(output_tensor.shape())
+print(output_tensor.float_data()[:10])
+```
+
+## 4. Run the script
+```shell
+python mobilenetV1_light_api.py
+```
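+
+For reference, the full-API variant used by `mobilenetv1_full_api.py` loads the
+original (un-optimized) Paddle model instead of the `.nb` file. A minimal sketch,
+assuming the `CxxConfig`, `Place`, `TargetType`, and `PrecisionType` bindings
+exported by `paddlelite.lite`:
+
+```python
+from paddlelite.lite import *
+
+config = CxxConfig()
+config.set_model_dir("./mobilenet_v1")  # original non-combined model
+config.set_valid_places([Place(TargetType.X86, PrecisionType.FP32)])
+predictor = create_paddle_predictor(config)
+```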
diff --git a/docs/demo_guides/x86.md b/docs/demo_guides/x86.md
index 9d31aab05b31df8f96caa1cb70b302cd02f879ff..c910a65907bc6c21ce656c4982f96e2ab30b3f99 100644
--- a/docs/demo_guides/x86.md
+++ b/docs/demo_guides/x86.md
@@ -4,8 +4,6 @@
 
 Paddle-Lite supports building the x86 inference library in a Docker or Linux environment. See [Environment Preparation](../user_guides/source_compile) for setting up the environment.
 
-(Note: a non-Docker Linux environment must be Ubuntu 16.04.)
-
 ### Build
 
 1. Download the code
 
@@ -20,10 +18,11 @@ git checkout release/v2.6.0
 
 ```bash
 cd Paddle-Lite
-./lite/tools/build.sh x86
+./lite/tools/build.sh --build_python=ON x86
 
 # other optional build flags
 # --with_log=OFF      disable LOG output
+# --build_python=OFF  do not build the Python inference library
 ```
 
 ### Build results explained
@@ -53,8 +52,17 @@ The x86 build output is located at `build.lite.x86/inference_lite_lib`
 
 - `mobilenetv1_full`: C++ demo running mobilenet_v1 inference with the full_api
 - `mobilenetv1_light`: C++ demo running mobilenet_v1 inference with the light_api
 
+5. `demo/python` folder: Python demos for the x86 inference library
+
+- `mobilenetv1_full_api.py`: Python demo running mobilenet_v1 inference with the full_api
+- `mobilenetv1_light_api.py`: Python demo running mobilenet_v1 inference with the light_api
+
+6. `python` folder: the Python library files and the corresponding .whl package
+
+- `install` folder: the built .whl package is at `install/dist/*.whl`
+- `lib` folder: library files the .whl package depends on
+
+**(If you do not need the Python inference library, use `./lite/tools/build.sh x86` as the build command instead.)**
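+
+The built wheel can then be installed with pip (a usage sketch; the exact file
+name under `install/dist/` depends on the Python version and platform):
+
+```shell
+python -m pip install build.lite.x86/inference_lite_lib/python/install/dist/*.whl
+```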
 
 ### x86 inference API usage example
 
@@ -64,7 +72,8 @@
 mobilenetv1_full/
 |-- CMakeLists.txt
 |-- build.sh
-`-- mobilenet_full_api.cc
+|-- build.bat
+`-- mobilenet_full_api.cc
 ```
 
 This demo is built with CMake: `CMakeLists.txt` is the CMake script, `mobilenet_full_api.cc` is the x86 demo source code, and `build.sh` is the build script.
 
@@ -168,8 +177,8 @@ int main(int argc, char** argv) {
 
 #### Build environment requirements
 
 - Windows 10 Professional
-  - GPU mode is not yet supported on Windows
+  - GPU builds are not yet supported on Windows
-- *Python version 2.7/3.5.1+/3.6/3.7 (64 bit)*
+- *Python version 2.7/3.5.1+ (64 bit)*
 - *pip or pip3 version 9.0.1+ (64 bit)*
 - *Visual Studio 2015 Update 3*
@@ -187,15 +196,15 @@
 
 ```bash
 git clone https://github.com/PaddlePaddle/Paddle-Lite.git
 # switch to the release branch
-git checkout release/v2.3
+git checkout release/v2.6.0
 ```
-2. Build from source
+2. Build from source (enter the corresponding parameters when prompted)
 
-```bash
+```dos
 cd Paddle-Lite
-lite/tools/build_windows.bat with_extra with_python with_profile
+lite\tools\build_windows.bat with_extra with_python with_profile
 ```
 
-The build script `lite/tools/build.bat` accepts the following extra parameters:
+The build script `build_windows.bat` accepts the following extra parameters:
 
 | Parameter | Description | Values |
 |-----------|-------------|-------------|
@@ -203,40 +212,62 @@
 | with_python | Optional; whether to build the Python inference library (default OFF). | `ON`, `OFF` |
 | with_profile | Optional; whether to enable the profiler (default OFF). | `ON`, `OFF` |
 
-### Build results
+### Build results explained
 
 The x86 build output is located at `build.lite.x86/inference_lite_lib`
 
 **Contents**:
 
-1. `bin` folder: the executable tool `test_model_bin`
-
-2. `cxx` folder: C++ library files and the corresponding headers
+1. `cxx` folder: C++ library files and the corresponding headers
 
 - `include`: header files
 - `lib`: library files
-  - bundled static libraries:
+  - static libraries:
     - `libpaddle_api_full_bundled.lib`: full_api static library
    - `libpaddle_api_light_bundled.lib`: light_api static library
 
-3. `third_party` folder: third-party library files
+2. `third_party` folder: the third-party dependency mklml
+
+- mklml: the mklml math library that the Paddle-Lite inference library depends on
+
+3. `demo/cxx` folder: C++ demos for the x86 inference library
+
+- `mobilenetv1_full`: C++ demo running mobilenet_v1 inference with the full_api
+- `mobilenetv1_light`: C++ demo running mobilenet_v1 inference with the light_api
+
+4. `demo/python` folder: Python demos for the x86 inference library
+
+- `mobilenetv1_full_api.py`: Python demo running mobilenet_v1 inference with the full_api
+- `mobilenetv1_light_api.py`: Python demo running mobilenet_v1 inference with the light_api
+
+5. `python` folder: the Python library files and the corresponding .whl package
+
+- `install` folder: the built .whl package is at `install/dist/*.whl`
+- `lib` folder: library files the .whl package depends on
 
 ### x86 inference API usage example
 
-1. We provide a Windows example running mobilenet_v1 with the x86 API: [mobilenet_full_x86demo](https://paddlelite-data.bj.bcebos.com/x86/mobilenet_full_x86demo.zip). After downloading and extracting it, the contents are as follows:
-
-![](https://paddlelite-data.bj.bcebos.com/x86/x86-doc/demo.png)
-
-`mobilenet_v1` is the model, `lib` and `include` are the Paddle-Lite inference library and headers, `third_party` holds the third-party build dependency `mklml`, `mobilenet_full_api.cc` is the x86 demo source code, and `build.bat` is the build script.
+1. `mobilenetv1_full` directory structure
 
+```bash
+mobilenetv1_full/
+|-- CMakeLists.txt
+|-- build.sh
+|-- build.bat
+`-- mobilenet_full_api.cc
+```
 
+This demo is built with CMake: `CMakeLists.txt` is the CMake script, `mobilenet_full_api.cc` is the x86 demo source code, `build.sh` is the Linux x86 build script, and `build.bat` is the Windows x86 build script.
 
-2. Demo contents and usage
+2. How to use the demo
 
 ``` bash
-# 1. Build (run this script from the VS2015 command window)
+# 1. Build
+cd mobilenetv1_full
 build.bat
+cd build
 ```
 
-The build result is `Release\\mobilenet_full_api.exe` in the current directory.
-``` bash
+The build result is `Release\mobilenet_full_api.exe` in the current directory.
+``` dos
 # 2. Run inference
-Release\\mobilenet_full_api.exe ..\mobilenet_v1
+Release\mobilenet_full_api.exe mobilenet_v1
 ```
 
-`mobilenet_v1` is the model path, and `mobilenet_full_api.exe` is the executable built in step 1.
+Download and extract the [`mobilenet_v1`](http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz) model into the current `build` directory, then run the command above to run inference.
diff --git a/docs/index.rst b/docs/index.rst
index 120af007df4232cfad5c0ff8b61b3aa90458555c..37a5b7e2652d9e4bb34406e28b093debe31e6fbf 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -47,6 +47,7 @@ Welcome to Paddle-Lite's documentation!
 
    demo_guides/cpp_demo
    demo_guides/java_demo
+   demo_guides/python_demo
    demo_guides/android_app_demo
    demo_guides/ios_app_demo
    demo_guides/x86
diff --git a/docs/user_guides/source_compile.md b/docs/user_guides/source_compile.md
index 00c7329d84316fc6feb603a84e44b67ff67e1959..87639a99a81efbfbcc9c35c41ac530eaae1c6718 100644
--- a/docs/user_guides/source_compile.md
+++ b/docs/user_guides/source_compile.md
@@ -10,11 +10,12 @@ PaddleLite provides the one-step mobile source build script `lite/tools/build.sh`
 
 ## 1. Environment preparation
 
-Three build environments are currently supported:
+Four build environments are currently supported:
 
 1. a Docker container,
 2. Linux (Ubuntu 16.04 recommended),
-3. Mac OS.
+3. Mac OS,
+4. [Windows](../demo_guides/x86.html#windows)
 
 ### 1. The Docker development environment
diff --git a/lite/CMakeLists.txt b/lite/CMakeLists.txt
index 2734a684702413d39f3d746d4f995d41122c9f39..ff4d00dbb1051320f817c8220a11a77edde7fb05 100644
--- a/lite/CMakeLists.txt
+++ b/lite/CMakeLists.txt
@@ -224,11 +224,11 @@ if (LITE_WITH_X86)
     add_dependencies(publish_inference publish_inference_x86_cxx_lib)
 
     add_custom_target(publish_inference_x86_cxx_demos ${TARGET}
-      COMMAND ${CMAKE_COMMAND} -E make_directory "${INFER_LITE_PUBLISH_ROOT}/third_party"
-      COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_BINARY_DIR}/third_party/install" "${INFER_LITE_PUBLISH_ROOT}/third_party"
-      COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_BINARY_DIR}/third_party/eigen3" "${INFER_LITE_PUBLISH_ROOT}/third_party"
+      COMMAND ${CMAKE_COMMAND} -E make_directory "${INFER_LITE_PUBLISH_ROOT}/third_party/mklml"
+      COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_BINARY_DIR}/third_party/install/mklml" "${INFER_LITE_PUBLISH_ROOT}/third_party/mklml"
       COMMAND ${CMAKE_COMMAND} -E make_directory "${INFER_LITE_PUBLISH_ROOT}/demo/cxx"
-      COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_SOURCE_DIR}/lite/demo/cxx" "${INFER_LITE_PUBLISH_ROOT}/demo/cxx"
+      COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_SOURCE_DIR}/lite/demo/cxx/x86_mobilenetv1_light_demo" "${INFER_LITE_PUBLISH_ROOT}/demo/cxx/mobilenetv1_light"
+      COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_SOURCE_DIR}/lite/demo/cxx/x86_mobilenetv1_full_demo" "${INFER_LITE_PUBLISH_ROOT}/demo/cxx/mobilenetv1_full"
     )
     add_dependencies(publish_inference_x86_cxx_lib publish_inference_x86_cxx_demos)
     add_dependencies(publish_inference_x86_cxx_demos paddle_api_full_bundled eigen3)
diff --git a/lite/demo/cxx/x86_mobilenetv1_full_demo/CMakeLists.txt b/lite/demo/cxx/x86_mobilenetv1_full_demo/CMakeLists.txt
index a4b8497ebb30630b91d0eee9ebde389ae10f0e2c..8aef18c1f92c84d0e4fd9f96f79c32fa8e2b1285 100644
--- a/lite/demo/cxx/x86_mobilenetv1_full_demo/CMakeLists.txt
+++ b/lite/demo/cxx/x86_mobilenetv1_full_demo/CMakeLists.txt
@@ -6,16 +6,44 @@ set(TARGET mobilenet_full_api)
 set(LITE_DIR "${PROJECT_SOURCE_DIR}/../../../cxx")
 set(MKLML_DIR "${PROJECT_SOURCE_DIR}/../../../third_party/mklml/")
 
+if (WIN32)
+  add_definitions("/DGOOGLE_GLOG_DLL_DECL=")
+  option(MSVC_STATIC_CRT "use static C Runtime library by default" ON)
+  if (MSVC_STATIC_CRT)
+    set(CMAKE_C_FLAGS_DEBUG     "${CMAKE_C_FLAGS_DEBUG} /bigobj /MTd")
+    set(CMAKE_C_FLAGS_RELEASE   "${CMAKE_C_FLAGS_RELEASE} /bigobj /MT")
+    set(CMAKE_CXX_FLAGS_DEBUG   "${CMAKE_CXX_FLAGS_DEBUG} /bigobj /MTd")
+    set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} /bigobj /MT")
+  endif()
+endif()
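+
+# Note: /MT(d) links the static C runtime. It must match the runtime used when
+# the Paddle-Lite libraries were built, otherwise linking fails with
+# runtime-mismatch errors; turn MSVC_STATIC_CRT off to keep the default /MD.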
 
 # 2. link mklml and Paddle-Lite directory
 link_directories(${LITE_DIR}/lib ${MKLML_DIR}/lib)
 include_directories(${LITE_DIR}/include/ ${MKLML_DIR}/include)
 
 # 3. compile options
-add_definitions(-std=c++11 -g -O3 -pthread)
-set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR})
+if (NOT WIN32)
+  add_definitions(-std=c++11 -g -O3 -pthread)
+  set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR})
+endif()
 
 # 4. add executable output
 add_executable(${TARGET} ${TARGET}.cc)
-target_link_libraries(${TARGET} -lpaddle_full_api_shared)
-target_link_libraries(${TARGET} -liomp5)
-target_link_libraries(${TARGET} -ldl)
+if (WIN32)
+  set(MATH_LIB ${MKLML_DIR}/lib/mklml${CMAKE_STATIC_LIBRARY_SUFFIX}
+               ${MKLML_DIR}/lib/libiomp5md${CMAKE_STATIC_LIBRARY_SUFFIX})
+
+  target_link_libraries(${TARGET} libpaddle_api_full_bundled.lib)
+  target_link_libraries(${TARGET} shlwapi.lib)
+  target_link_libraries(${TARGET} ${MATH_LIB})
+
+  add_custom_command(TARGET ${TARGET} POST_BUILD
+    COMMAND ${CMAKE_COMMAND} -E copy ${MKLML_DIR}/lib/mklml.dll ${CMAKE_BINARY_DIR}/Release
+    COMMAND ${CMAKE_COMMAND} -E copy ${MKLML_DIR}/lib/libiomp5md.dll ${CMAKE_BINARY_DIR}/Release
+  )
+else()
+  target_link_libraries(${TARGET} -lpaddle_full_api_shared)
+  target_link_libraries(${TARGET} -liomp5)
+  target_link_libraries(${TARGET} -ldl)
+endif()
diff --git a/lite/demo/cxx/x86_mobilenetv1_full_demo/build.bat b/lite/demo/cxx/x86_mobilenetv1_full_demo/build.bat
new file mode 100644
index 0000000000000000000000000000000000000000..968ed3c0776640dc20ed68e86f57ca372d5be129
--- /dev/null
+++ b/lite/demo/cxx/x86_mobilenetv1_full_demo/build.bat
@@ -0,0 +1,61 @@
+@echo off
+setlocal
+setlocal enabledelayedexpansion
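+
+rem Recreate .\build, locate vcvarsall.bat (prompting for its path if the default
+rem VS2015 install is missing), generate a VS2015 x64 solution with CMake, and
+rem build the Release configuration with msbuild.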
+
+set source_path=%~dp0
+
+set build_directory=%source_path%\build
+
+if EXIST "%build_directory%" (
+    call:rm_rebuild_dir "%build_directory%"
+)
+
+md "%build_directory%"
+set vcvarsall_dir=C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat
+
+IF NOT EXIST "%vcvarsall_dir%" (
+    goto set_vcvarsall_dir
+) else (
+    goto cmake
+)
+
+:set_vcvarsall_dir
+SET /P vcvarsall_dir="Please input the path of the Visual Studio Command Prompt, e.g. C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat =======>"
+set tmp_var=!vcvarsall_dir!
+call:remove_space
+set vcvarsall_dir=!tmp_var!
+IF NOT EXIST "!vcvarsall_dir!" (
+    echo "------------!vcvarsall_dir! does not exist------------"
+    goto set_vcvarsall_dir
+)
+
+:cmake
+cd /d "%build_directory%"
+
+cmake .. -G "Visual Studio 14 2015 Win64" -T host=x64
+
+call "%vcvarsall_dir%" amd64
+
+msbuild /maxcpucount:8 /p:Configuration=Release mobilenet_full_api.vcxproj
+
+goto:eof
+
+:rm_rebuild_dir
+    del /f /s /q "%~1\*.*" >nul 2>&1
+    rd /s /q "%~1" >nul 2>&1
+goto:eof
+
+:remove_space
+:remove_left_space
+if "%tmp_var:~0,1%"==" " (
+    set "tmp_var=%tmp_var:~1%"
+    goto remove_left_space
+)
+
+:remove_right_space
+if "%tmp_var:~-1%"==" " (
+    set "tmp_var=%tmp_var:~0,-1%"
+    goto remove_left_space
+)
+goto:eof
diff --git a/lite/demo/cxx/x86_mobilenetv1_full_demo/mobilenet_full_api.cc b/lite/demo/cxx/x86_mobilenetv1_full_demo/mobilenet_full_api.cc
index c2837e0fdd9bfaa9fc146dff9daee963f707b886..48822ce52d29874a3a8ab77511fa57d01467e6b1 100644
--- a/lite/demo/cxx/x86_mobilenetv1_full_demo/mobilenet_full_api.cc
+++ b/lite/demo/cxx/x86_mobilenetv1_full_demo/mobilenet_full_api.cc
@@ -16,6 +16,11 @@
 #include 
 #include "paddle_api.h"  // NOLINT
 
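+// Windows links the static bundled library, so op and kernel registrations
+// must be pulled in explicitly through these headers.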
+#ifdef _WIN32
+#include "paddle_use_kernels.h"  // NOLINT
+#include "paddle_use_ops.h"      // NOLINT
+#endif
+
 using namespace paddle::lite_api;  // NOLINT
 
 int64_t ShapeProduction(const shape_t& shape) {
diff --git a/lite/demo/cxx/x86_mobilenetv1_light_demo/CMakeLists.txt b/lite/demo/cxx/x86_mobilenetv1_light_demo/CMakeLists.txt
index e85b8fe67e1a8be859d4d7a9a95a9008802a7521..a4e5e75208f615498bb7da8b2f4718351b2e0071 100644
--- a/lite/demo/cxx/x86_mobilenetv1_light_demo/CMakeLists.txt
+++ b/lite/demo/cxx/x86_mobilenetv1_light_demo/CMakeLists.txt
@@ -6,16 +6,44 @@ set(TARGET mobilenet_light_api)
 set(LITE_DIR "${PROJECT_SOURCE_DIR}/../../../cxx")
 set(MKLML_DIR "${PROJECT_SOURCE_DIR}/../../../third_party/mklml/")
 
+if (WIN32)
+  add_definitions("/DGOOGLE_GLOG_DLL_DECL=")
+  option(MSVC_STATIC_CRT "use static C Runtime library by default" ON)
+  if (MSVC_STATIC_CRT)
+    set(CMAKE_C_FLAGS_DEBUG     "${CMAKE_C_FLAGS_DEBUG} /bigobj /MTd")
+    set(CMAKE_C_FLAGS_RELEASE   "${CMAKE_C_FLAGS_RELEASE} /bigobj /MT")
+    set(CMAKE_CXX_FLAGS_DEBUG   "${CMAKE_CXX_FLAGS_DEBUG} /bigobj /MTd")
+    set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} /bigobj /MT")
+  endif()
+endif()
+
 # 2. link mklml and Paddle-Lite directory
 link_directories(${LITE_DIR}/lib ${MKLML_DIR}/lib)
 include_directories(${LITE_DIR}/include/ ${MKLML_DIR}/include)
 
 # 3. compile options
-add_definitions(-std=c++11 -g -O3 -pthread)
-set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR})
+if (NOT WIN32)
+  add_definitions(-std=c++11 -g -O3 -pthread)
+  set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR})
+endif()
 
 # 4. add executable output
 add_executable(${TARGET} ${TARGET}.cc)
-target_link_libraries(${TARGET} -lpaddle_light_api_shared)
-target_link_libraries(${TARGET} -liomp5)
-target_link_libraries(${TARGET} -ldl)
+if (WIN32)
+  set(MATH_LIB ${MKLML_DIR}/lib/mklml${CMAKE_STATIC_LIBRARY_SUFFIX}
+               ${MKLML_DIR}/lib/libiomp5md${CMAKE_STATIC_LIBRARY_SUFFIX})
+
+  target_link_libraries(${TARGET} libpaddle_api_light_bundled.lib)
+  target_link_libraries(${TARGET} shlwapi.lib)
+  target_link_libraries(${TARGET} ${MATH_LIB})
+
+  add_custom_command(TARGET ${TARGET} POST_BUILD
+    COMMAND ${CMAKE_COMMAND} -E copy ${MKLML_DIR}/lib/mklml.dll ${CMAKE_BINARY_DIR}/Release
+    COMMAND ${CMAKE_COMMAND} -E copy ${MKLML_DIR}/lib/libiomp5md.dll ${CMAKE_BINARY_DIR}/Release
+  )
+else()
+  target_link_libraries(${TARGET} -lpaddle_light_api_shared)
+  target_link_libraries(${TARGET} -liomp5)
+  target_link_libraries(${TARGET} -ldl)
+endif()
diff --git a/lite/demo/cxx/x86_mobilenetv1_light_demo/build.bat b/lite/demo/cxx/x86_mobilenetv1_light_demo/build.bat
new file mode 100644
index 0000000000000000000000000000000000000000..bc5ba16f162387f74765f6273123f2f606f0a9e4
--- /dev/null
+++ b/lite/demo/cxx/x86_mobilenetv1_light_demo/build.bat
@@ -0,0 +1,61 @@
+@echo off
+setlocal
+setlocal enabledelayedexpansion
+
+set source_path=%~dp0
+
+set build_directory=%source_path%\build
+
+if EXIST "%build_directory%" (
+    call:rm_rebuild_dir "%build_directory%"
+)
+
+md "%build_directory%"
+set vcvarsall_dir=C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat
+
+IF NOT EXIST "%vcvarsall_dir%" (
+    goto set_vcvarsall_dir
+) else (
+    goto cmake
+)
+
+:set_vcvarsall_dir
+SET /P vcvarsall_dir="Please input the path of the Visual Studio Command Prompt, e.g. C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat =======>"
+set tmp_var=!vcvarsall_dir!
+call:remove_space
+set vcvarsall_dir=!tmp_var!
+IF NOT EXIST "!vcvarsall_dir!" (
+    echo "------------!vcvarsall_dir! does not exist------------"
+    goto set_vcvarsall_dir
+)
+
+:cmake
+cd /d "%build_directory%"
+
+cmake .. -G "Visual Studio 14 2015 Win64" -T host=x64
+
+call "%vcvarsall_dir%" amd64
+
+msbuild /maxcpucount:8 /p:Configuration=Release mobilenet_light_api.vcxproj
+
+goto:eof
+
+:rm_rebuild_dir
+    del /f /s /q "%~1\*.*" >nul 2>&1
+    rd /s /q "%~1" >nul 2>&1
+goto:eof
+
+:remove_space
+:remove_left_space
+if "%tmp_var:~0,1%"==" " (
+    set "tmp_var=%tmp_var:~1%"
+    goto remove_left_space
+)
+
+:remove_right_space
+if "%tmp_var:~-1%"==" " (
+    set "tmp_var=%tmp_var:~0,-1%"
+    goto remove_left_space
+)
+goto:eof
diff --git a/lite/model_parser/model_parser.cc b/lite/model_parser/model_parser.cc
index 43f46dd481d63f9fa9a597fe2fde407fd0ae9688..b253c911a36dc8896f1fd1db6c27c0a4e3d17994 100644
--- a/lite/model_parser/model_parser.cc
+++ b/lite/model_parser/model_parser.cc
@@ -324,7 +324,7 @@ void SaveCombinedParamsPb(const std::string &path,
   std::sort(paramlist.begin(), paramlist.end());
 
   // Load vars
-  std::ofstream file(path);
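+  // Open in binary mode: the default text mode on Windows would translate
+  // '\n' bytes and corrupt the serialized tensors.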
+  std::ofstream file(path, std::ios::binary);
   CHECK(file.is_open());
   for (size_t i = 0; i < paramlist.size(); ++i) {
     SerializeTensor(file, exec_scope, paramlist[i]);
diff --git a/lite/utils/io.h b/lite/utils/io.h
index 92405cae862f062090665aecc8eb7f207cf059e7..506901bad5f75c5c1564f6340c7f687537de2e68 100644
--- a/lite/utils/io.h
+++ b/lite/utils/io.h
@@ -38,10 +38,17 @@ static bool IsFileExists(const std::string& path) {
 
 // ARM mobile does not support mkdir in C++
 static void MkDirRecur(const std::string& path) {
 #ifndef LITE_WITH_ARM
+
+#ifdef _WIN32
+  if (system(string_format("md %s", path.c_str()).c_str()) != 0) {
+    LOG(ERROR) << "Can't mkdir " << path;
+  }
+#else
   if (system(string_format("mkdir -p %s", path.c_str()).c_str()) != 0) {
     LOG(ERROR) << "Can't mkdir " << path;
   }
-#else   // On ARM
+#endif  // _WIN32
+#else   // On ARM
   CHECK_NE(mkdir(path.c_str(), S_IRWXU), -1) << "Can't mkdir " << path;
 #endif
 }
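
For reference, the Windows branch shells out to `md`, which creates intermediate directories much like `mkdir -p` when command extensions are enabled. The same recursion can be done without `system()`; a minimal sketch (not part of this patch), assuming `_mkdir` from `<direct.h>` on Windows and `mkdir` from `<sys/stat.h>` elsewhere:

```cpp
#include <string>

#ifdef _WIN32
#include <direct.h>
#define LITE_MKDIR(dir) _mkdir(dir)
#else
#include <sys/stat.h>
#include <sys/types.h>
#define LITE_MKDIR(dir) mkdir(dir, S_IRWXU)
#endif

// Create every directory on the path; mkdir on an already existing component
// simply fails with EEXIST, which is harmless here.
static void MkDirRecurNoShell(const std::string& path) {
  std::string prefix;
  for (char c : path) {
    prefix.push_back(c);
    if (c == '/' || c == '\\') LITE_MKDIR(prefix.c_str());
  }
  LITE_MKDIR(path.c_str());
}
```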