Commit 3761dbf7 authored by silingtong123, committed by GitHub

Add documentation for the inference library compiled on Windows (#3688)

Parent fdc3242e
......@@ -32,14 +32,26 @@ tar zxf mobilenet_v1.tar.gz
![image](https://paddlelite-data.bj.bcebos.com/doc_images/cxx_demo/3inference_model.png)
(2) Download the [opt tool](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt), place it in the same folder, and run the following commands in a terminal to convert the model:
(2) Model conversion
```shell
wget https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt
chmod +x opt
./opt --model_dir=./mobilenet_v1 --optimize_out_type=naive_buffer --optimize_out=./mobilenet_v1_opt
```
- Before v2.6.0
Download the [opt tool](https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt), place it in the same folder, and run the following commands in a terminal to convert the model:
```shell
wget https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt
chmod +x opt
./opt --model_dir=./mobilenet_v1 --optimize_out_type=naive_buffer --optimize_out=./mobilenet_v1_opt
```
- v2.6.0 and later
Install paddlelite with pip, then run the following commands in a terminal to convert the model:
```shell
python -m pip install paddlelite
paddle_lite_opt --model_dir=./mobilenet_v1 --optimize_out_type=naive_buffer --optimize_out=./mobilenet_v1_opt
```
**The result is shown in the figure below:**
![image](https://paddlelite-data.bj.bcebos.com/doc_images/cxx_demo/2opt_model.png)
......
# Python Demo
## 1. Download the latest Python inference library
```shell
python -m pip install paddlelite
```
## 2. Convert the model
Native PaddlePaddle models must be converted by the [opt]() tool into the naive_buffer format that Paddle-Lite supports.
Take the `mobilenet_v1` model as an example:
(1) Download the [mobilenet_v1 model](http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz) and extract it:
```shell
wget http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz
tar zxf mobilenet_v1.tar.gz
```
(2) Use the opt tool:
When loading a model from disk, the paths for the model and parameters take two forms, depending on how the model and parameter files are stored.
- Linux
- Non-combined form: when the model folder model_dir contains one model file and multiple parameter files, pass the model folder path; the model file name defaults to __model__.
```shell
paddle_lite_opt --model_dir=./mobilenet_v1 \
--optimize_out=mobilenet_v1_opt \
--optimize_out_type=naive_buffer \
--valid_targets=x86
```
- Combined form: when the model folder model_dir contains only one model file __model__ and one parameter file __params__, pass the model file path and the parameter file path:
```shell
paddle_lite_opt --model_file=./mobilenet_v1/__model__ \
--param_file=./mobilenet_v1/__params__ \
--optimize_out=mobilenet_v1_opt \
--optimize_out_type=naive_buffer \
--valid_targets=x86
```
- Windows
Windows does not yet support running the model converter directly from the command line; write a Python script instead (running it produces `mobilenet_v1_opt.nb`):
```python
import paddlelite.lite as lite
a = lite.Opt()
# non-combined form
a.set_model_dir("D:\\YOUR_MODEL_PATH\\mobilenet_v1")
# combined form
# a.set_model_file("D:\\YOUR_MODEL_PATH\\mobilenet_v1\\__model__")
# a.set_param_file("D:\\YOUR_MODEL_PATH\\mobilenet_v1\\__params__")
a.set_optimize_out("mobilenet_v1_opt")
a.set_valid_places("x86")
a.run()
```
- macOS
The opt tool is used the same way as on Linux (Python-side inference is not yet supported on macOS; this will be fixed in the next release).
## 3. Write the inference program
With the inference library and model ready, we can write a program to run inference. Create a file named mobilenetV1_light_api.py.
The complete Python demo code is available at [demo/python](https://github.com/PaddlePaddle/Paddle-Lite/blob/develop/lite/demo/python/mobilenetv1_light_api.py).
(1) Set up the config
```python
from paddlelite.lite import *
config = MobileConfig()
config.set_model_dir("/YOUR_MODEL_PATH/mobilenet_v1_opt.nb")
```
(2) Create the predictor
```python
predictor = create_paddle_predictor(config)
```
(3) Set the input data
```python
input_tensor = predictor.get_input(0)
input_tensor.resize([1, 3, 224, 224])
input_tensor.set_float_data([1.] * 3 * 224 * 224)
```
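In a real application, the all-ones dummy input above would be replaced with preprocessed image data. A minimal sketch, assuming a preprocessed float32 numpy array (the preprocessing itself is hypothetical and omitted):
```python
import numpy as np

# hypothetical preprocessed image; replace with real loading/normalization code
img = np.random.rand(1, 3, 224, 224).astype("float32")

input_tensor.resize(list(img.shape))
# set_float_data takes a flat Python list, so flatten the array first
input_tensor.set_float_data(img.flatten().tolist())
```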
(4) Run inference
```python
predictor.run()
```
(5) Get the output data
```python
output_tensor = predictor.get_output(0)
print(output_tensor.shape())
print(output_tensor.float_data()[:10])
```
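For reference, the five steps above can be assembled into a single script (save it as mobilenetV1_light_api.py; the model path is a placeholder):
```python
from paddlelite.lite import *

# 1. set up the config (placeholder path; point it at your converted .nb model)
config = MobileConfig()
config.set_model_dir("/YOUR_MODEL_PATH/mobilenet_v1_opt.nb")

# 2. create the predictor
predictor = create_paddle_predictor(config)

# 3. set the input data (all-ones dummy input)
input_tensor = predictor.get_input(0)
input_tensor.resize([1, 3, 224, 224])
input_tensor.set_float_data([1.] * 3 * 224 * 224)

# 4. run inference
predictor.run()

# 5. get the output data
output_tensor = predictor.get_output(0)
print(output_tensor.shape())
print(output_tensor.float_data()[:10])
```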
## 4. Run the script
```shell
python mobilenetV1_light_api.py
```
......@@ -4,8 +4,6 @@
Paddle-Lite supports building the x86 inference library in a Docker or Linux environment. For environment setup, see [Environment preparation](../user_guides/source_compile).
(Note: a non-Docker Linux environment must be Ubuntu 16.04.)
### Build
1. Download the code
......@@ -20,10 +18,11 @@ git checkout release/v2.6.0
```bash
cd Paddle-Lite
./lite/tools/build.sh x86
./lite/tools/build.sh --build_python=ON x86
# other optional build flags
# --with_log=OFF       disable LOG output
# --build_python=OFF   do not build the Python inference library
```
### Build output description
......@@ -53,8 +52,17 @@ x86编译结果位于 `build.lite.x86/inference_lite_lib`
- `mobilenetv1_full`: C++ demo that runs mobilenet_v1 inference with the full_api
- `mobilenetv1_light`: C++ demo that runs mobilenet_v1 inference with the light_api
5. `demo/python` folder: Python demos for the x86 inference library
- `mobilenetv1_full_api.py`: Python demo that runs mobilenet_v1 inference with the full_api
- `mobilenetv1_light_api.py`: Python demo that runs mobilenet_v1 inference with the light_api
6. `python` folder: Python library files and the corresponding .whl package
- `install` folder: the built .whl package is located at `install/dist/*.whl`
- `lib` folder: libraries that the .whl package depends on
**(If you do not need the Python inference library, replace the build command with `./lite/tools/build.sh x86`.)**
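If you did build the Python inference library, the wheel under `install/dist` can be installed with pip. A sketch, assuming the default build directory layout described above and that you run it from the Paddle-Lite source root (the exact wheel filename varies by version and Python ABI):
```python
import glob
import subprocess
import sys

# locate the wheel produced by the build and install it into the current interpreter
wheel = glob.glob("build.lite.x86/inference_lite_lib/python/install/dist/*.whl")[0]
subprocess.check_call([sys.executable, "-m", "pip", "install", wheel])
```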
### x86预测API使用示例
......@@ -64,7 +72,8 @@ x86编译结果位于 `build.lite.x86/inference_lite_lib`
mobilenetv1_full/
|-- CMakeLists.txt
|-- build.sh
`-- mobilenet_full_api.cc
|-- build.bat
`-- mobilenet_full_api.cc
```
This demo is built with cmake: `CMakeLists.txt` is the cmake script, `mobilenet_full_api.cc` is the source of the x86 example, and `build.sh` is the build script.
......@@ -168,8 +177,8 @@ int main(int argc, char** argv) {
#### Build environment requirements
- Windows 10 Professional
- GPU mode is not yet supported on Windows
- *Python 2.7/3.5.1+/3.6/3.7 (64 bit)*
- GPU builds are not yet supported on Windows
- *Python 2.7/3.5.1+ (64 bit)*
- *pip or pip3 9.0.1+ (64 bit)*
- *Visual Studio 2015 Update3*
......@@ -187,15 +196,15 @@ int main(int argc, char** argv) {
```bash
git clone https://github.com/PaddlePaddle/Paddle-Lite.git
# switch to the release branch
git checkout release/v2.3
git checkout release/v2.6.0
```
2. Build from source
2. Build from source (enter the corresponding parameters as prompted)
```bash
```dos
cd Paddle-Lite
lite/tools/build_windows.bat with_extra with_python with_profile
lite\tools\build_windows.bat with_extra with_python with_profile
```
Additional parameters for the build script `lite/tools/build.bat`:
Additional parameters for the build script `build_windows.bat`:
| Parameter | Description | Values |
|-----------|-------------|-------------|
......@@ -203,40 +212,62 @@ lite/tools/build_windows.bat with_extra with_python with_profile
| with_python | Optional. Whether to build the Python inference library (default: OFF). | `ON` / `OFF` |
| with_profile | Optional. Whether to enable profiler support (default: OFF). | `ON` / `OFF` |
### Build output
### Build output description
The x86 build output is located in `build.lite.x86/inference_lite_lib`
**Contents**:
1. `bin` folder: the executable tool `test_model_bin`
2. `cxx` folder: C++ library files and the corresponding headers
1. `cxx` folder: C++ library files and the corresponding headers
- `include`: header files
- `lib`: library files
- Bundled static libraries:
- Static libraries:
- `libpaddle_api_full_bundled.lib`: full_api static library
- `libpaddle_api_light_bundled.lib`: light_api static library
3. `third_party` folder: third-party libraries
2. `third_party` folder: the third-party dependency mklml
- mklml: the mklml math library that the Paddle-Lite inference library depends on
3. `demo/cxx` folder: C++ demos for the x86 inference library
- `mobilenetv1_full`: C++ demo that runs mobilenet_v1 inference with the full_api
- `mobilenetv1_light`: C++ demo that runs mobilenet_v1 inference with the light_api
4. `demo/python`: Python demos for the x86 inference library
- `mobilenetv1_full_api.py`: Python demo that runs mobilenet_v1 inference with the full_api
- `mobilenetv1_light_api.py`: Python demo that runs mobilenet_v1 inference with the light_api
5. `python` folder: Python library files and the corresponding .whl package
- `install` folder: the built .whl package is located at `install/dist/*.whl`
- `lib` folder: libraries that the .whl package depends on
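After installing the wheel from `install/dist`, a quick smoke test confirms that the package and its native library load correctly (a minimal sketch; the names come from the Python demo above):
```python
# import smoke test for an installed paddlelite wheel
from paddlelite.lite import MobileConfig

config = MobileConfig()  # constructing a config exercises the native library
print("paddlelite loaded OK")
```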
### x86 inference API example
1. We provide an example that runs mobilenet_v1 with the x86 API on Windows: [mobilenet_full_x86demo](https://paddlelite-data.bj.bcebos.com/x86/mobilenet_full_x86demo.zip). After downloading and extracting it, the contents are as follows:
1. `mobilenetv1_full` directory structure
![](https://paddlelite-data.bj.bcebos.com/x86/x86-doc/demo.png)
```bash
mobilenetv1_full/
|-- CMakeLists.txt
|-- build.sh
|-- build.bat
`-- mobilenet_full_api.cc
```
`mobilenet_v1` is the model file; `lib` and `include` are the Paddle-Lite inference library and headers; `third_party` contains the third-party build-time dependency `mklml`; `mobilenet_full_api.cc` is the source of the x86 example; `build.bat` is the build script.
This demo is built with cmake: `CMakeLists.txt` is the cmake script, `mobilenet_full_api.cc` is the source of the x86 example, `build.sh` is the Linux x86 build script, and `build.bat` is the Windows x86 build script.
2. Demo contents and usage
2. Demo usage
``` bash
# 1. Build (run this script in the VS2015 command prompt)
# 1. Build
cd mobilenetv1_full
build.bat
cd build
```
The build produces `Release\\mobilenet_full_api.exe` in the current directory.
``` bash
The build produces `Release\mobilenet_full_api.exe` in the current directory.
``` dos
# 2. Run inference
Release\\mobilenet_full_api.exe ..\mobilenet_v1
Release\mobilenet_full_api.exe mobilenet_v1
```
`mobilenet_v1` is the model path and `mobilenet_full_api.exe` is the executable built in step 1.
Download and extract the [`mobilenet_v1`](http://paddle-inference-dist.bj.bcebos.com/mobilenet_v1.tar.gz) model into the current `build` directory, then run the command above to perform inference.
......@@ -47,6 +47,7 @@ Welcome to Paddle-Lite's documentation!
demo_guides/cpp_demo
demo_guides/java_demo
demo_guides/python_demo
demo_guides/android_app_demo
demo_guides/ios_app_demo
demo_guides/x86
......
......@@ -10,11 +10,12 @@ PaddleLite provides a one-step mobile source build script `lite/tools/build.sh`
## 1. Environment preparation
Three build environments are currently supported:
Four build environments are currently supported:
1. Docker container environment,
2. Linux (Ubuntu 16.04 recommended) environment,
3. Mac OS environment.
3. Mac OS environment,
4. [Windows environment](../demo_guides/x86.html#windows)
### 1. Docker development environment
......
......@@ -224,11 +224,11 @@ if (LITE_WITH_X86)
add_dependencies(publish_inference publish_inference_x86_cxx_lib)
add_custom_target(publish_inference_x86_cxx_demos ${TARGET}
COMMAND ${CMAKE_COMMAND} -E make_directory "${INFER_LITE_PUBLISH_ROOT}/third_party"
COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_BINARY_DIR}/third_party/install" "${INFER_LITE_PUBLISH_ROOT}/third_party"
COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_BINARY_DIR}/third_party/eigen3" "${INFER_LITE_PUBLISH_ROOT}/third_party"
COMMAND ${CMAKE_COMMAND} -E make_directory "${INFER_LITE_PUBLISH_ROOT}/third_party/mklml"
COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_BINARY_DIR}/third_party/install/mklml" "${INFER_LITE_PUBLISH_ROOT}/third_party/mklml"
COMMAND ${CMAKE_COMMAND} -E make_directory "${INFER_LITE_PUBLISH_ROOT}/demo/cxx"
COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_SOURCE_DIR}/lite/demo/cxx" "${INFER_LITE_PUBLISH_ROOT}/demo/cxx"
COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_SOURCE_DIR}/lite/demo/cxx/x86_mobilenetv1_light_demo" "${INFER_LITE_PUBLISH_ROOT}/demo/cxx/mobilenetv1_light"
COMMAND ${CMAKE_COMMAND} -E copy_directory "${CMAKE_SOURCE_DIR}/lite/demo/cxx/x86_mobilenetv1_full_demo" "${INFER_LITE_PUBLISH_ROOT}/demo/cxx/mobilenetv1_full"
)
add_dependencies(publish_inference_x86_cxx_lib publish_inference_x86_cxx_demos)
add_dependencies(publish_inference_x86_cxx_demos paddle_api_full_bundled eigen3)
......
......@@ -6,16 +6,44 @@ set(TARGET mobilenet_full_api)
set(LITE_DIR "${PROJECT_SOURCE_DIR}/../../../cxx")
set(MKLML_DIR "${PROJECT_SOURCE_DIR}/../../../third_party/mklml/")
if (WIN32)
add_definitions("/DGOOGLE_GLOG_DLL_DECL=")
option(MSVC_STATIC_CRT "use static C Runtime library by default" ON)
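# /MT(d) selects the static C runtime so the demo matches the static-CRT Paddle-Lite
# libraries; mixing /MD and /MT typically leads to link errors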
if (MSVC_STATIC_CRT)
set(CMAKE_C_FLAGS_DEBUG "${CMAKE_C_FLAGS_DEBUG} /bigobj /MTd")
set(CMAKE_C_FLAGS_RELEASE "${CMAKE_C_FLAGS_RELEASE} /bigobj /MT")
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /bigobj /MTd")
set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} /bigobj /MT")
endif()
endif()
# 2. link mklml and Paddle-Lite directory
link_directories(${LITE_DIR}/lib ${MKLML_DIR}/lib)
include_directories(${LITE_DIR}/include/ ${MKLML_DIR}/include)
# 3. compile options
add_definitions(-std=c++11 -g -O3 -pthread)
set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR})
if (NOT WIN32)
add_definitions(-std=c++11 -g -O3 -pthread)
set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR})
endif()
# 4.add executable output
add_executable(${TARGET} ${TARGET}.cc)
target_link_libraries(${TARGET} -lpaddle_full_api_shared)
target_link_libraries(${TARGET} -liomp5)
target_link_libraries(${TARGET} -ldl)
if (WIN32)
set(MATH_LIB ${MKLML_DIR}/lib/mklml${CMAKE_STATIC_LIBRARY_SUFFIX}
${MKLML_DIR}/lib/libiomp5md${CMAKE_STATIC_LIBRARY_SUFFIX})
target_link_libraries(${TARGET} libpaddle_api_full_bundled.lib)
target_link_libraries(${TARGET} shlwapi.lib)
target_link_libraries(${TARGET} ${MATH_LIB})
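# copy the mklml runtime DLLs next to the executable so it can run straight from Release\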
add_custom_command(TARGET ${TARGET} POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy ${MKLML_DIR}/lib/mklml.dll ${CMAKE_BINARY_DIR}/Release
COMMAND ${CMAKE_COMMAND} -E copy ${MKLML_DIR}/lib/libiomp5md.dll ${CMAKE_BINARY_DIR}/Release
)
else()
target_link_libraries(${TARGET} -lpaddle_full_api_shared)
target_link_libraries(${TARGET} -liomp5)
target_link_libraries(${TARGET} -ldl)
endif()
@echo off
setlocal
setlocal enabledelayedexpansion
set source_path=%~dp0
set build_directory=%source_path%\build
if EXIST "%build_directory%" (
call:rm_rebuild_dir "%build_directory%"
)
md "%build_directory%"
set vcvarsall_dir=C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat
IF NOT EXIST "%vcvarsall_dir%" (
goto set_vcvarsall_dir
) else (
goto cmake
)
:set_vcvarsall_dir
SET /P vcvarsall_dir="Please input the path of visual studio command Prompt, such as C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat =======>"
set tmp_var=!vcvarsall_dir!
call:remove_space
set vcvarsall_dir=!tmp_var!
IF NOT EXIST "!vcvarsall_dir!" (
echo "------------!vcvarsall_dir! not exist------------"
goto set_vcvarsall_dir
)
:cmake
D:
cd "%build_directory%"
cmake .. -G "Visual Studio 14 2015 Win64" -T host=x64
call "%vcvarsall_dir%" amd64
msbuild /maxcpucount:8 /p:Configuration=Release mobilenet_full_api.vcxproj
goto:eof
:rm_rebuild_dir
del /f /s /q "%~1\*.*" >nul 2>&1
rd /s /q "%~1" >nul 2>&1
goto:eof
:remove_space
:remove_left_space
if "%tmp_var:~0,1%"==" " (
set "tmp_var=%tmp_var:~1%"
goto remove_left_space
)
:remove_right_space
if "%tmp_var:~-1%"==" " (
set "tmp_var=%tmp_var:~0,-1%"
goto remove_left_space
)
goto:eof
......@@ -16,6 +16,11 @@
#include <vector>
#include "paddle_api.h" // NOLINT
#ifdef _WIN32
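// when linking the bundled static library on Windows, ops and kernels must be
// registered explicitly via these headers or the linker strips their registrations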
#include "paddle_use_kernels.h" // NOLINT
#include "paddle_use_ops.h" // NOLINT
#endif
using namespace paddle::lite_api; // NOLINT
int64_t ShapeProduction(const shape_t& shape) {
......
......@@ -6,16 +6,44 @@ set(TARGET mobilenet_light_api)
set(LITE_DIR "${PROJECT_SOURCE_DIR}/../../../cxx")
set(MKLML_DIR "${PROJECT_SOURCE_DIR}/../../../third_party/mklml/")
if (WIN32)
add_definitions("/DGOOGLE_GLOG_DLL_DECL=")
option(MSVC_STATIC_CRT "use static C Runtime library by default" ON)
if (MSVC_STATIC_CRT)
set(CMAKE_C_FLAGS_DEBUG "${CMAKE_C_FLAGS_DEBUG} /bigobj /MTd")
set(CMAKE_C_FLAGS_RELEASE "${CMAKE_C_FLAGS_RELEASE} /bigobj /MT")
set(CMAKE_CXX_FLAGS_DEBUG "${CMAKE_CXX_FLAGS_DEBUG} /bigobj /MTd")
set(CMAKE_CXX_FLAGS_RELEASE "${CMAKE_CXX_FLAGS_RELEASE} /bigobj /MT")
endif()
endif()
# 2. link mklml and Paddle-Lite directory
link_directories(${LITE_DIR}/lib ${MKLML_DIR}/lib)
include_directories(${LITE_DIR}/include/ ${MKLML_DIR}/include)
# 3. compile options
add_definitions(-std=c++11 -g -O3 -pthread)
set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR})
if (NOT WIN32)
add_definitions(-std=c++11 -g -O3 -pthread)
set(EXECUTABLE_OUTPUT_PATH ${PROJECT_SOURCE_DIR})
endif()
# 4.add executable output
add_executable(${TARGET} ${TARGET}.cc)
target_link_libraries(${TARGET} -lpaddle_light_api_shared)
target_link_libraries(${TARGET} -liomp5)
target_link_libraries(${TARGET} -ldl)
if (WIN32)
set(MATH_LIB ${MKLML_DIR}/lib/mklml${CMAKE_STATIC_LIBRARY_SUFFIX}
${MKLML_DIR}/lib/libiomp5md${CMAKE_STATIC_LIBRARY_SUFFIX})
target_link_libraries(${TARGET} libpaddle_api_light_bundled.lib)
target_link_libraries(${TARGET} shlwapi.lib)
target_link_libraries(${TARGET} ${MATH_LIB})
add_custom_command(TARGET ${TARGET} POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy ${MKLML_DIR}/lib/mklml.dll ${CMAKE_BINARY_DIR}/Release
COMMAND ${CMAKE_COMMAND} -E copy ${MKLML_DIR}/lib/libiomp5md.dll ${CMAKE_BINARY_DIR}/Release
)
else()
target_link_libraries(${TARGET} -lpaddle_light_api_shared)
target_link_libraries(${TARGET} -liomp5)
target_link_libraries(${TARGET} -ldl)
endif()
@echo off
setlocal
setlocal enabledelayedexpansion
set source_path=%~dp0
set build_directory=%source_path%\build
if EXIST "%build_directory%" (
call:rm_rebuild_dir "%build_directory%"
)
md "%build_directory%"
set vcvarsall_dir=C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat
IF NOT EXIST "%vcvarsall_dir%" (
goto set_vcvarsall_dir
) else (
goto cmake
)
:set_vcvarsall_dir
SET /P vcvarsall_dir="Please input the path of visual studio command Prompt, such as C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat =======>"
set tmp_var=!vcvarsall_dir!
call:remove_space
set vcvarsall_dir=!tmp_var!
IF NOT EXIST "!vcvarsall_dir!" (
echo "------------!vcvarsall_dir! not exist------------"
goto set_vcvarsall_dir
)
:cmake
D:
cd "%build_directory%"
cmake .. -G "Visual Studio 14 2015 Win64" -T host=x64
call "%vcvarsall_dir%" amd64
msbuild /maxcpucount:8 /p:Configuration=Release mobilenet_light_api.vcxproj
goto:eof
:rm_rebuild_dir
del /f /s /q "%~1\*.*" >nul 2>&1
rd /s /q "%~1" >nul 2>&1
goto:eof
:remove_space
:remove_left_space
if "%tmp_var:~0,1%"==" " (
set "tmp_var=%tmp_var:~1%"
goto remove_left_space
)
:remove_right_space
if "%tmp_var:~-1%"==" " (
set "tmp_var=%tmp_var:~0,-1%"
goto remove_left_space
)
goto:eof
......@@ -324,7 +324,7 @@ void SaveCombinedParamsPb(const std::string &path,
std::sort(paramlist.begin(), paramlist.end());
// Load vars
std::ofstream file(path);
std::ofstream file(path, std::ios::binary);
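// binary mode prevents newline translation on Windows from corrupting the serialized tensors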
CHECK(file.is_open());
for (size_t i = 0; i < paramlist.size(); ++i) {
SerializeTensor(file, exec_scope, paramlist[i]);
......
......@@ -38,9 +38,16 @@ static bool IsFileExists(const std::string& path) {
// ARM mobile not support mkdir in C++
static void MkDirRecur(const std::string& path) {
#ifndef LITE_WITH_ARM
#ifdef _WIN32
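// Windows cmd has no "mkdir -p"; "md" creates intermediate directories itself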
if (system(string_format("md %s", path.c_str()).c_str()) != 0) {
LOG(ERROR) << "Cann't mkdir " << path;
}
#else
if (system(string_format("mkdir -p %s", path.c_str()).c_str()) != 0) {
LOG(ERROR) << "Cann't mkdir " << path;
}
#endif // _WIN32
#else // On ARM
CHECK_NE(mkdir(path.c_str(), S_IRWXU), -1) << "Can't mkdir " << path;
#endif
......