Commit d547cc8e authored by mindspore-ci-bot, committed by Gitee

!946 add Converter tools docs in Windows environment

Merge pull request !946 from liuwenhao/master
@@ -10,6 +10,11 @@
- [Output Description](#output-description)
- [Description of Converter's Directory Structure](#description-of-converters-directory-structure)
- [Description of Runtime and Other tools' Directory Structure](#description-of-runtime-and-other-tools-directory-structure)
- [Windows Environment Compilation](#windows-environment-compilation)
- [Environment Requirements](#environment-requirements-1)
- [Compilation Options](#compilation-options-1)
- [Compilation Example](#compilation-example-1)
- [Output Description](#output-description-1)
<!-- /TOC -->
@@ -178,3 +183,49 @@ The inference framework can be obtained under `-I x86_64`, `-I arm64` and `-I ar
> 1. `liboptimize.so` only exists in the output package of runtime-arm64 and is only used on ARMv8.2 and CPUs that support fp16.
> 2. Compiling ARM64 obtains the arm64-cpu inference framework output by default; if you add `-e gpu`, you get the arm64-gpu inference framework output, and the package name is `mindspore-lite-{version}-runtime-arm64-gpu.tar.gz`; compiling ARM32 works the same way.
> 3. Before running the tools in the converter, benchmark or time_profile directory, you need to configure environment variables, and configure the path where the dynamic libraries of MindSpore Lite and Protobuf are located to the path where the system searches for dynamic libraries. Take the compiled under version 0.7.0-beta as an example: configure converter: `export LD_LIBRARY_PATH=./output/mindspore-lite-0.7.0-converter-ubuntu/third_party/protobuf/lib:./output/mindspore-lite-0.7.0-converter-ubuntu/third_party/flatbuffers/lib:${LD_LIBRARY_PATH}`; configure benchmark and timeprofiler: `export LD_LIBRARY_PATH= ./output/mindspore-lite-0.7.0-runtime-x86-cpu/lib:${LD_LIBRARY_PATH}`.
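As a concrete sketch of that note (the directory name below assumes the 0.7.0-beta converter layout quoted above; adjust it to your actual output directory):

```shell
# Assumed 0.7.0-beta converter output directory; adjust to your build.
LITE_CONVERTER=./output/mindspore-lite-0.7.0-converter-ubuntu
# Prepend the Protobuf and FlatBuffers library directories to the search path.
export LD_LIBRARY_PATH=${LITE_CONVERTER}/third_party/protobuf/lib:${LITE_CONVERTER}/third_party/flatbuffers/lib:${LD_LIBRARY_PATH}
echo "$LD_LIBRARY_PATH"
```

The same pattern applies to benchmark and timeprofiler, with the runtime `lib` directory in place of the converter's third-party directories.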
## Windows Environment Compilation
### Environment Requirements
- The supported compilation environment is: Windows 10, 64-bit.
- Compilation dependencies are:
- [CMake](https://cmake.org/download/) >= 3.14.1
- [MinGW GCC](https://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Win64/Personal%20Builds/mingw-builds/7.3.0/threads-posix/seh/x86_64-7.3.0-release-posix-seh-rt_v5-rev0.7z/download) = 7.3.0
- [Python](https://www.python.org/) >= 3.7.5
> The compilation script will execute `git clone` to obtain the code of the third-party dependent libraries. Please make sure that the git network settings are correct and available in advance.
### Compilation Options
The compilation options of MindSpore Lite are as follows:
| Parameter | Parameter Description | Mandatory or Not |
| -------- | ----- | ---- |
| **lite** | **Set this parameter to compile the MindSpore Lite project.** | **Yes** |
| [n] | Sets the number of threads used during compilation; if not set, the default is 6 threads. | No |
### Compilation Example
First, use the git tool to download the source code from the MindSpore code repository.
```bash
git clone https://gitee.com/mindspore/mindspore.git
```
Then, in the root directory of the source code, use the cmd tool to execute the following commands to compile MindSpore Lite.
- Compile the Windows version with the default number of threads (6 threads).
```bash
call build.bat lite
```
- Compile the Windows version with the specified number of threads 8.
```bash
call build.bat lite 8
```
### Output Description
After the compilation is complete, enter the `mindspore/output/` directory, unzip the output file `mindspore-lite-{version}-converter-win-cpu.zip`, which contains the conversion tool executable file.
> version: version of the output, consistent with the version of the compiled branch code.
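For example, for a hypothetical 0.7.0 build, the package name follows the `mindspore-lite-{version}-converter-win-cpu.zip` pattern:

```shell
# Hypothetical version number; substitute the version you actually compiled.
VERSION=0.7.0
PKG="mindspore-lite-${VERSION}-converter-win-cpu.zip"
echo "$PKG"
```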
@@ -8,6 +8,10 @@
- [Environment Preparation](#environment-preparation)
- [Example](#example)
- [Parameter Description](#parameter-description)
- [Windows Environment Instructions](#windows-environment-instructions)
- [Environment Preparation](#environment-preparation-1)
- [Parameter Description](#parameter-description-1)
- [Example](#example-1)
<!-- /TOC -->
@@ -107,4 +111,73 @@ The following describes the parameters in detail.
|`--mean=<MEAN>`| No (supported by aware quant models only) | Sets the mean value of the input data. | [-128, 127] | -0.5 |
> - The parameter name and parameter value are separated by an equal sign (=) and no space is allowed between them.
> - The Caffe model is divided into two files: model structure `*.prototxt`, corresponding to the `--modelFile` parameter; model weights `*.caffemodel`, corresponding to the `--weightFile` parameter.
## Windows Environment Instructions
### Environment Preparation
To use the MindSpore Lite model conversion tool, the following environment preparations are required.
- Compilation: the model conversion tool code is in the `mindspore/lite/tools/converter` directory of the MindSpore source code. Refer to the [Environment Requirements](https://www.mindspore.cn/lite/tutorial/en/master/build.html#environment-requirements-1) and [Compilation Example](https://www.mindspore.cn/lite/tutorial/en/master/build.html#compilation-example-1) in the build document.
- Running: refer to the [Output Description](https://www.mindspore.cn/lite/tutorial/en/master/build.html#output-description-1) in the deployment document to obtain the `converter` tool, and configure the MinGW environment variable (add MinGW's bin directory to the system Path variable).
### Parameter Description
Refer to the [parameter description](https://www.mindspore.cn/lite/tutorial/en/master/use/converter_tool.html#parameter-description) of the model conversion tool in the Linux environment.
### Example
First, in the root directory of the source code, use the cmd tool to run the compilation command; refer to `build.md`.
```bash
call build.bat lite
```
Then, set the log printing level to INFO.
```bash
set MSLOG=INFO
```
Several common examples are selected below to illustrate the use of conversion commands.
- Take Caffe model LeNet as an example to execute the conversion command.
```bash
call converter_lite --fmk=CAFFE --modelFile=lenet.prototxt --weightFile=lenet.caffemodel --outputFile=lenet
```
In this example, because a Caffe model is used, two input files are required: the model structure and the model weights. Together with the other two required parameters, the fmk type and the output path, the command can be executed successfully.
The result is shown as:
```
INFO [converter/converter.cc:190] Runconverter] CONVERTER RESULT: SUCCESS!
```
This means that the Caffe model has been successfully converted to the MindSpore Lite model and the new file `lenet.ms` has been obtained.
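A script driving the converter can key on that log line to detect success; a minimal sketch using the output quoted above:

```shell
# Simulated converter log line, copied from the output quoted above.
LOG='INFO [converter/converter.cc:190] Runconverter] CONVERTER RESULT: SUCCESS!'
# Key on the result marker to decide whether the conversion succeeded.
case "$LOG" in
  *"CONVERTER RESULT: SUCCESS!"*) STATUS=ok ;;
  *) STATUS=fail ;;
esac
echo "$STATUS"
```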
- Take the MindSpore, TensorFlow Lite, and ONNX model formats and an aware-quantization model as examples to execute conversion commands.
- MindSpore model `model.mindir`
```bash
call converter_lite --fmk=MS --modelFile=model.mindir --outputFile=model
```
- TensorFlow Lite model `model.tflite`
```bash
call converter_lite --fmk=TFLITE --modelFile=model.tflite --outputFile=model
```
- ONNX model `model.onnx`
```bash
call converter_lite --fmk=ONNX --modelFile=model.onnx --outputFile=model
```
- TensorFlow Lite aware-quantization model `model_quant.tflite`
```bash
call converter_lite --fmk=TFLITE --modelFile=model_quant.tflite --outputFile=model --quantType=AwareTraining
```
In the above cases, the following conversion success prompt is displayed, and the `model.ms` target file is obtained at the same time.
```
INFO [converter/converter.cc:190] Runconverter] CONVERTER RESULT: SUCCESS!
```
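The commands above all follow one pattern; only the `--fmk` value and the model file change per framework. A sketch assembling them (the file names are the hypothetical ones used above):

```shell
# Assemble the converter command for each framework from the examples above.
CMDS=""
for entry in "MS:model.mindir" "TFLITE:model.tflite" "ONNX:model.onnx"; do
  fmk=${entry%%:*}
  model=${entry#*:}
  cmd="call converter_lite --fmk=${fmk} --modelFile=${model} --outputFile=model"
  echo "$cmd"
  CMDS="${CMDS}${cmd}
"
done
```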
@@ -10,6 +10,11 @@
- [Output Description](#output-description)
- [Description of Converter's Directory Structure](#description-of-converters-directory-structure)
- [Description of Runtime and Other tools' Directory Structure](#description-of-runtime-and-other-tools-directory-structure)
- [Windows Environment Compilation](#windows-environment-compilation)
- [Environment Requirements](#environment-requirements-1)
- [Compilation Options](#compilation-options-1)
- [Compilation Example](#compilation-example-1)
- [Output Description](#output-description-1)
<!-- /TOC -->
@@ -178,4 +183,51 @@ tar -xvf mindspore-lite-{version}-runtime-{os}-{device}.tar.gz
> 1. `liboptimize.so` only exists in the output package of runtime-arm64 and is only used on ARMv8.2 and CPUs that support fp16.
> 2. Compiling ARM64 obtains the arm64-cpu inference framework output by default; if `-e gpu` is added, you get the arm64-gpu inference framework output, and the package name is `mindspore-lite-{version}-runtime-arm64-gpu.tar.gz`; compiling ARM32 works the same way.
> 3. Before running the tools in the converter, benchmark or time_profile directory, you need to configure environment variables and add the paths of the MindSpore Lite and Protobuf dynamic libraries to the system's dynamic library search path. Taking a 0.7.0-beta build as an example: configure converter: `export LD_LIBRARY_PATH=./output/mindspore-lite-0.7.0-converter-ubuntu/third_party/protobuf/lib:./output/mindspore-lite-0.7.0-converter-ubuntu/third_party/flatbuffers/lib:${LD_LIBRARY_PATH}`; configure benchmark and timeprofiler: `export LD_LIBRARY_PATH=./output/mindspore-lite-0.7.0-runtime-x86-cpu/lib:${LD_LIBRARY_PATH}`.
## Windows Environment Compilation
### Environment Requirements
- The supported compilation environment is: Windows 10, 64-bit.
- Compilation dependencies are:
- [CMake](https://cmake.org/download/) >= 3.14.1
- [MinGW GCC](https://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Win64/Personal%20Builds/mingw-builds/7.3.0/threads-posix/seh/x86_64-7.3.0-release-posix-seh-rt_v5-rev0.7z/download) = 7.3.0
- [Python](https://www.python.org/) >= 3.7.5
> The compilation script executes `git clone` to obtain the code of the third-party dependency libraries. Please make sure in advance that the git network settings are correct and available.
### Compilation Options
The compilation options of MindSpore Lite are as follows.
| Parameter | Parameter Description | Mandatory or Not |
| -------- | ----- | ---- |
| **lite** | **Set this parameter to compile the MindSpore Lite project.** | **Yes** |
| [n] | Sets the number of threads used during compilation; if not set, the default is 6 threads. | No |
### Compilation Example
First, use the git tool to download the source code from the MindSpore code repository.
```bash
git clone https://gitee.com/mindspore/mindspore.git
```
Then, in the root directory of the source code, use the cmd tool to execute the following commands to compile MindSpore Lite.
- Compile the Windows version with the default number of threads (6 threads).
```bash
call build.bat lite
```
- Compile the Windows version with the specified number of threads 8.
```bash
call build.bat lite 8
```
### Output Description
After the compilation is complete, enter the `mindspore/output/` directory and unzip the output file `mindspore-lite-{version}-converter-win-cpu.zip`, which contains the conversion tool executable file.
> version: version of the output, consistent with the version of the compiled branch code.
@@ -8,6 +8,10 @@
- [Environment Preparation](#environment-preparation)
- [Example](#example)
- [Parameter Description](#parameter-description)
- [Windows Environment Instructions](#windows-environment-instructions)
- [Environment Preparation](#environment-preparation-1)
- [Parameter Description](#parameter-description-1)
- [Example](#example-1)
<!-- /TOC -->
@@ -108,4 +112,73 @@ The MindSpore Lite model conversion tool provides multiple parameter settings that users can select according to their needs
| `--mean=<MEAN>` | No | Sets the mean value of the input data during aware-quantization model conversion. | [-128, 127] | -0.5 |
> - The parameter name and parameter value are separated by an equal sign (=) and no space is allowed between them.
> - The Caffe model is generally divided into two files: model structure `*.prototxt`, corresponding to the `--modelFile` parameter; model weights `*.caffemodel`, corresponding to the `--weightFile` parameter.
## Windows Environment Instructions
### Environment Preparation
To use the MindSpore Lite model conversion tool, the following environment preparation is required.
- Compilation: the model conversion tool code is in the `mindspore/lite/tools/converter` directory of the MindSpore source code. Refer to the [Environment Requirements](https://www.mindspore.cn/lite/tutorial/zh-CN/master/build.html#id5) and [Compilation Example](https://www.mindspore.cn/lite/tutorial/zh-CN/master/build.html#id7) in the deployment document to compile the Windows version.
- Running: refer to the [Output Description](https://www.mindspore.cn/lite/tutorial/zh-CN/master/build.html#id8) in the deployment document to obtain the `converter` tool, and configure the MinGW environment variable (add MinGW's bin directory to the system Path variable).
### Parameter Description
Refer to the [parameter description](https://www.mindspore.cn/lite/tutorial/zh-CN/master/use/converter_tool.html#id4) of the model conversion tool in the Linux environment.
### Example
First, in the root directory of the source code, use the cmd tool to run the compilation command; refer to `build.md`.
```bash
call build.bat lite
```
Then, set the log printing level to INFO.
```bash
set MSLOG=INFO
```
Several common examples are selected below to illustrate the use of the conversion commands.
- Take the Caffe model LeNet as an example to execute the conversion command.
```bash
call converter_lite --fmk=CAFFE --modelFile=lenet.prototxt --weightFile=lenet.caffemodel --outputFile=lenet
```
In this example, because a Caffe model is used, two input files are required: the model structure and the model weights. Together with the other two required parameters, the fmk type and the output path, the command can be executed successfully.
The result is shown as:
```
INFO [converter/converter.cc:190] Runconverter] CONVERTER RESULT: SUCCESS!
```
This means that the Caffe model has been successfully converted to a MindSpore Lite model and the new file `lenet.ms` has been obtained.
- Take the MindSpore, TensorFlow Lite, and ONNX model formats and an aware-quantization model as examples to execute conversion commands.
- MindSpore model `model.mindir`
```bash
call converter_lite --fmk=MS --modelFile=model.mindir --outputFile=model
```
- TensorFlow Lite model `model.tflite`
```bash
call converter_lite --fmk=TFLITE --modelFile=model.tflite --outputFile=model
```
- ONNX model `model.onnx`
```bash
call converter_lite --fmk=ONNX --modelFile=model.onnx --outputFile=model
```
- TensorFlow Lite aware-quantization model `model_quant.tflite`
```bash
call converter_lite --fmk=TFLITE --modelFile=model_quant.tflite --outputFile=model --quantType=AwareTraining
```
In all of the above cases, the following conversion success message is displayed, and the `model.ms` target file is obtained at the same time.
```
INFO [converter/converter.cc:190] Runconverter] CONVERTER RESULT: SUCCESS!
```