The inference framework can be obtained under `-I x86_64`, `-I arm64` and `-I arm32`.
> 1. `liboptimize.so` exists only in the runtime-arm64 output package and is used only on CPUs that support ARMv8.2 and fp16.
> 2. Compiling for ARM64 produces the arm64-cpu inference framework output by default. If you add `-e gpu`, you get the arm64-gpu output instead, and the package is named `mindspore-lite-{version}-runtime-arm64-gpu.tar.gz`. Compiling for ARM32 works the same way.
> 3. Before running the tools in the `converter`, `benchmark`, or `time_profile` directory, you need to configure environment variables so that the directories containing the MindSpore Lite and Protobuf dynamic libraries are on the system's dynamic-library search path. Taking version 0.7.0-beta as an example: for the converter, run `export LD_LIBRARY_PATH=./output/mindspore-lite-0.7.0-converter-ubuntu/third_party/protobuf/lib:./output/mindspore-lite-0.7.0-converter-ubuntu/third_party/flatbuffers/lib:${LD_LIBRARY_PATH}`; for benchmark and timeprofiler, run `export LD_LIBRARY_PATH=./output/mindspore-lite-0.7.0-runtime-x86-cpu/lib:${LD_LIBRARY_PATH}`.
## Windows Environment Compilation
### Environment Requirements
- The supported compilation environment is: Windows 10, 64-bit.
> The compilation script will execute `git clone` to obtain the code of the third-party dependent libraries. Please make sure that the git network settings are correct and available in advance.
### Compilation Options
The compilation options of MindSpore Lite are as follows:
| Parameter | Parameter Description | Mandatory or Not |
| -------- | ----- | ---- |
| **lite** | **Set this parameter to compile the MindSpore Lite project.** | **Yes** |
| [n] | Sets the number of threads used during compilation; defaults to 6 threads if not specified. | No |
### Compilation Example
First, use the git tool to download the source code from the MindSpore code repository.
Then, open the cmd tool in the root directory of the source code and execute the following commands to compile MindSpore Lite.
- Compile the Windows version with the default number of threads (6 threads).
```bash
call build.bat lite
```
- Compile the Windows version with 8 threads.
```bash
call build.bat lite 8
```
### Output Description
After the compilation is complete, go to the `mindspore/output/` directory and unzip the output file `mindspore-lite-{version}-converter-win-cpu.zip`, which contains the conversion tool executable.
> version: version of the output, consistent with the MindSpore version.
The following describes the parameters in detail.
|`--mean=<MEAN>`| No (supported by aware-quant models only) | Sets the mean value of the input data. | [-128, 127] | -0.5 |
> - The parameter name and parameter value are separated by an equal sign (=) and no space is allowed between them.
> - The Caffe model is divided into two files: the model structure `*.prototxt`, corresponding to the `--modelFile` parameter, and the model weights `*.caffemodel`, corresponding to the `--weightFile` parameter.
## Windows Environment Instructions
### Environment Preparation
To use the MindSpore Lite model conversion tool, the following environment preparations are required.
- Compile: The model conversion tool code is located in the `mindspore/lite/tools/converter` directory of the MindSpore source code. Refer to the [Environment Requirements](https://www.mindspore.cn/lite/tutorial/en/master/build.html#environment-requirements-1) and [Compilation Example](https://www.mindspore.cn/lite/tutorial/en/master/build.html#compilation-example-1) in the build document.
- Run: Refer to the [Output Description](https://www.mindspore.cn/lite/tutorial/en/master/build.html#output-description-1) in the deployment document to obtain the `converter` tool, and configure the MinGW environment variable (add the MinGW `bin` directory to the `Path` system variable).
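For example, the `Path` variable can also be set for the current cmd session only; the installation directory `C:\MinGW` below is a hypothetical path, so substitute your actual MinGW location:

```shell
:: Prepend the MinGW bin directory (hypothetical install path) to the
:: search path for this cmd session only.
set PATH=C:\MinGW\bin;%PATH%
```

Setting it in the system `Path` variable instead makes the change permanent for all sessions.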
### Parameter Description
Refer to the [parameter description](https://www.mindspore.cn/lite/tutorial/en/master/use/converter_tool.html#parameter-description) of the model conversion tool in the Linux environment.
### Example
First, in the root directory of the source code, use the cmd tool to execute the compilation command; refer to `build.md`.
```bash
call build.bat lite
```
Then, set the log printing level to INFO.
```bash
set MSLOG=INFO
```
Several common examples are selected below to illustrate the use of conversion commands.
- Take the Caffe model LeNet as an example and execute the conversion command.
In this example, because a Caffe model is used, two input files are required: the model structure and the model weights. In addition, the two required parameters, the fmk type and the output path, must be specified for the conversion to succeed.
The result is shown as:
```
INFO [converter/converter.cc:190] [RunConverter] CONVERTER RESULT: SUCCESS!
```
This means that the Caffe model has been successfully converted to the MindSpore Lite model and the new file `lenet.ms` has been obtained.
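The command for this case might look like the following, assuming the converter executable is named `converter_lite` and using `lenet.prototxt` / `lenet.caffemodel` as placeholder file names for your own model:

```shell
:: Convert a Caffe model: structure file via --modelFile, weights via
:: --weightFile; --fmk selects the framework and --outputFile the result path.
call converter_lite --fmk=CAFFE --modelFile=lenet.prototxt --weightFile=lenet.caffemodel --outputFile=lenet
```

The output file takes the `.ms` suffix automatically, so this command produces `lenet.ms`.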
- Take the MindSpore, TensorFlow Lite, and ONNX model formats and the aware quantization model as examples and execute the conversion commands.
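Under the same assumptions as above (the executable is named `converter_lite`, and all model file names are placeholders), these commands might take the following form; the exact flag values should be checked against the Linux parameter description:

```shell
:: MindSpore model
call converter_lite --fmk=MS --modelFile=model.mindir --outputFile=model
:: TensorFlow Lite model
call converter_lite --fmk=TFLITE --modelFile=model.tflite --outputFile=model
:: ONNX model
call converter_lite --fmk=ONNX --modelFile=model.onnx --outputFile=model
:: TensorFlow Lite aware-quantized model (quantType value is an assumption)
call converter_lite --fmk=TFLITE --modelFile=model_quant.tflite --outputFile=model --quantType=AwareTraining
```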