MindSpore can compile user source code written in Python syntax into computational graphs, and can convert ordinary functions or instances of classes inherited from nn.Cell into computational graphs. Currently, MindSpore does not support converting arbitrary Python source code into computational graphs, so source code compilation is subject to constraints, including syntax constraints and network definition constraints. These constraints may change as MindSpore evolves.
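To make this concrete, the following is a minimal sketch (not taken from the original documentation) of the kind of code that can be converted: an `nn.Cell` subclass whose `construct` method is compiled into a computational graph when graph mode is enabled. Names and shapes are illustrative only.

```python
import numpy as np
import mindspore.nn as nn
from mindspore import Tensor, context

# Graph mode triggers compilation of construct() into a computational graph.
context.set_context(mode=context.GRAPH_MODE)

class Net(nn.Cell):
    def __init__(self):
        super(Net, self).__init__()
        self.dense = nn.Dense(3, 2)   # fully connected layer
        self.relu = nn.ReLU()

    def construct(self, x):
        # Only syntax supported by graph compilation may appear in this method.
        return self.relu(self.dense(x))

net = Net()
output = net(Tensor(np.ones([1, 3]).astype(np.float32)))
print(output)
```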
...
@@ -21,7 +21,7 @@ This document describes how to quickly install MindSpore on a Ubuntu system with
| Version | Operating System | Executable File Installation Dependencies | Source Code Compilation and Installation Dependencies |
| ---- | :--- | :--- | :--- |
| MindSpore master | Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [patch](http://ftp.gnu.org/gnu/patch/) >= 2.5 <br> same as the executable file installation dependencies. |
| MindSpore master | Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/r0.3/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [patch](http://ftp.gnu.org/gnu/patch/) >= 2.5 <br> same as the executable file installation dependencies. |
- When the Ubuntu version is 18.04, GCC 7.3.0 can be installed by using the apt command.
- When the network is connected, the dependencies listed in the requirements.txt file are automatically downloaded during .whl package installation. Otherwise, you need to install them manually.
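After installation, a quick way to check that the package imports and runs is a small operator test. This is a sketch under the assumption of a CPU-capable build; it is not the guide's own verification step, which is elided here.

```python
import numpy as np
from mindspore import Tensor
from mindspore.ops import operations as P

x = Tensor(np.ones([1, 3]).astype(np.float32))
y = Tensor(np.ones([1, 3]).astype(np.float32))
print(P.TensorAdd()(x, y))   # expect a 1x3 tensor of 2.0 if the installation works
```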
...
@@ -62,7 +62,7 @@ This document describes how to quickly install MindSpore on a Ubuntu system with
1. Download the source code from the code repository.
2. Run the following command in the root directory of the source code to compile MindSpore:
...
@@ -97,7 +97,7 @@ If you need to conduct AI model security research or enhance the security of the
| Version | Operating System | Executable File Installation Dependencies | Source Code Compilation and Installation Dependencies |
| ---- | :--- | :--- | :--- |
| MindArmour master | Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [setup.py](https://gitee.com/mindspore/mindarmour/blob/master/setup.py). | Same as the executable file installation dependencies. |
| MindArmour master | Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [setup.py](https://gitee.com/mindspore/mindarmour/blob/r0.3/setup.py). | Same as the executable file installation dependencies. |
- When the network is connected, the dependencies listed in the setup.py file are automatically downloaded during .whl package installation. Otherwise, you need to install them manually.
...
@@ -122,7 +122,7 @@ If you need to conduct AI model security research or enhance the security of the
1. Download the source code from the code repository.
...
@@ -20,7 +20,7 @@ This document describes how to quickly install MindSpore on a Windows system wit
| Version | Operating System | Executable File Installation Dependencies | Source Code Compilation and Installation Dependencies |
| ---- | :--- | :--- | :--- |
| MindSpore master | Windows 10 x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [MinGW-W64 GCC-7.3.0](https://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Win64/Personal%20Builds/mingw-builds/7.3.0/threads-posix/seh/x86_64-7.3.0-release-posix-seh-rt_v5-rev0.7z) x86_64-posix-seh <br> - [ActivePerl](http://downloads.activestate.com/ActivePerl/releases/5.24.3.2404/ActivePerl-5.24.3.2404-MSWin32-x64-404865.exe) 5.24.3.2404 <br> - [CMake](https://cmake.org/download/) 3.14.1 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
| MindSpore master | Windows 10 x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/r0.3/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [MinGW-W64 GCC-7.3.0](https://sourceforge.net/projects/mingw-w64/files/Toolchains%20targetting%20Win64/Personal%20Builds/mingw-builds/7.3.0/threads-posix/seh/x86_64-7.3.0-release-posix-seh-rt_v5-rev0.7z) x86_64-posix-seh <br> - [ActivePerl](http://downloads.activestate.com/ActivePerl/releases/5.24.3.2404/ActivePerl-5.24.3.2404-MSWin32-x64-404865.exe) 5.24.3.2404 <br> - [CMake](https://cmake.org/download/) 3.14.1 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
- When the network is connected, the dependencies listed in the requirements.txt file are automatically downloaded during .whl package installation. Otherwise, you need to install them manually.
...
@@ -62,7 +62,7 @@ This document describes how to quickly install MindSpore on a Windows system wit
1. Download the source code from the code repository.
...
@@ -32,7 +32,7 @@ This document describes how to quickly install MindSpore on an Ascend AI process
| Version | Operating System | Executable File Installation Dependencies | Source Code Compilation and Installation Dependencies |
| ---- | :--- | :--- | :--- |
| MindSpore master | - Ubuntu 16.04 or later aarch64 <br> - Ubuntu 16.04 or later x86_64 <br> - EulerOS 2.8 aarch64 <br> - EulerOS 2.5 x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - Ascend 910 AI processor software package(Version:Atlas T 1.1.T107) <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - Ascend 910 AI processor software package(Version:Atlas T 1.1.T107) <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [patch](http://ftp.gnu.org/gnu/patch/) >= 2.5 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
| MindSpore master | - Ubuntu 16.04 or later aarch64 <br> - Ubuntu 16.04 or later x86_64 <br> - EulerOS 2.8 aarch64 <br> - EulerOS 2.5 x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - Ascend 910 AI processor software package(Version:Atlas T 1.1.T107) <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/r0.3/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - Ascend 910 AI processor software package(Version:Atlas T 1.1.T107) <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [patch](http://ftp.gnu.org/gnu/patch/) >= 2.5 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
- Confirm that the current user has the right to access the installation path `/usr/local/Ascend` of the Ascend 910 AI processor software package (Version: Atlas T 1.1.T107). If not, the root user needs to add the current user to the user group to which `/usr/local/Ascend` belongs. For the specific configuration, see the software package instruction document.
- When the Ubuntu version is 18.04, GCC 7.3.0 can be installed by using the apt command.
...
@@ -81,7 +81,7 @@ The compilation and installation must be performed on the Ascend 910 AI processo
1. Download the source code from the code repository.
2. Run the following command in the root directory of the source code to compile MindSpore:
...
@@ -159,7 +159,7 @@ If you need to analyze information such as model scalars, graphs, and model trac
| Version | Operating System | Executable File Installation Dependencies | Source Code Compilation and Installation Dependencies |
| ---- | :--- | :--- | :--- |
| MindInsight master | - Ubuntu 16.04 or later aarch64 <br> - Ubuntu 16.04 or later x86_64 <br> - EulerOS 2.8 aarch64 <br> - EulerOS 2.5 x86_64 <br> | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindinsight/blob/master/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [node.js](https://nodejs.org/en/download/) >= 10.19.0 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [pybind11](https://pypi.org/project/pybind11/) >= 2.4.3 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
| MindInsight master | - Ubuntu 16.04 or later aarch64 <br> - Ubuntu 16.04 or later x86_64 <br> - EulerOS 2.8 aarch64 <br> - EulerOS 2.5 x86_64 <br> | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindinsight/blob/r0.3/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [node.js](https://nodejs.org/en/download/) >= 10.19.0 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [pybind11](https://pypi.org/project/pybind11/) >= 2.4.3 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
- When the network is connected, the dependencies listed in the requirements.txt file are automatically downloaded during .whl package installation. Otherwise, you need to install them manually.
...
@@ -184,7 +184,7 @@ If you need to analyze information such as model scalars, graphs, and model trac
1. Download the source code from the code repository.
> Do **not** obtain the source code from the zip package downloaded from the repository homepage.
...
@@ -226,7 +226,7 @@ If you need to conduct AI model security research or enhance the security of the
| Version | Operating System | Executable File Installation Dependencies | Source Code Compilation and Installation Dependencies |
| ---- | :--- | :--- | :--- |
| MindArmour master | - Ubuntu 16.04 or later aarch64 <br> - Ubuntu 16.04 or later x86_64 <br> - EulerOS 2.8 aarch64 <br> - EulerOS 2.5 x86_64 <br> | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [setup.py](https://gitee.com/mindspore/mindarmour/blob/master/setup.py). | Same as the executable file installation dependencies. |
| MindArmour master | - Ubuntu 16.04 or later aarch64 <br> - Ubuntu 16.04 or later x86_64 <br> - EulerOS 2.8 aarch64 <br> - EulerOS 2.5 x86_64 <br> | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [setup.py](https://gitee.com/mindspore/mindarmour/blob/r0.3/setup.py). | Same as the executable file installation dependencies. |
- When the network is connected, the dependencies listed in the setup.py file are automatically downloaded during .whl package installation. Otherwise, you need to install them manually.
...
@@ -251,7 +251,7 @@ If you need to conduct AI model security research or enhance the security of the
1. Download the source code from the code repository.
...
@@ -28,7 +28,7 @@ This document describes how to quickly install MindSpore on a NVIDIA GPU environ
| Version | Operating System | Executable File Installation Dependencies | Source Code Compilation and Installation Dependencies |
| ---- | :--- | :--- | :--- |
| MindSpore master | Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [CUDA 9.2](https://developer.nvidia.com/cuda-92-download-archive) / [CUDA 10.1](https://developer.nvidia.com/cuda-10.1-download-archive-base)<br> - [CuDNN](https://developer.nvidia.com/rdp/cudnn-archive) >= 7.6 <br> - [OpenMPI](https://www.open-mpi.org/faq/?category=building#easy-build) 3.1.5 (optional, required for single-node/multi-GPU and multi-node/multi-GPU training) <br> - [NCCL](https://docs.nvidia.com/deeplearning/sdk/nccl-install-guide/index.html#debian) 2.4.8-1 (optional, required for single-node/multi-GPU and multi-node/multi-GPU training) <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/master/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [patch](http://ftp.gnu.org/gnu/patch/) >= 2.5 <br> - [Autoconf](https://www.gnu.org/software/autoconf) >= 2.69 <br> - [Libtool](https://www.gnu.org/software/libtool) >= 2.4.6-29.fc30 <br> - [Automake](https://www.gnu.org/software/automake) >= 1.15.1 <br> - [CUDA 9.2](https://developer.nvidia.com/cuda-92-download-archive) / [CUDA 10.1](https://developer.nvidia.com/cuda-10.1-download-archive-base)<br> - [CuDNN](https://developer.nvidia.com/rdp/cudnn-archive) >= 7.6 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
| MindSpore master | Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [CUDA 9.2](https://developer.nvidia.com/cuda-92-download-archive) / [CUDA 10.1](https://developer.nvidia.com/cuda-10.1-download-archive-base)<br> - [CuDNN](https://developer.nvidia.com/rdp/cudnn-archive) >= 7.6 <br> - [OpenMPI](https://www.open-mpi.org/faq/?category=building#easy-build) 3.1.5 (optional, required for single-node/multi-GPU and multi-node/multi-GPU training) <br> - [NCCL](https://docs.nvidia.com/deeplearning/sdk/nccl-install-guide/index.html#debian) 2.4.8-1 (optional, required for single-node/multi-GPU and multi-node/multi-GPU training) <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindspore/blob/r0.3/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [patch](http://ftp.gnu.org/gnu/patch/) >= 2.5 <br> - [Autoconf](https://www.gnu.org/software/autoconf) >= 2.69 <br> - [Libtool](https://www.gnu.org/software/libtool) >= 2.4.6-29.fc30 <br> - [Automake](https://www.gnu.org/software/automake) >= 1.15.1 <br> - [CUDA 9.2](https://developer.nvidia.com/cuda-92-download-archive) / [CUDA 10.1](https://developer.nvidia.com/cuda-10.1-download-archive-base)<br> - [CuDNN](https://developer.nvidia.com/rdp/cudnn-archive) >= 7.6 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
- When the Ubuntu version is 18.04, GCC 7.3.0 can be installed by using the apt command.
- When the network is connected, the dependencies listed in the requirements.txt file are automatically downloaded during .whl package installation. Otherwise, you need to install them manually.
...
@@ -64,7 +64,7 @@ This document describes how to quickly install MindSpore on a NVIDIA GPU environ
1. Download the source code from the code repository.
2. Run the following command in the root directory of the source code to compile MindSpore:
...
@@ -124,7 +124,7 @@ If you need to analyze information such as model scalars, graphs, and model trac
| Version | Operating System | Executable File Installation Dependencies | Source Code Compilation and Installation Dependencies |
| ---- | :--- | :--- | :--- |
| MindInsight master | - Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindinsight/blob/master/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [node.js](https://nodejs.org/en/download/) >= 10.19.0 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [pybind11](https://pypi.org/project/pybind11/) >= 2.4.3 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
| MindInsight master | - Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [requirements.txt](https://gitee.com/mindspore/mindinsight/blob/r0.3/requirements.txt). | **Compilation dependencies:**<br> - [Python](https://www.python.org/downloads/) 3.7.5 <br> - [CMake](https://cmake.org/download/) >= 3.14.1 <br> - [GCC](https://gcc.gnu.org/releases.html) 7.3.0 <br> - [node.js](https://nodejs.org/en/download/) >= 10.19.0 <br> - [wheel](https://pypi.org/project/wheel/) >= 0.32.0 <br> - [pybind11](https://pypi.org/project/pybind11/) >= 2.4.3 <br>**Installation dependencies:**<br> same as the executable file installation dependencies. |
- When the network is connected, the dependencies listed in the requirements.txt file are automatically downloaded during .whl package installation. Otherwise, you need to install them manually.
...
@@ -149,7 +149,7 @@ If you need to analyze information such as model scalars, graphs, and model trac
1. Download the source code from the code repository.
> Do **not** obtain the source code from the zip package downloaded from the repository homepage.
...
@@ -191,7 +191,7 @@ If you need to conduct AI model security research or enhance the security of the
| Version | Operating System | Executable File Installation Dependencies | Source Code Compilation and Installation Dependencies |
| ---- | :--- | :--- | :--- |
| MindArmour master | Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [setup.py](https://gitee.com/mindspore/mindarmour/blob/master/setup.py). | Same as the executable file installation dependencies. |
| MindArmour master | Ubuntu 16.04 or later x86_64 | - [Python](https://www.python.org/downloads/) 3.7.5 <br> - MindSpore master <br> - For details about other dependency items, see [setup.py](https://gitee.com/mindspore/mindarmour/blob/r0.3/setup.py). | Same as the executable file installation dependencies. |
- When the network is connected, the dependencies listed in the setup.py file are automatically downloaded during .whl package installation. Otherwise, you need to install them manually.
...
@@ -216,7 +216,7 @@ If you need to conduct AI model security research or enhance the security of the
1. Download the source code from the code repository.
...
@@ -68,13 +68,13 @@ A: Please install the software manually if there is any suggestion of certain `s
Q: What types of models are currently supported by MindSpore for training?
A: MindSpore has basic support for common training scenarios. For detailed information, please refer to the [Release note](https://gitee.com/mindspore/mindspore/blob/master/RELEASE.md).
A: MindSpore has basic support for common training scenarios. For detailed information, please refer to the [Release note](https://gitee.com/mindspore/mindspore/blob/r0.3/RELEASE.md).
<br/>
Q: What are the available recommendation or text generation networks or models provided by MindSpore?
A: Currently, recommendation models such as Wide & Deep, DeepFM, and NCF are under development. In the natural language processing (NLP) field, Bert\_NEZHA is available and models such as MASS are under development. You can rebuild the network into a text generation network based on the scenario requirements. Please stay tuned for updates on the [MindSpore Model Zoo](https://gitee.com/mindspore/mindspore/tree/master/mindspore/model_zoo).
A: Currently, recommendation models such as Wide & Deep, DeepFM, and NCF are under development. In the natural language processing (NLP) field, Bert\_NEZHA is available and models such as MASS are under development. You can rebuild the network into a text generation network based on the scenario requirements. Please stay tuned for updates on the [MindSpore Model Zoo](https://gitee.com/mindspore/mindspore/tree/r0.3/mindspore/model_zoo).
### Backend Support
...
@@ -92,7 +92,7 @@ A: MindSpore provides pluggable device management interface so that developer co
Q: What hardware does MindSpore require?
A: Currently, you can try out MindSpore through Docker images on laptops or in environments with GPUs. Some models in MindSpore Model Zoo support GPU-based training and inference, and other models are being improved. For distributed parallel training, MindSpore supports multi-GPU training. You can obtain the latest information from [RoadMap](https://www.mindspore.cn/docs/en/master/roadmap.html) and project [Release Notes](https://gitee.com/mindspore/mindspore/blob/master/RELEASE.md).
A: Currently, you can try out MindSpore through Docker images on laptops or in environments with GPUs. Some models in MindSpore Model Zoo support GPU-based training and inference, and other models are being improved. For distributed parallel training, MindSpore supports multi-GPU training. You can obtain the latest information from [RoadMap](https://www.mindspore.cn/docs/en/master/roadmap.html) and project [Release Notes](https://gitee.com/mindspore/mindspore/blob/r0.3/RELEASE.md).
A: Recommendation models such as Wide & Deep, DeepFM, and NCF are currently under development. In the NLP field, Bert_NEZHA is already supported and models such as MASS are under development. You can adapt them into text generation networks based on scenario requirements. Please follow the [MindSpore Model Zoo](https://gitee.com/mindspore/mindspore/tree/master/mindspore/model_zoo).
A: Recommendation models such as Wide & Deep, DeepFM, and NCF are currently under development. In the NLP field, Bert_NEZHA is already supported and models such as MASS are under development. You can adapt them into text generation networks based on scenario requirements. Please follow the [MindSpore Model Zoo](https://gitee.com/mindspore/mindspore/tree/r0.3/mindspore/model_zoo).
A: Currently, you can try out MindSpore through Docker images on a laptop or in an environment with GPUs. Some models in the MindSpore Model Zoo already support GPU-based training and inference, and other models are being improved continuously. For distributed parallel training, MindSpore currently supports multi-GPU training. You can obtain the latest information from the [RoadMap](https://www.mindspore.cn/docs/zh-CN/master/roadmap.html) and the project [Release note](https://gitee.com/mindspore/mindspore/blob/master/RELEASE.md).
A: Currently, you can try out MindSpore through Docker images on a laptop or in an environment with GPUs. Some models in the MindSpore Model Zoo already support GPU-based training and inference, and other models are being improved continuously. For distributed parallel training, MindSpore currently supports multi-GPU training. You can obtain the latest information from the [RoadMap](https://www.mindspore.cn/docs/zh-CN/master/roadmap.html) and the project [Release note](https://gitee.com/mindspore/mindspore/blob/r0.3/RELEASE.md).
...
@@ -64,7 +64,7 @@ Next, let's use MindSpore to solve the image classification task. The overall pr
5. Call the high-level `Model` API to train and save the model file.
6. Load the saved model for inference.
> This example is for the hardware platform of the Ascend 910 AI processor. You can find the complete executable sample code at: <https://gitee.com/mindspore/docs/blob/master/tutorials/tutorial_code/resnet>.
> This example is for the hardware platform of the Ascend 910 AI processor. You can find the complete executable sample code at: <https://gitee.com/mindspore/docs/blob/r0.3/tutorials/tutorial_code/resnet>.
The key parts of the task process code are explained below.
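As an illustration of steps 5 and 6 (not the tutorial's actual code), the sketch below trains with the high-level `Model` API, saves checkpoints through a callback, and loads the saved parameters back for inference. `resnet50()` and `create_dataset()` are assumed helpers standing in for the network and dataset pipeline built in the earlier steps; file names and hyperparameters are placeholders.

```python
from mindspore import nn
from mindspore.train import Model
from mindspore.train.callback import ModelCheckpoint, CheckpointConfig, LossMonitor
from mindspore.train.serialization import load_checkpoint, load_param_into_net

net = resnet50(num_classes=10)                      # assumed network factory
loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True)
opt = nn.Momentum(net.trainable_params(), learning_rate=0.01, momentum=0.9)
model = Model(net, loss_fn=loss, optimizer=opt, metrics={"acc"})

# Step 5: train and save checkpoint files through a callback.
ckpt_cb = ModelCheckpoint(prefix="resnet",
                          config=CheckpointConfig(save_checkpoint_steps=100))
model.train(10, create_dataset("train"),            # assumed dataset helper
            callbacks=[ckpt_cb, LossMonitor()])

# Step 6: load the saved parameters back into the network for inference.
param_dict = load_checkpoint("resnet-10_100.ckpt")  # placeholder file name
load_param_into_net(net, param_dict)
```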
In deep learning, as datasets and model parameters grow, training takes longer and requires more hardware resources, becoming a training bottleneck. Parallel distributed training is an important optimization method for training, which can reduce requirements on hardware such as memory and computing performance. Based on different parallel principles and modes, parallelism is generally classified into the following types:
...
@@ -34,7 +34,7 @@ MindSpore also provides the parallel distributed training function. It supports
This tutorial describes how to train the ResNet-50 network in data parallel and automatic parallel modes on MindSpore.
> The example in this tutorial applies to hardware platforms based on the Ascend 910 AI processor and does not support CPU and GPU scenarios.
> Download address of the complete sample code: <https://gitee.com/mindspore/docs/blob/master/tutorials/tutorial_code/distributed_training/resnet50_distributed_training.py>
> Download address of the complete sample code: <https://gitee.com/mindspore/docs/blob/r0.3/tutorials/tutorial_code/distributed_training/resnet50_distributed_training.py>
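Before the preparations below, it may help to see where the parallel mode is configured. This is a minimal sketch of enabling data parallelism based on the 0.x-era API; the import path of `ParallelMode` and the `mirror_mean` argument should be treated as assumptions.

```python
from mindspore import context
from mindspore.communication.management import init
from mindspore.train.model import ParallelMode   # assumed import path

context.set_context(mode=context.GRAPH_MODE, device_target="Ascend")
init()   # initializes the HCCL communication backend on Ascend
context.set_auto_parallel_context(parallel_mode=ParallelMode.DATA_PARALLEL,
                                  mirror_mean=True)   # average gradients across devices
```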
## Preparations
...
@@ -177,7 +177,7 @@ Different from the single-node system, the multi-node system needs to transfer t
## Defining the Network
In data parallel and automatic parallel modes, the network definition method is the same as that in a single-node system. The reference code is as follows: <https://gitee.com/mindspore/docs/blob/master/tutorials/tutorial_code/resnet/resnet.py>
In data parallel and automatic parallel modes, the network definition method is the same as that in a single-node system. The reference code is as follows: <https://gitee.com/mindspore/docs/blob/r0.3/tutorials/tutorial_code/resnet/resnet.py>
...
@@ -29,7 +29,7 @@ At the beginning of AI algorithm design, related security threats are sometimes
This section describes how to use MindArmour in adversarial attack and defense by taking the Fast Gradient Sign Method (FGSM) attack algorithm and Natural Adversarial Defense (NAD) algorithm as examples.
> The current sample is for CPU, GPU, and the Ascend 910 AI processor. You can find the complete executable sample code at: <https://gitee.com/mindspore/docs/tree/master/tutorials/tutorial_code/model_safety>
> The current sample is for CPU, GPU, and the Ascend 910 AI processor. You can find the complete executable sample code at: <https://gitee.com/mindspore/docs/tree/r0.3/tutorials/tutorial_code/model_safety>
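As a condensed sketch of that flow (not the tutorial's code), the snippet below generates adversarial examples with FGSM and then fine-tunes the model with NAD. `LeNet5()` is an assumed, already-trained network, the input batch is a placeholder, and the module paths follow the 0.x-era MindArmour layout, so treat them as assumptions.

```python
import numpy as np
from mindspore import nn
from mindarmour.attacks import FastGradientSignMethod
from mindarmour.defenses import NaturalAdversarialDefense

net = LeNet5()                                              # assumed trained model
inputs = np.random.rand(32, 1, 32, 32).astype(np.float32)   # placeholder image batch
labels = np.random.randint(0, 10, 32).astype(np.int32)      # placeholder labels

# Attack: craft adversarial examples with the Fast Gradient Sign Method.
attack = FastGradientSignMethod(net, eps=0.07)
adv_inputs = attack.generate(inputs, labels)

# Defense: adversarially fine-tune the model with Natural Adversarial Defense.
loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True)
opt = nn.Momentum(net.trainable_params(), 0.01, 0.9)
nad = NaturalAdversarialDefense(net, loss_fn=loss, optimizer=opt)
nad.defense(inputs, labels)
```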
...
@@ -57,7 +57,7 @@ Prepare the hardware environment, find a platform corresponding to your environm
MindSpore differs from TensorFlow and PyTorch in the network structure. Before migration, you need to clearly understand the original script and the information of each layer, such as its shape.
> You can also use [MindConverter Tool](https://gitee.com/mindspore/mindinsight/tree/master/mindinsight/mindconverter) to automatically convert the PyTorch network definition script to MindSpore network definition script.
> You can also use [MindConverter Tool](https://gitee.com/mindspore/mindinsight/tree/r0.3/mindinsight/mindconverter) to automatically convert the PyTorch network definition script to MindSpore network definition script.
The ResNet-50 network migration and training on the Ascend 910 is used as an example.
...
@@ -79,7 +79,7 @@ The ResNet-50 network migration and training on the Ascend 910 is used as an exa
num_shards=device_num,shard_id=rank_id)
```
Then, perform data augmentation, data cleaning, and batch processing. For details about the code, see <https://gitee.com/mindspore/mindspore/blob/master/example/resnet50_cifar10/dataset.py>.
Then, perform data augmentation, data cleaning, and batch processing. For details about the code, see <https://gitee.com/mindspore/mindspore/blob/r0.3/example/resnet50_cifar10/dataset.py>.
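A sketch of such a pipeline is shown below (it is not the linked dataset.py): the dataset is sharded across devices with `num_shards`/`shard_id`, augmented, and batched. The transform module path and the specific operations follow the 0.x-era API and are assumptions.

```python
import mindspore.dataset as ds
import mindspore.dataset.transforms.vision.c_transforms as C   # assumed 0.x-era path

def create_dataset(data_dir, batch_size, device_num, rank_id):
    data_set = ds.Cifar10Dataset(data_dir, num_shards=device_num, shard_id=rank_id)
    trans = [
        C.RandomCrop((32, 32), (4, 4, 4, 4)),   # augmentation
        C.RandomHorizontalFlip(),
        C.Rescale(1.0 / 255.0, 0.0),            # scale pixel values to [0, 1]
        C.HWC2CHW(),                            # channel-first layout for the network
    ]
    data_set = data_set.map(input_columns="image", operations=trans)
    data_set = data_set.shuffle(buffer_size=1000)
    data_set = data_set.batch(batch_size, drop_remainder=True)
    return data_set
```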
3. Build a network.
...
@@ -214,7 +214,7 @@ The ResNet-50 network migration and training on the Ascend 910 is used as an exa
6. Build the entire network.
The [ResNet-50](https://gitee.com/mindspore/mindspore/blob/master/mindspore/model_zoo/resnet.py) network structure is formed by connecting multiple defined subnets. Follow the rule of defining a subnet before using it: define all the subnets used in `__init__` and connect them in `construct`.
The [ResNet-50](https://gitee.com/mindspore/mindspore/blob/r0.3/mindspore/model_zoo/resnet.py) network structure is formed by connecting multiple defined subnets. Follow the rule of defining a subnet before using it: define all the subnets used in `__init__` and connect them in `construct`.
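The sketch below illustrates that rule with a deliberately small stand-in network rather than the real ResNet-50 blocks: subnets are instantiated in `__init__` and wired together in `construct`. Layer names and sizes are placeholders and assume a 3x32x32 input.

```python
import mindspore.nn as nn

class SimpleBlock(nn.Cell):
    """A stand-in subnet: convolution + batch normalization + activation."""
    def __init__(self, in_ch, out_ch):
        super(SimpleBlock, self).__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=1, pad_mode="same")
        self.bn = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU()

    def construct(self, x):
        return self.relu(self.bn(self.conv(x)))

class SimpleNet(nn.Cell):
    """The whole network only wires previously defined subnets together."""
    def __init__(self, num_classes=10):
        super(SimpleNet, self).__init__()
        self.block1 = SimpleBlock(3, 16)          # define subnets in __init__ ...
        self.block2 = SimpleBlock(16, 32)
        self.pool = nn.MaxPool2d(kernel_size=2, stride=2)
        self.flatten = nn.Flatten()
        self.fc = nn.Dense(32 * 16 * 16, num_classes)

    def construct(self, x):                       # ... and connect them in construct
        x = self.block1(x)
        x = self.pool(self.block2(x))
        return self.fc(self.flatten(x))
```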
7. Define a loss function and an optimizer.
...
@@ -272,9 +272,9 @@ Models trained on the Ascend 910 AI processor can be used for inference on diffe
...
@@ -85,7 +85,7 @@ Currently, MindSpore GPU supports the long short-term memory (LSTM) network for
Long short-term memory (LSTM) is an artificial recurrent neural network (RNN) architecture used for processing and predicting important events with long intervals and delays in a time series. For details, refer to the online documentation; a minimal construction sketch appears after the note below.
3. After the model is obtained, use the validation dataset to check the accuracy of the model.
> The current sample is for the Ascend 910 AI processor. You can find the complete executable sample code at: <https://gitee.com/mindspore/docs/tree/master/tutorials/tutorial_code/lstm>
> The current sample is for the Ascend 910 AI processor. You can find the complete executable sample code at: <https://gitee.com/mindspore/docs/tree/r0.3/tutorials/tutorial_code/lstm>
> - main.py: code file, including code for data preprocessing, network definition, and model training.
> - config.py: some configurations on the network, including the batch size and number of training epochs.
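The LSTM layer mentioned above can be instantiated directly from `mindspore.nn`; the following is a minimal construction sketch with placeholder sizes, assuming the `nn.LSTM` interface takes the input and an `(h, c)` state tuple.

```python
import numpy as np
import mindspore.nn as nn
from mindspore import Tensor

batch_size, seq_len, embed_size, hidden_size, num_layers = 4, 20, 300, 100, 2
lstm = nn.LSTM(input_size=embed_size, hidden_size=hidden_size,
               num_layers=num_layers, batch_first=True, bidirectional=False)

x = Tensor(np.random.rand(batch_size, seq_len, embed_size).astype(np.float32))
h0 = Tensor(np.zeros((num_layers, batch_size, hidden_size)).astype(np.float32))
c0 = Tensor(np.zeros((num_layers, batch_size, hidden_size)).astype(np.float32))

output, (hn, cn) = lstm(x, (h0, c0))   # per-step outputs and the final states
```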
...
@@ -27,14 +27,14 @@ The related concepts are as follows:
- Operator implementation: describes the implementation of the internal computation logic for an operator through the DSL API provided by the Tensor Boost Engine (TBE). The TBE supports the development of custom operators based on the Ascend AI chip. You can apply for Open Beta Tests (OBTs) by visiting <https://www.huaweicloud.com/ascend/tbe>.
- Operator information: describes basic information about a TBE operator, such as the operator name and supported input and output types. It is the basis for the backend to select and map operators.
This section takes a Square operator as an example to describe how to customize an operator. For details, see cases in [tests/st/ops/custom_ops_tbe](https://gitee.com/mindspore/mindspore/tree/master/tests/st/ops/custom_ops_tbe) in the MindSpore source code.
This section takes a Square operator as an example to describe how to customize an operator. For details, see cases in [tests/st/ops/custom_ops_tbe](https://gitee.com/mindspore/mindspore/tree/r0.3/tests/st/ops/custom_ops_tbe) in the MindSpore source code.
## Registering the Operator Primitive
The primitive of an operator is a subclass inherited from `PrimitiveWithInfer`. The type name of the subclass is the operator name.
The definition of the custom operator primitive is the same as that of the built-in operator primitive.
- The attribute is defined by the input parameter of the constructor function `__init__()`. The operator in this test case has no attribute. Therefore, `__init__()` has only one input parameter. For details about test cases in which operators have attributes, see [custom add3](https://gitee.com/mindspore/mindspore/tree/master/tests/st/ops/custom_ops_tbe/cus_add3.py) in the MindSpore source code.
- The attribute is defined by the input parameter of the constructor function `__init__()`. The operator in this test case has no attribute. Therefore, `__init__()` has only one input parameter. For details about test cases in which operators have attributes, see [custom add3](https://gitee.com/mindspore/mindspore/tree/r0.3/tests/st/ops/custom_ops_tbe/cus_add3.py) in the MindSpore source code.
- The input and output names are defined by the `init_prim_io_names()` function.
- The shape inference method of the output tensor is defined in the `infer_shape()` function, and the dtype inference method of the output tensor is defined in the `infer_dtype()` function, as shown in the sketch below.
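The pieces described above fit together as in the following sketch of the Square primitive (the TBE operator implementation and the operator information registration are omitted; see the linked test cases for the complete version).

```python
from mindspore.ops import PrimitiveWithInfer, prim_attr_register

class CusSquare(PrimitiveWithInfer):
    """Custom Square operator: y = x * x."""

    @prim_attr_register
    def __init__(self):
        # This operator has no attributes, so __init__ only registers the I/O names.
        self.init_prim_io_names(inputs=['x'], outputs=['y'])

    def infer_shape(self, data_shape):
        # The output tensor has the same shape as the input tensor.
        return data_shape

    def infer_dtype(self, data_dtype):
        # The output tensor has the same dtype as the input tensor.
        return data_dtype
```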
...
@@ -16,7 +16,7 @@ Models based on MindSpore training can be used for inference on different hardwa
1. Inference on the Ascend 910 AI processor
MindSpore provides the `model.eval()` API for model validation. You only need to import the validation dataset. The processing method of the validation dataset is the same as that of the training dataset. For details about the complete code, see <https://gitee.com/mindspore/mindspore/blob/master/example/resnet50_cifar10/eval.py>.
MindSpore provides the `model.eval()` API for model validation. You only need to import the validation dataset. The processing method of the validation dataset is the same as that of the training dataset. For details about the complete code, see <https://gitee.com/mindspore/mindspore/blob/r0.3/example/resnet50_cifar10/eval.py>.
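A minimal validation sketch along those lines follows; `resnet50()` and `create_dataset()` are assumed helpers, and the checkpoint file name is a placeholder rather than the one produced by the linked training script.

```python
from mindspore import nn
from mindspore.train import Model
from mindspore.train.serialization import load_checkpoint, load_param_into_net

net = resnet50(num_classes=10)                            # assumed network factory
load_param_into_net(net, load_checkpoint("resnet.ckpt"))  # placeholder checkpoint file

loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True)
model = Model(net, loss_fn=loss, metrics={"acc"})

eval_dataset = create_dataset("cifar-10-verify-bin")      # assumed helper, same pipeline as training
acc = model.eval(eval_dataset)
print("accuracy:", acc)
```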