# Topic1:  Low-bit Neural Networks Training

## Motivation:
At present, mixed-precision training can automatically switch between fp16 and fp32 across a network to improve training performance and reduce memory usage. Because operators have different costs on different AI chips, the optimal precision strategy differs from one chip to another, and so does the network configuration for each hardware platform. How to automatically generate a precision-adjustment strategy that adapts to various hardware, especially a low-bit strategy, has therefore become a difficult problem.
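To make the precision trade-off above concrete, the sketch below simulates "fake" uniform quantization, a common building block in low-bit training research: weights are rounded onto a signed low-bit grid and mapped back to float, so the rounding error introduced by fewer bits becomes visible. This is an illustrative example only, not MindSpore code; the function name and parameters are our own.

```python
def fake_quantize(values, num_bits=8):
    """Simulate low-bit precision: round floats onto a signed
    num_bits integer grid, then dequantize back to float."""
    qmax = 2 ** (num_bits - 1) - 1              # e.g. 127 for 8 bits
    max_abs = max(abs(v) for v in values)
    scale = max_abs / qmax if max_abs > 0 else 1.0
    return [round(v / scale) * scale for v in values]

weights = [0.8, -0.31, 0.002, -1.0]
print(fake_quantize(weights, num_bits=8))  # close to the original values
print(fake_quantize(weights, num_bits=2))  # heavy rounding error
```

Lowering `num_bits` shrinks the grid and grows the rounding error; an adaptive strategy must decide, per operator and per chip, how few bits each part of the network can tolerate.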

## Target:
Self-adaptively provide a low-bit precision training mechanism for various networks.

![target](target.PNG)

## Method:
We expect the applicant to conduct low-bit neural network training research based on MindSpore, and we hope to receive your valuable suggestions for MindSpore in the process. We will do our best to improve the capabilities of the MindSpore framework and provide you with the strongest possible technical support.

## How To Join:
* Submit an issue/PR in the community for discussion, for consultation, or to claim a related topic
* Submit your proposal to us by email <baochong@huawei.com>