Commit af3903ec authored by mindspore-ci-bot, committed by Gitee

!519 [Mixed-precision] Changing the mixed-precision level

Merge pull request !519 from Xiaoda/master
@@ -84,7 +84,7 @@ label = Tensor(np.zeros([64, 128]).astype(np.float32))
 # Define Loss and Optimizer
 loss = nn.SoftmaxCrossEntropyWithLogits()
 optimizer = Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)
-train_network = amp.build_train_network(net, optimizer, loss, level="O2", loss_scale_manager=None)
+train_network = amp.build_train_network(net, optimizer, loss, level="O3", loss_scale_manager=None)
 # Run training
 output = train_network(predict, label)
@@ -83,7 +83,7 @@ label = Tensor(np.zeros([64, 128]).astype(np.float32))
 # Define Loss and Optimizer
 loss = nn.SoftmaxCrossEntropyWithLogits()
 optimizer = Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)
-train_network = amp.build_train_network(net, optimizer, loss, level="O2", loss_scale_manager=None)
+train_network = amp.build_train_network(net, optimizer, loss, level="O3", loss_scale_manager=None)
 # Run training
 output = train_network(predict, label)
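For context, a sketch of the documented snippet after this change, with the surrounding setup filled in as assumptions: the diff shows only the hunk, so the import paths, the `Net` placeholder, and the input shapes here are reconstructed from the visible context, not taken from the patch. In MindSpore's `amp` module, `level="O2"` keeps some layers (e.g. BatchNorm) in float32, while `level="O3"` casts the whole network to float16; running it requires a MindSpore install and suitable hardware.

```python
import numpy as np
from mindspore import Tensor, nn
from mindspore.nn.optim import Momentum
from mindspore.train import amp

# `Net` stands in for the network defined earlier in the tutorial (not shown in the diff)
net = Net()
predict = Tensor(np.ones([64, 128]).astype(np.float32))
label = Tensor(np.zeros([64, 128]).astype(np.float32))

# Define Loss and Optimizer
loss = nn.SoftmaxCrossEntropyWithLogits()
optimizer = Momentum(params=net.trainable_params(), learning_rate=0.1, momentum=0.9)

# level="O3" applies the more aggressive mixed-precision policy introduced by this commit
train_network = amp.build_train_network(net, optimizer, loss, level="O3", loss_scale_manager=None)

# Run training
output = train_network(predict, label)
```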